Nov 22 06:34:21 localhost kernel: Linux version 5.14.0-639.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025
Nov 22 06:34:21 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 22 06:34:21 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 06:34:21 localhost kernel: BIOS-provided physical RAM map:
Nov 22 06:34:21 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 22 06:34:21 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 22 06:34:21 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 22 06:34:21 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 22 06:34:21 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 22 06:34:21 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 22 06:34:21 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 22 06:34:21 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 22 06:34:21 localhost kernel: NX (Execute Disable) protection: active
Nov 22 06:34:21 localhost kernel: APIC: Static calls initialized
Nov 22 06:34:21 localhost kernel: SMBIOS 2.8 present.
Nov 22 06:34:21 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 22 06:34:21 localhost kernel: Hypervisor detected: KVM
Nov 22 06:34:21 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 22 06:34:21 localhost kernel: kvm-clock: using sched offset of 10650929365 cycles
Nov 22 06:34:21 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 22 06:34:21 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 22 06:34:21 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 22 06:34:21 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 22 06:34:21 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 22 06:34:21 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 22 06:34:21 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 22 06:34:21 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 22 06:34:21 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 22 06:34:21 localhost kernel: Using GB pages for direct mapping
Nov 22 06:34:21 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 22 06:34:21 localhost kernel: ACPI: Early table checksum verification disabled
Nov 22 06:34:21 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 22 06:34:21 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 06:34:21 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 06:34:21 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 06:34:21 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 22 06:34:21 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 06:34:21 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 22 06:34:21 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 22 06:34:21 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 22 06:34:21 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 22 06:34:21 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 22 06:34:21 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 22 06:34:21 localhost kernel: No NUMA configuration found
Nov 22 06:34:21 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 22 06:34:21 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 22 06:34:21 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 22 06:34:21 localhost kernel: Zone ranges:
Nov 22 06:34:21 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 22 06:34:21 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 22 06:34:21 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 22 06:34:21 localhost kernel:   Device   empty
Nov 22 06:34:21 localhost kernel: Movable zone start for each node
Nov 22 06:34:21 localhost kernel: Early memory node ranges
Nov 22 06:34:21 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 22 06:34:21 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 22 06:34:21 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 22 06:34:21 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 22 06:34:21 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 22 06:34:21 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 22 06:34:21 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 22 06:34:21 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 22 06:34:21 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 22 06:34:21 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 22 06:34:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 22 06:34:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 22 06:34:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 22 06:34:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 22 06:34:21 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 22 06:34:21 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 22 06:34:21 localhost kernel: TSC deadline timer available
Nov 22 06:34:21 localhost kernel: CPU topo: Max. logical packages:   8
Nov 22 06:34:21 localhost kernel: CPU topo: Max. logical dies:       8
Nov 22 06:34:21 localhost kernel: CPU topo: Max. dies per package:   1
Nov 22 06:34:21 localhost kernel: CPU topo: Max. threads per core:   1
Nov 22 06:34:21 localhost kernel: CPU topo: Num. cores per package:     1
Nov 22 06:34:21 localhost kernel: CPU topo: Num. threads per package:   1
Nov 22 06:34:21 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 22 06:34:21 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 22 06:34:21 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 22 06:34:21 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 22 06:34:21 localhost kernel: Booting paravirtualized kernel on KVM
Nov 22 06:34:21 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 22 06:34:21 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 22 06:34:21 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 22 06:34:21 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 22 06:34:21 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 22 06:34:21 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 22 06:34:21 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 06:34:21 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64", will be passed to user space.
Nov 22 06:34:21 localhost kernel: random: crng init done
Nov 22 06:34:21 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 22 06:34:21 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 22 06:34:21 localhost kernel: Fallback order for Node 0: 0 
Nov 22 06:34:21 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 22 06:34:21 localhost kernel: Policy zone: Normal
Nov 22 06:34:21 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 22 06:34:21 localhost kernel: software IO TLB: area num 8.
Nov 22 06:34:21 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 22 06:34:21 localhost kernel: ftrace: allocating 49298 entries in 193 pages
Nov 22 06:34:21 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 22 06:34:21 localhost kernel: Dynamic Preempt: voluntary
Nov 22 06:34:21 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 22 06:34:21 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 22 06:34:21 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 22 06:34:21 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 22 06:34:21 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 22 06:34:21 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 22 06:34:21 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 22 06:34:21 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 22 06:34:21 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 06:34:21 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 06:34:21 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 22 06:34:21 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 22 06:34:21 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 22 06:34:21 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 22 06:34:21 localhost kernel: Console: colour VGA+ 80x25
Nov 22 06:34:21 localhost kernel: printk: console [ttyS0] enabled
Nov 22 06:34:21 localhost kernel: ACPI: Core revision 20230331
Nov 22 06:34:21 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 22 06:34:21 localhost kernel: x2apic enabled
Nov 22 06:34:21 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 22 06:34:21 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 22 06:34:21 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 22 06:34:21 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 22 06:34:21 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 22 06:34:21 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 22 06:34:21 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 22 06:34:21 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 22 06:34:21 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 22 06:34:21 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 22 06:34:21 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 22 06:34:21 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 22 06:34:21 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 22 06:34:21 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 22 06:34:21 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 22 06:34:21 localhost kernel: x86/bugs: return thunk changed
Nov 22 06:34:21 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 22 06:34:21 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 22 06:34:21 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 22 06:34:21 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 22 06:34:21 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 22 06:34:21 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 22 06:34:21 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 22 06:34:21 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 22 06:34:21 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 22 06:34:21 localhost kernel: landlock: Up and running.
Nov 22 06:34:21 localhost kernel: Yama: becoming mindful.
Nov 22 06:34:21 localhost kernel: SELinux:  Initializing.
Nov 22 06:34:21 localhost kernel: LSM support for eBPF active
Nov 22 06:34:21 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 22 06:34:21 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 22 06:34:21 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 22 06:34:21 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 22 06:34:21 localhost kernel: ... version:                0
Nov 22 06:34:21 localhost kernel: ... bit width:              48
Nov 22 06:34:21 localhost kernel: ... generic registers:      6
Nov 22 06:34:21 localhost kernel: ... value mask:             0000ffffffffffff
Nov 22 06:34:21 localhost kernel: ... max period:             00007fffffffffff
Nov 22 06:34:21 localhost kernel: ... fixed-purpose events:   0
Nov 22 06:34:21 localhost kernel: ... event mask:             000000000000003f
Nov 22 06:34:21 localhost kernel: signal: max sigframe size: 1776
Nov 22 06:34:21 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 22 06:34:21 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 22 06:34:21 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 22 06:34:21 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 22 06:34:21 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 22 06:34:21 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 22 06:34:21 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 22 06:34:21 localhost kernel: node 0 deferred pages initialised in 10ms
Nov 22 06:34:21 localhost kernel: Memory: 7765992K/8388068K available (16384K kernel code, 5786K rwdata, 13900K rodata, 4188K init, 7176K bss, 616268K reserved, 0K cma-reserved)
Nov 22 06:34:21 localhost kernel: devtmpfs: initialized
Nov 22 06:34:21 localhost kernel: x86/mm: Memory block size: 128MB
Nov 22 06:34:21 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 22 06:34:21 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 22 06:34:21 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 22 06:34:21 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 22 06:34:21 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 22 06:34:21 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 22 06:34:21 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 22 06:34:21 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 22 06:34:21 localhost kernel: audit: type=2000 audit(1763793259.110:1): state=initialized audit_enabled=0 res=1
Nov 22 06:34:21 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 22 06:34:21 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 22 06:34:21 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 22 06:34:21 localhost kernel: cpuidle: using governor menu
Nov 22 06:34:21 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 22 06:34:21 localhost kernel: PCI: Using configuration type 1 for base access
Nov 22 06:34:21 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 22 06:34:21 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 22 06:34:21 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 22 06:34:21 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 22 06:34:21 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 22 06:34:21 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 22 06:34:21 localhost kernel: Demotion targets for Node 0: null
Nov 22 06:34:21 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 22 06:34:21 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 22 06:34:21 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 22 06:34:21 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 22 06:34:21 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 22 06:34:21 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 22 06:34:21 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 22 06:34:21 localhost kernel: ACPI: Interpreter enabled
Nov 22 06:34:21 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 22 06:34:21 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 22 06:34:21 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 22 06:34:21 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 22 06:34:21 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 22 06:34:21 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 22 06:34:21 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [3] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [4] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [5] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [6] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [7] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [8] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [9] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [10] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [11] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [12] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [13] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [14] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [15] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [16] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [17] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [18] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [19] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [20] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [21] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [22] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [23] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [24] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [25] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [26] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [27] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [28] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [29] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [30] registered
Nov 22 06:34:21 localhost kernel: acpiphp: Slot [31] registered
Nov 22 06:34:21 localhost kernel: PCI host bridge to bus 0000:00
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 22 06:34:21 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 22 06:34:21 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 22 06:34:21 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 22 06:34:21 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 22 06:34:21 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 22 06:34:21 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 22 06:34:21 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 22 06:34:21 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 22 06:34:21 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 22 06:34:21 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 22 06:34:21 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 22 06:34:21 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 22 06:34:21 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 22 06:34:21 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 22 06:34:21 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 22 06:34:21 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 22 06:34:21 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 22 06:34:21 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 22 06:34:21 localhost kernel: iommu: Default domain type: Translated
Nov 22 06:34:21 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 22 06:34:21 localhost kernel: SCSI subsystem initialized
Nov 22 06:34:21 localhost kernel: ACPI: bus type USB registered
Nov 22 06:34:21 localhost kernel: usbcore: registered new interface driver usbfs
Nov 22 06:34:21 localhost kernel: usbcore: registered new interface driver hub
Nov 22 06:34:21 localhost kernel: usbcore: registered new device driver usb
Nov 22 06:34:21 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 22 06:34:21 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 22 06:34:21 localhost kernel: PTP clock support registered
Nov 22 06:34:21 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 22 06:34:21 localhost kernel: NetLabel: Initializing
Nov 22 06:34:21 localhost kernel: NetLabel:  domain hash size = 128
Nov 22 06:34:21 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 22 06:34:21 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 22 06:34:21 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 22 06:34:21 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 22 06:34:21 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 22 06:34:21 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 22 06:34:21 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 22 06:34:21 localhost kernel: vgaarb: loaded
Nov 22 06:34:21 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 22 06:34:21 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 22 06:34:21 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 22 06:34:21 localhost kernel: pnp: PnP ACPI init
Nov 22 06:34:21 localhost kernel: pnp 00:03: [dma 2]
Nov 22 06:34:21 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 22 06:34:21 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 22 06:34:21 localhost kernel: NET: Registered PF_INET protocol family
Nov 22 06:34:21 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 22 06:34:21 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 22 06:34:21 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 22 06:34:21 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 22 06:34:21 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 22 06:34:21 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 22 06:34:21 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 22 06:34:21 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 22 06:34:21 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 22 06:34:21 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 22 06:34:21 localhost kernel: NET: Registered PF_XDP protocol family
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 22 06:34:21 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 22 06:34:21 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 22 06:34:21 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 22 06:34:21 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 75164 usecs
Nov 22 06:34:21 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 22 06:34:21 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 22 06:34:21 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 22 06:34:21 localhost kernel: ACPI: bus type thunderbolt registered
Nov 22 06:34:21 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 22 06:34:21 localhost kernel: Initialise system trusted keyrings
Nov 22 06:34:21 localhost kernel: Key type blacklist registered
Nov 22 06:34:21 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 22 06:34:21 localhost kernel: zbud: loaded
Nov 22 06:34:21 localhost kernel: integrity: Platform Keyring initialized
Nov 22 06:34:21 localhost kernel: integrity: Machine keyring initialized
Nov 22 06:34:21 localhost kernel: Freeing initrd memory: 85868K
Nov 22 06:34:21 localhost kernel: NET: Registered PF_ALG protocol family
Nov 22 06:34:21 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 22 06:34:21 localhost kernel: Key type asymmetric registered
Nov 22 06:34:21 localhost kernel: Asymmetric key parser 'x509' registered
Nov 22 06:34:21 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 22 06:34:21 localhost kernel: io scheduler mq-deadline registered
Nov 22 06:34:21 localhost kernel: io scheduler kyber registered
Nov 22 06:34:21 localhost kernel: io scheduler bfq registered
Nov 22 06:34:21 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 22 06:34:21 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 22 06:34:21 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 22 06:34:21 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 22 06:34:21 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 22 06:34:21 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 22 06:34:21 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 22 06:34:21 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 22 06:34:21 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 22 06:34:21 localhost kernel: Non-volatile memory driver v1.3
Nov 22 06:34:21 localhost kernel: rdac: device handler registered
Nov 22 06:34:21 localhost kernel: hp_sw: device handler registered
Nov 22 06:34:21 localhost kernel: emc: device handler registered
Nov 22 06:34:21 localhost kernel: alua: device handler registered
Nov 22 06:34:21 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 22 06:34:21 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 22 06:34:21 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 22 06:34:21 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 22 06:34:21 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 22 06:34:21 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 22 06:34:21 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 22 06:34:21 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-639.el9.x86_64 uhci_hcd
Nov 22 06:34:21 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 22 06:34:21 localhost kernel: hub 1-0:1.0: USB hub found
Nov 22 06:34:21 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 22 06:34:21 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 22 06:34:21 localhost kernel: usbserial: USB Serial support registered for generic
Nov 22 06:34:21 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 22 06:34:21 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 22 06:34:21 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 22 06:34:21 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 22 06:34:21 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 22 06:34:21 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 22 06:34:21 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 22 06:34:21 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-22T06:34:20 UTC (1763793260)
Nov 22 06:34:21 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 22 06:34:21 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 22 06:34:21 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 22 06:34:21 localhost kernel: usbcore: registered new interface driver usbhid
Nov 22 06:34:21 localhost kernel: usbhid: USB HID core driver
Nov 22 06:34:21 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 22 06:34:21 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 22 06:34:21 localhost kernel: Initializing XFRM netlink socket
Nov 22 06:34:21 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 22 06:34:21 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 22 06:34:21 localhost kernel: Segment Routing with IPv6
Nov 22 06:34:21 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 22 06:34:21 localhost kernel: mpls_gso: MPLS GSO support
Nov 22 06:34:21 localhost kernel: IPI shorthand broadcast: enabled
Nov 22 06:34:21 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 22 06:34:21 localhost kernel: AES CTR mode by8 optimization enabled
Nov 22 06:34:21 localhost kernel: sched_clock: Marking stable (1232006151, 164100707)->(1508140027, -112033169)
Nov 22 06:34:21 localhost kernel: registered taskstats version 1
Nov 22 06:34:21 localhost kernel: Loading compiled-in X.509 certificates
Nov 22 06:34:21 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 22 06:34:21 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 22 06:34:21 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 22 06:34:21 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 22 06:34:21 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 22 06:34:21 localhost kernel: Demotion targets for Node 0: null
Nov 22 06:34:21 localhost kernel: page_owner is disabled
Nov 22 06:34:21 localhost kernel: Key type .fscrypt registered
Nov 22 06:34:21 localhost kernel: Key type fscrypt-provisioning registered
Nov 22 06:34:21 localhost kernel: Key type big_key registered
Nov 22 06:34:21 localhost kernel: Key type encrypted registered
Nov 22 06:34:21 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 22 06:34:21 localhost kernel: Loading compiled-in module X.509 certificates
Nov 22 06:34:21 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 22 06:34:21 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 22 06:34:21 localhost kernel: ima: No architecture policies found
Nov 22 06:34:21 localhost kernel: evm: Initialising EVM extended attributes:
Nov 22 06:34:21 localhost kernel: evm: security.selinux
Nov 22 06:34:21 localhost kernel: evm: security.SMACK64 (disabled)
Nov 22 06:34:21 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 22 06:34:21 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 22 06:34:21 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 22 06:34:21 localhost kernel: evm: security.apparmor (disabled)
Nov 22 06:34:21 localhost kernel: evm: security.ima
Nov 22 06:34:21 localhost kernel: evm: security.capability
Nov 22 06:34:21 localhost kernel: evm: HMAC attrs: 0x1
Nov 22 06:34:21 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 22 06:34:21 localhost kernel: Running certificate verification RSA selftest
Nov 22 06:34:21 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 22 06:34:21 localhost kernel: Running certificate verification ECDSA selftest
Nov 22 06:34:21 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 22 06:34:21 localhost kernel: clk: Disabling unused clocks
Nov 22 06:34:21 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 22 06:34:21 localhost kernel: Freeing unused kernel image (initmem) memory: 4188K
Nov 22 06:34:21 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 22 06:34:21 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 22 06:34:21 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 22 06:34:21 localhost kernel: Run /init as init process
Nov 22 06:34:21 localhost kernel:   with arguments:
Nov 22 06:34:21 localhost kernel:     /init
Nov 22 06:34:21 localhost kernel:   with environment:
Nov 22 06:34:21 localhost kernel:     HOME=/
Nov 22 06:34:21 localhost kernel:     TERM=linux
Nov 22 06:34:21 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64
Nov 22 06:34:21 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 22 06:34:21 localhost systemd[1]: Detected virtualization kvm.
Nov 22 06:34:21 localhost systemd[1]: Detected architecture x86-64.
Nov 22 06:34:21 localhost systemd[1]: Running in initrd.
Nov 22 06:34:21 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 22 06:34:21 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 22 06:34:21 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 22 06:34:21 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 22 06:34:21 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 22 06:34:21 localhost systemd[1]: No hostname configured, using default hostname.
Nov 22 06:34:21 localhost systemd[1]: Hostname set to <localhost>.
Nov 22 06:34:21 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 22 06:34:21 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 22 06:34:21 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 22 06:34:21 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 22 06:34:21 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 22 06:34:21 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 22 06:34:21 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 22 06:34:21 localhost systemd[1]: Reached target Local File Systems.
Nov 22 06:34:21 localhost systemd[1]: Reached target Path Units.
Nov 22 06:34:21 localhost systemd[1]: Reached target Slice Units.
Nov 22 06:34:21 localhost systemd[1]: Reached target Swaps.
Nov 22 06:34:21 localhost systemd[1]: Reached target Timer Units.
Nov 22 06:34:21 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 22 06:34:21 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 22 06:34:21 localhost systemd[1]: Listening on Journal Socket.
Nov 22 06:34:21 localhost systemd[1]: Listening on udev Control Socket.
Nov 22 06:34:21 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 22 06:34:21 localhost systemd[1]: Reached target Socket Units.
Nov 22 06:34:21 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 22 06:34:21 localhost systemd[1]: Starting Journal Service...
Nov 22 06:34:21 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 22 06:34:21 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 22 06:34:21 localhost systemd[1]: Starting Create System Users...
Nov 22 06:34:21 localhost systemd[1]: Starting Setup Virtual Console...
Nov 22 06:34:21 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 22 06:34:21 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 22 06:34:21 localhost systemd[1]: Finished Create System Users.
Nov 22 06:34:21 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 22 06:34:21 localhost systemd-journald[304]: Journal started
Nov 22 06:34:21 localhost systemd-journald[304]: Runtime Journal (/run/log/journal/44d48670015e4b659deefcaebe9200b4) is 8.0M, max 153.6M, 145.6M free.
Nov 22 06:34:21 localhost systemd-sysusers[309]: Creating group 'users' with GID 100.
Nov 22 06:34:21 localhost systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Nov 22 06:34:21 localhost systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 22 06:34:21 localhost systemd[1]: Started Journal Service.
Nov 22 06:34:21 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 22 06:34:21 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 22 06:34:21 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 22 06:34:21 localhost systemd[1]: Finished Setup Virtual Console.
Nov 22 06:34:21 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 22 06:34:21 localhost systemd[1]: Starting dracut cmdline hook...
Nov 22 06:34:21 localhost dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Nov 22 06:34:21 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 22 06:34:21 localhost systemd[1]: Finished dracut cmdline hook.
Nov 22 06:34:21 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 22 06:34:21 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 22 06:34:21 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 22 06:34:21 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 22 06:34:21 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 22 06:34:21 localhost kernel: RPC: Registered udp transport module.
Nov 22 06:34:21 localhost kernel: RPC: Registered tcp transport module.
Nov 22 06:34:21 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 22 06:34:21 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 22 06:34:21 localhost rpc.statd[442]: Version 2.5.4 starting
Nov 22 06:34:21 localhost rpc.statd[442]: Initializing NSM state
Nov 22 06:34:21 localhost rpc.idmapd[447]: Setting log level to 0
Nov 22 06:34:21 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 22 06:34:21 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 22 06:34:21 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Nov 22 06:34:21 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 22 06:34:21 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 22 06:34:21 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 22 06:34:21 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 22 06:34:22 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 22 06:34:22 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 22 06:34:22 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 22 06:34:22 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 06:34:22 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 22 06:34:22 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 22 06:34:22 localhost systemd[1]: Reached target Network.
Nov 22 06:34:22 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 22 06:34:22 localhost systemd[1]: Starting dracut initqueue hook...
Nov 22 06:34:22 localhost kernel: libata version 3.00 loaded.
Nov 22 06:34:22 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 22 06:34:22 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 22 06:34:22 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 22 06:34:22 localhost kernel: scsi host0: ata_piix
Nov 22 06:34:22 localhost kernel: scsi host1: ata_piix
Nov 22 06:34:22 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 22 06:34:22 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 22 06:34:22 localhost kernel:  vda: vda1
Nov 22 06:34:22 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 22 06:34:22 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 22 06:34:22 localhost systemd[1]: Reached target System Initialization.
Nov 22 06:34:22 localhost systemd[1]: Reached target Basic System.
Nov 22 06:34:22 localhost kernel: ata1: found unknown device (class 0)
Nov 22 06:34:22 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 22 06:34:22 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 22 06:34:22 localhost systemd-udevd[478]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 06:34:22 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 22 06:34:22 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 22 06:34:22 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 22 06:34:22 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 22 06:34:22 localhost systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 22 06:34:22 localhost systemd[1]: Reached target Initrd Root Device.
Nov 22 06:34:22 localhost systemd[1]: Finished dracut initqueue hook.
Nov 22 06:34:22 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 22 06:34:22 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 22 06:34:22 localhost systemd[1]: Reached target Remote File Systems.
Nov 22 06:34:22 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 22 06:34:22 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 22 06:34:22 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 22 06:34:22 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Nov 22 06:34:22 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 22 06:34:22 localhost systemd[1]: Mounting /sysroot...
Nov 22 06:34:23 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 22 06:34:23 localhost kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 22 06:34:23 localhost kernel: XFS (vda1): Ending clean mount
Nov 22 06:34:23 localhost systemd[1]: Mounted /sysroot.
Nov 22 06:34:23 localhost systemd[1]: Reached target Initrd Root File System.
Nov 22 06:34:23 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 22 06:34:23 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 22 06:34:23 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 22 06:34:23 localhost systemd[1]: Reached target Initrd File Systems.
Nov 22 06:34:23 localhost systemd[1]: Reached target Initrd Default Target.
Nov 22 06:34:23 localhost systemd[1]: Starting dracut mount hook...
Nov 22 06:34:23 localhost systemd[1]: Finished dracut mount hook.
Nov 22 06:34:23 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 22 06:34:23 localhost rpc.idmapd[447]: exiting on signal 15
Nov 22 06:34:24 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 22 06:34:24 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 22 06:34:24 localhost systemd[1]: Stopped target Network.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Timer Units.
Nov 22 06:34:24 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 22 06:34:24 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Basic System.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Path Units.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Remote File Systems.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Slice Units.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Socket Units.
Nov 22 06:34:24 localhost systemd[1]: Stopped target System Initialization.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Local File Systems.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Swaps.
Nov 22 06:34:24 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped dracut mount hook.
Nov 22 06:34:24 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 22 06:34:24 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 22 06:34:24 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 22 06:34:24 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 22 06:34:24 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 22 06:34:24 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 22 06:34:24 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 22 06:34:24 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 22 06:34:24 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 22 06:34:24 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 22 06:34:24 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 22 06:34:24 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 22 06:34:24 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Closed udev Control Socket.
Nov 22 06:34:24 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Closed udev Kernel Socket.
Nov 22 06:34:24 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 22 06:34:24 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 22 06:34:24 localhost systemd[1]: Starting Cleanup udev Database...
Nov 22 06:34:24 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 22 06:34:24 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 22 06:34:24 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Stopped Create System Users.
Nov 22 06:34:24 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 22 06:34:24 localhost systemd[1]: Finished Cleanup udev Database.
Nov 22 06:34:24 localhost systemd[1]: Reached target Switch Root.
Nov 22 06:34:24 localhost systemd[1]: Starting Switch Root...
Nov 22 06:34:24 localhost systemd[1]: Switching root.
Nov 22 06:34:24 localhost systemd-journald[304]: Journal stopped
Nov 22 06:34:26 localhost systemd-journald[304]: Received SIGTERM from PID 1 (systemd).
Nov 22 06:34:26 localhost kernel: audit: type=1404 audit(1763793264.895:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 22 06:34:26 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 06:34:26 localhost kernel: SELinux:  policy capability open_perms=1
Nov 22 06:34:26 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 06:34:26 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 22 06:34:26 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 06:34:26 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 06:34:26 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 06:34:26 localhost kernel: audit: type=1403 audit(1763793265.155:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 22 06:34:26 localhost systemd[1]: Successfully loaded SELinux policy in 267.760ms.
Nov 22 06:34:26 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 77.243ms.
Nov 22 06:34:26 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 22 06:34:26 localhost systemd[1]: Detected virtualization kvm.
Nov 22 06:34:26 localhost systemd[1]: Detected architecture x86-64.
Nov 22 06:34:26 localhost systemd-rc-local-generator[638]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 06:34:26 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 22 06:34:26 localhost systemd[1]: Stopped Switch Root.
Nov 22 06:34:26 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 22 06:34:26 localhost systemd[1]: Created slice Slice /system/getty.
Nov 22 06:34:26 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 22 06:34:26 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 22 06:34:26 localhost systemd[1]: Created slice User and Session Slice.
Nov 22 06:34:26 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 22 06:34:26 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 22 06:34:26 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 22 06:34:26 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 22 06:34:26 localhost systemd[1]: Stopped target Switch Root.
Nov 22 06:34:26 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 22 06:34:26 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 22 06:34:26 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 22 06:34:26 localhost systemd[1]: Reached target Path Units.
Nov 22 06:34:26 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 22 06:34:26 localhost systemd[1]: Reached target Slice Units.
Nov 22 06:34:26 localhost systemd[1]: Reached target Swaps.
Nov 22 06:34:26 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 22 06:34:26 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 22 06:34:26 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 22 06:34:26 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 22 06:34:26 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 22 06:34:26 localhost systemd[1]: Listening on udev Control Socket.
Nov 22 06:34:26 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 22 06:34:26 localhost systemd[1]: Mounting Huge Pages File System...
Nov 22 06:34:26 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 22 06:34:26 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 22 06:34:26 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 22 06:34:26 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 22 06:34:26 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 22 06:34:26 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 22 06:34:26 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 22 06:34:26 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 22 06:34:26 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 22 06:34:26 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 22 06:34:26 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 22 06:34:26 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 22 06:34:26 localhost systemd[1]: Stopped Journal Service.
Nov 22 06:34:26 localhost systemd[1]: Starting Journal Service...
Nov 22 06:34:26 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 22 06:34:26 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 22 06:34:26 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 06:34:26 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 22 06:34:26 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 22 06:34:26 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 22 06:34:26 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 22 06:34:26 localhost systemd[1]: Mounted Huge Pages File System.
Nov 22 06:34:26 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 22 06:34:26 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 22 06:34:26 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 22 06:34:26 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 22 06:34:26 localhost kernel: fuse: init (API version 7.37)
Nov 22 06:34:26 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 22 06:34:26 localhost systemd-journald[679]: Journal started
Nov 22 06:34:26 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 22 06:34:26 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 22 06:34:26 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 22 06:34:26 localhost systemd[1]: Started Journal Service.
Nov 22 06:34:26 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 06:34:26 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 22 06:34:26 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 22 06:34:26 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 22 06:34:26 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 22 06:34:26 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 22 06:34:26 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 22 06:34:26 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 22 06:34:26 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 22 06:34:26 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 22 06:34:26 localhost kernel: ACPI: bus type drm_connector registered
Nov 22 06:34:26 localhost systemd[1]: Mounting FUSE Control File System...
Nov 22 06:34:26 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 22 06:34:26 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 22 06:34:26 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 22 06:34:26 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 22 06:34:26 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 22 06:34:26 localhost systemd[1]: Starting Create System Users...
Nov 22 06:34:26 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 22 06:34:26 localhost systemd-journald[679]: Received client request to flush runtime journal.
Nov 22 06:34:26 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 22 06:34:26 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 22 06:34:26 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 22 06:34:26 localhost systemd[1]: Mounted FUSE Control File System.
Nov 22 06:34:26 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 22 06:34:27 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 22 06:34:27 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 22 06:34:27 localhost systemd[1]: Finished Create System Users.
Nov 22 06:34:27 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 22 06:34:27 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 22 06:34:27 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 22 06:34:27 localhost systemd[1]: Reached target Local File Systems.
Nov 22 06:34:27 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 22 06:34:27 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 22 06:34:27 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 22 06:34:27 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 22 06:34:27 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 22 06:34:27 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 22 06:34:27 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 22 06:34:27 localhost bootctl[696]: Couldn't find EFI system partition, skipping.
Nov 22 06:34:27 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 22 06:34:27 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 22 06:34:27 localhost systemd[1]: Starting Security Auditing Service...
Nov 22 06:34:27 localhost systemd[1]: Starting RPC Bind...
Nov 22 06:34:27 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 22 06:34:27 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 22 06:34:27 localhost auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 22 06:34:27 localhost auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 22 06:34:27 localhost systemd[1]: Started RPC Bind.
Nov 22 06:34:27 localhost augenrules[707]: /sbin/augenrules: No change
Nov 22 06:34:27 localhost augenrules[722]: No rules
Nov 22 06:34:27 localhost augenrules[722]: enabled 1
Nov 22 06:34:27 localhost augenrules[722]: failure 1
Nov 22 06:34:27 localhost augenrules[722]: pid 702
Nov 22 06:34:27 localhost augenrules[722]: rate_limit 0
Nov 22 06:34:27 localhost augenrules[722]: backlog_limit 8192
Nov 22 06:34:27 localhost augenrules[722]: lost 0
Nov 22 06:34:27 localhost augenrules[722]: backlog 4
Nov 22 06:34:27 localhost augenrules[722]: backlog_wait_time 60000
Nov 22 06:34:27 localhost augenrules[722]: backlog_wait_time_actual 0
Nov 22 06:34:27 localhost systemd[1]: Started Security Auditing Service.
Nov 22 06:34:27 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 22 06:34:27 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 22 06:34:28 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 22 06:34:28 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 22 06:34:28 localhost systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Nov 22 06:34:28 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 22 06:34:28 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 22 06:34:28 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 22 06:34:28 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 22 06:34:28 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 22 06:34:28 localhost systemd-udevd[736]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 06:34:28 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 22 06:34:28 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 22 06:34:28 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 22 06:34:28 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 22 06:34:28 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 22 06:34:28 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 22 06:34:28 localhost kernel: Console: switching to colour dummy device 80x25
Nov 22 06:34:28 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 22 06:34:28 localhost kernel: [drm] features: -context_init
Nov 22 06:34:28 localhost kernel: [drm] number of scanouts: 1
Nov 22 06:34:28 localhost kernel: [drm] number of cap sets: 0
Nov 22 06:34:28 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 22 06:34:28 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 22 06:34:28 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 22 06:34:28 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 22 06:34:28 localhost kernel: kvm_amd: TSC scaling supported
Nov 22 06:34:28 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 22 06:34:28 localhost kernel: kvm_amd: Nested Paging enabled
Nov 22 06:34:28 localhost kernel: kvm_amd: LBR virtualization supported
Nov 22 06:34:29 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 22 06:34:29 localhost systemd[1]: Starting Update is Completed...
Nov 22 06:34:29 localhost systemd[1]: Finished Update is Completed.
Nov 22 06:34:29 localhost systemd[1]: Reached target System Initialization.
Nov 22 06:34:29 localhost systemd[1]: Started dnf makecache --timer.
Nov 22 06:34:29 localhost systemd[1]: Started Daily rotation of log files.
Nov 22 06:34:29 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 22 06:34:29 localhost systemd[1]: Reached target Timer Units.
Nov 22 06:34:29 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 22 06:34:29 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 22 06:34:29 localhost systemd[1]: Reached target Socket Units.
Nov 22 06:34:29 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 22 06:34:29 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 06:34:29 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 22 06:34:29 localhost systemd[1]: Reached target Basic System.
Nov 22 06:34:29 localhost dbus-broker-lau[810]: Ready
Nov 22 06:34:29 localhost systemd[1]: Starting NTP client/server...
Nov 22 06:34:29 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 22 06:34:29 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 22 06:34:29 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 22 06:34:29 localhost systemd[1]: Started irqbalance daemon.
Nov 22 06:34:29 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 22 06:34:29 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 06:34:29 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 06:34:29 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 06:34:29 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 22 06:34:29 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 22 06:34:29 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 22 06:34:29 localhost systemd[1]: Starting User Login Management...
Nov 22 06:34:29 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 22 06:34:29 localhost chronyd[828]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 22 06:34:29 localhost chronyd[828]: Loaded 0 symmetric keys
Nov 22 06:34:29 localhost chronyd[828]: Using right/UTC timezone to obtain leap second data
Nov 22 06:34:29 localhost chronyd[828]: Loaded seccomp filter (level 2)
Nov 22 06:34:29 localhost systemd[1]: Started NTP client/server.
Nov 22 06:34:29 localhost systemd-logind[821]: New seat seat0.
Nov 22 06:34:29 localhost systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 22 06:34:29 localhost systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 22 06:34:29 localhost systemd[1]: Started User Login Management.
Nov 22 06:34:29 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 22 06:34:29 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 22 06:34:29 localhost iptables.init[815]: iptables: Applying firewall rules: [  OK  ]
Nov 22 06:34:29 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 22 06:34:31 localhost cloud-init[838]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 22 Nov 2025 06:34:31 +0000. Up 12.62 seconds.
Nov 22 06:34:32 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 22 06:34:32 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 22 06:34:32 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpax2sckd3.mount: Deactivated successfully.
Nov 22 06:34:32 localhost systemd[1]: Starting Hostname Service...
Nov 22 06:34:32 localhost systemd[1]: Started Hostname Service.
Nov 22 06:34:32 np0005531886.novalocal systemd-hostnamed[854]: Hostname set to <np0005531886.novalocal> (static)
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Reached target Preparation for Network.
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Starting Network Manager...
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.8504] NetworkManager (version 1.54.1-1.el9) is starting... (boot:a2036d99-efbc-42fa-8cf6-98ac98e337b2)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.8509] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.8666] manager[0x55bdf1fe2080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.8822] hostname: hostname: using hostnamed
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.8824] hostname: static hostname changed from (none) to "np0005531886.novalocal"
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.8829] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.8989] manager[0x55bdf1fe2080]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.8990] manager[0x55bdf1fe2080]: rfkill: WWAN hardware radio set enabled
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9194] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9195] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9196] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9196] manager: Networking is enabled by state file
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9198] settings: Loaded settings plugin: keyfile (internal)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9236] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9289] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9345] dhcp: init: Using DHCP client 'internal'
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9347] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9359] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9376] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9381] device (lo): Activation: starting connection 'lo' (f5c5737f-5ac0-4563-b7a5-22075693542b)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9389] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9391] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Started Network Manager.
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9416] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9419] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9421] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9422] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9423] device (eth0): carrier: link connected
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9425] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9430] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Reached target Network.
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9439] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9442] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9442] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9444] manager: NetworkManager state is now CONNECTING
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9445] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9449] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9451] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 22 06:34:32 np0005531886.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9604] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9605] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 06:34:32 np0005531886.novalocal NetworkManager[858]: <info>  [1763793272.9611] device (lo): Activation: successful, device activated.
Nov 22 06:34:33 np0005531886.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 22 06:34:33 np0005531886.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 22 06:34:33 np0005531886.novalocal systemd[1]: Reached target NFS client services.
Nov 22 06:34:33 np0005531886.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 22 06:34:33 np0005531886.novalocal systemd[1]: Reached target Remote File Systems.
Nov 22 06:34:33 np0005531886.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4845] dhcp4 (eth0): state changed new lease, address=38.129.56.29
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4860] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4883] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4917] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4919] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4921] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4925] device (eth0): Activation: successful, device activated.
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4930] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 06:34:34 np0005531886.novalocal NetworkManager[858]: <info>  [1763793274.4933] manager: startup complete
Nov 22 06:34:34 np0005531886.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 22 06:34:34 np0005531886.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 22 Nov 2025 06:34:34 +0000. Up 15.48 seconds.
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |  eth0  | True |         38.129.56.29         | 255.255.255.0 | global | fa:16:3e:a8:d0:a9 |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fea8:d0a9/64 |       .       |  link  | fa:16:3e:a8:d0:a9 |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Nov 22 06:34:34 np0005531886.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 22 06:34:38 np0005531886.novalocal useradd[991]: new group: name=cloud-user, GID=1001
Nov 22 06:34:38 np0005531886.novalocal useradd[991]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 22 06:34:38 np0005531886.novalocal useradd[991]: add 'cloud-user' to group 'adm'
Nov 22 06:34:38 np0005531886.novalocal useradd[991]: add 'cloud-user' to group 'systemd-journal'
Nov 22 06:34:38 np0005531886.novalocal useradd[991]: add 'cloud-user' to shadow group 'adm'
Nov 22 06:34:38 np0005531886.novalocal useradd[991]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Generating public/private rsa key pair.
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: The key fingerprint is:
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: SHA256:e2qwyXxOgbwGjOwM8J3huaMIHyKElKEwdQh6tJlAO9E root@np0005531886.novalocal
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: The key's randomart image is:
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: +---[RSA 3072]----+
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |**+..            |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |++*E             |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |+== .            |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |++.= = .         |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |o.+ B o S        |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |.+   o.. o       |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |+ + ooo+o .      |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |o+ o o=.oo       |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |. o    +o        |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: +----[SHA256]-----+
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Generating public/private ecdsa key pair.
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: The key fingerprint is:
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: SHA256:rPReSWBtM1eKSvS++lP5tibKIzCVhiChVhrghVp+V+0 root@np0005531886.novalocal
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: The key's randomart image is:
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: +---[ECDSA 256]---+
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |o.o+    ..    .  |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |.oB .  ..o.. o   |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |o* . . o+oB o    |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |o . . o++=E+     |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |   . ..oS o  .   |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |     .oo . oo    |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |      .o. +. .   |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |       ..+o . +  |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |        oo++ +.. |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: +----[SHA256]-----+
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Generating public/private ed25519 key pair.
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: The key fingerprint is:
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: SHA256:kTG28aBpGvoVn5BjpSVRHlUPhm8B2AEiDZTTIfJPCYk root@np0005531886.novalocal
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: The key's randomart image is:
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: +--[ED25519 256]--+
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |  .o**=+%=+=+    |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |  Eo++o&.Xo..o   |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |    o.& = .. ..  |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |   . B = o  o    |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |  . . o S  .     |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |   . .           |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |    .            |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |                 |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: |                 |
Nov 22 06:34:39 np0005531886.novalocal cloud-init[923]: +----[SHA256]-----+
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Reached target Network is Online.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 22 06:34:39 np0005531886.novalocal sm-notify[1007]: Version 2.5.4 starting
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Starting System Logging Service...
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Starting Permit User Sessions...
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 22 06:34:39 np0005531886.novalocal sshd[1009]: Server listening on 0.0.0.0 port 22.
Nov 22 06:34:39 np0005531886.novalocal sshd[1009]: Server listening on :: port 22.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 22 06:34:39 np0005531886.novalocal chronyd[828]: Selected source 138.197.135.239 (2.centos.pool.ntp.org)
Nov 22 06:34:39 np0005531886.novalocal chronyd[828]: System clock TAI offset set to 37 seconds
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Finished Permit User Sessions.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Started Command Scheduler.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Started Getty on tty1.
Nov 22 06:34:39 np0005531886.novalocal rsyslogd[1008]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1008" x-info="https://www.rsyslog.com"] start
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 22 06:34:39 np0005531886.novalocal rsyslogd[1008]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Reached target Login Prompts.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Started System Logging Service.
Nov 22 06:34:39 np0005531886.novalocal crond[1014]: (CRON) STARTUP (1.5.7)
Nov 22 06:34:39 np0005531886.novalocal crond[1014]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 22 06:34:39 np0005531886.novalocal crond[1014]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 19% if used.)
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Reached target Multi-User System.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1030]: Unable to negotiate with 38.102.83.114 port 58758: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 22 06:34:39 np0005531886.novalocal crond[1014]: (CRON) INFO (running with inotify support)
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1048]: Unable to negotiate with 38.102.83.114 port 58764: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1054]: Unable to negotiate with 38.102.83.114 port 58780: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 22 06:34:39 np0005531886.novalocal rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1061]: Connection reset by 38.102.83.114 port 58794 [preauth]
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1012]: Connection closed by 38.102.83.114 port 58744 [preauth]
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1081]: Unable to negotiate with 38.102.83.114 port 58810: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1083]: Unable to negotiate with 38.102.83.114 port 58812: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1038]: Connection closed by 38.102.83.114 port 58760 [preauth]
Nov 22 06:34:39 np0005531886.novalocal kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Nov 22 06:34:39 np0005531886.novalocal kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-639.el9.x86_64kdump.img
Nov 22 06:34:39 np0005531886.novalocal cloud-init[1091]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 22 Nov 2025 06:34:39 +0000. Up 20.30 seconds.
Nov 22 06:34:39 np0005531886.novalocal sshd-session[1074]: Connection closed by 38.102.83.114 port 58796 [preauth]
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 22 06:34:39 np0005531886.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1251]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 22 Nov 2025 06:34:40 +0000. Up 20.75 seconds.
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1276]: #############################################################
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1279]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1284]: 256 SHA256:rPReSWBtM1eKSvS++lP5tibKIzCVhiChVhrghVp+V+0 root@np0005531886.novalocal (ECDSA)
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1291]: 256 SHA256:kTG28aBpGvoVn5BjpSVRHlUPhm8B2AEiDZTTIfJPCYk root@np0005531886.novalocal (ED25519)
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1295]: 3072 SHA256:e2qwyXxOgbwGjOwM8J3huaMIHyKElKEwdQh6tJlAO9E root@np0005531886.novalocal (RSA)
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1297]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1298]: #############################################################
Nov 22 06:34:40 np0005531886.novalocal dracut[1305]: dracut-057-102.git20250818.el9
Nov 22 06:34:40 np0005531886.novalocal cloud-init[1251]: Cloud-init v. 24.4-7.el9 finished at Sat, 22 Nov 2025 06:34:40 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 20.98 seconds
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: IRQ 25 affinity is now unmanaged
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: IRQ 31 affinity is now unmanaged
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: IRQ 28 affinity is now unmanaged
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: IRQ 32 affinity is now unmanaged
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: IRQ 30 affinity is now unmanaged
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 22 06:34:40 np0005531886.novalocal irqbalance[816]: IRQ 29 affinity is now unmanaged
Nov 22 06:34:40 np0005531886.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 22 06:34:40 np0005531886.novalocal systemd[1]: Reached target Cloud-init target.
Nov 22 06:34:40 np0005531886.novalocal dracut[1307]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-639.el9.x86_64kdump.img 5.14.0-639.el9.x86_64
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 22 06:34:41 np0005531886.novalocal dracut[1307]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: memstrack is not available
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: memstrack is not available
Nov 22 06:34:42 np0005531886.novalocal dracut[1307]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 22 06:34:43 np0005531886.novalocal dracut[1307]: *** Including module: systemd ***
Nov 22 06:34:43 np0005531886.novalocal dracut[1307]: *** Including module: fips ***
Nov 22 06:34:43 np0005531886.novalocal dracut[1307]: *** Including module: systemd-initrd ***
Nov 22 06:34:43 np0005531886.novalocal dracut[1307]: *** Including module: i18n ***
Nov 22 06:34:44 np0005531886.novalocal dracut[1307]: *** Including module: drm ***
Nov 22 06:34:44 np0005531886.novalocal dracut[1307]: *** Including module: prefixdevname ***
Nov 22 06:34:44 np0005531886.novalocal dracut[1307]: *** Including module: kernel-modules ***
Nov 22 06:34:44 np0005531886.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 06:34:44 np0005531886.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]: *** Including module: kernel-modules-extra ***
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]: *** Including module: qemu ***
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]: *** Including module: fstab-sys ***
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]: *** Including module: rootfs-block ***
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]: *** Including module: terminfo ***
Nov 22 06:34:45 np0005531886.novalocal dracut[1307]: *** Including module: udev-rules ***
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: Skipping udev rule: 91-permissions.rules
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: *** Including module: virtiofs ***
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: *** Including module: dracut-systemd ***
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: *** Including module: usrmount ***
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: *** Including module: base ***
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: *** Including module: fs-lib ***
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: *** Including module: kdumpbase ***
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]:   microcode_ctl module: mangling fw_dir
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 22 06:34:46 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]: *** Including module: openssl ***
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]: *** Including module: shutdown ***
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]: *** Including module: squash ***
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]: *** Including modules done ***
Nov 22 06:34:47 np0005531886.novalocal dracut[1307]: *** Installing kernel module dependencies ***
Nov 22 06:34:48 np0005531886.novalocal dracut[1307]: *** Installing kernel module dependencies done ***
Nov 22 06:34:48 np0005531886.novalocal dracut[1307]: *** Resolving executable dependencies ***
Nov 22 06:34:50 np0005531886.novalocal dracut[1307]: *** Resolving executable dependencies done ***
Nov 22 06:34:50 np0005531886.novalocal dracut[1307]: *** Generating early-microcode cpio image ***
Nov 22 06:34:50 np0005531886.novalocal dracut[1307]: *** Store current command line parameters ***
Nov 22 06:34:50 np0005531886.novalocal dracut[1307]: Stored kernel commandline:
Nov 22 06:34:50 np0005531886.novalocal dracut[1307]: No dracut internal kernel commandline stored in the initramfs
Nov 22 06:34:51 np0005531886.novalocal dracut[1307]: *** Install squash loader ***
Nov 22 06:34:52 np0005531886.novalocal dracut[1307]: *** Squashing the files inside the initramfs ***
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: *** Squashing the files inside the initramfs done ***
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: *** Creating image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' ***
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: *** Hardlinking files ***
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: Mode:           real
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: Files:          50
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: Linked:         0 files
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: Compared:       0 xattrs
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: Compared:       0 files
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: Saved:          0 B
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: Duration:       0.000545 seconds
Nov 22 06:34:53 np0005531886.novalocal dracut[1307]: *** Hardlinking files done ***
Nov 22 06:34:54 np0005531886.novalocal dracut[1307]: *** Creating initramfs image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' done ***
Nov 22 06:34:55 np0005531886.novalocal kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Nov 22 06:34:55 np0005531886.novalocal kdumpctl[1018]: kdump: Starting kdump: [OK]
Nov 22 06:34:55 np0005531886.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 22 06:34:55 np0005531886.novalocal systemd[1]: Startup finished in 1.559s (kernel) + 4.002s (initrd) + 30.251s (userspace) = 35.813s.
Nov 22 06:34:55 np0005531886.novalocal sshd-session[4299]: Accepted publickey for zuul from 38.102.83.114 port 56340 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 22 06:34:55 np0005531886.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 22 06:34:55 np0005531886.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 22 06:34:55 np0005531886.novalocal systemd-logind[821]: New session 1 of user zuul.
Nov 22 06:34:55 np0005531886.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 22 06:34:55 np0005531886.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 22 06:34:55 np0005531886.novalocal systemd[4303]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Queued start job for default target Main User Target.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Created slice User Application Slice.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Reached target Paths.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Reached target Timers.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Starting D-Bus User Message Bus Socket...
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Starting Create User's Volatile Files and Directories...
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Finished Create User's Volatile Files and Directories.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Listening on D-Bus User Message Bus Socket.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Reached target Sockets.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Reached target Basic System.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Reached target Main User Target.
Nov 22 06:34:56 np0005531886.novalocal systemd[4303]: Startup finished in 194ms.
Nov 22 06:34:56 np0005531886.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 22 06:34:56 np0005531886.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 22 06:34:56 np0005531886.novalocal sshd-session[4299]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 06:34:56 np0005531886.novalocal python3[4385]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 06:35:02 np0005531886.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 06:35:07 np0005531886.novalocal python3[4415]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 06:35:16 np0005531886.novalocal python3[4473]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 06:35:17 np0005531886.novalocal python3[4513]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 22 06:35:19 np0005531886.novalocal python3[4539]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDIDDDD+fltt9cmDgcjLSkGENwZvQzj5XoQ8wGDcg2s6u+LVhotbjXRoCyQvkLrQ9+aYjFbt1JZ05PeSToOVkPdJ2l6AucsYKMFk7tKlgqYA0SfBQkQjrI4dYCIJp5Zl46tl+HQ7eT2kkERLJRgc1sNhw88jbxU83GEmQNcj9/Q6rj2r+/nIptD66sUseZ1GDb43Ao7zBSzRrD8HRZlEfDChNFod0RykV5phE1R5jhZzJ7KtwI8ovnac3+YT5JW3uK2sdRHHMkZyMiqLqGgsozncX0tlbDqQ6Td89rR3ia15IGC2ZhCwZ5c8vyHhHLG0eEjA73ADlY3cxVKkV8ULfKIWbZL7+AmS7WLvTbD3QSMnkFyuzpAbq/zrs1iZFaLNioOyXiKn0sdTX+CE+goDViTSGJIE8ELsdVZ1adwTqArvAG+Rek7RLJ0oiTWo43Kjdyfs/JYcGpxz+5HVoi4aE2g0M5qhLU7D/EmGa4VwYjui4rxXMlhIFmTsq1NgHSMlB8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:20 np0005531886.novalocal python3[4563]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:20 np0005531886.novalocal python3[4662]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:35:20 np0005531886.novalocal python3[4733]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793320.283635-251-166571850005417/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6bc74860ecfa49adaf1e65a536fcfd6f_id_rsa follow=False checksum=d1aad691a5f7d928d36e451e57eecb0570edc5f2 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:21 np0005531886.novalocal python3[4856]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:35:21 np0005531886.novalocal python3[4927]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793321.299946-306-96136824998488/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6bc74860ecfa49adaf1e65a536fcfd6f_id_rsa.pub follow=False checksum=5c64f06d32705901c18adda8251e89259a484c91 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:23 np0005531886.novalocal python3[4975]: ansible-ping Invoked with data=pong
Nov 22 06:35:24 np0005531886.novalocal python3[4999]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 06:35:26 np0005531886.novalocal python3[5057]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 22 06:35:27 np0005531886.novalocal python3[5089]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:27 np0005531886.novalocal python3[5113]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:28 np0005531886.novalocal python3[5137]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:28 np0005531886.novalocal python3[5161]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:29 np0005531886.novalocal python3[5185]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:29 np0005531886.novalocal python3[5209]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:30 np0005531886.novalocal sudo[5233]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypxyhxfdswpbukbafydvmsqyigoyreja ; /usr/bin/python3'
Nov 22 06:35:30 np0005531886.novalocal sudo[5233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:31 np0005531886.novalocal python3[5235]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:31 np0005531886.novalocal sudo[5233]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:31 np0005531886.novalocal sudo[5311]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lurqcsrzmrbxoxpmcqjpscvsaopzolgt ; /usr/bin/python3'
Nov 22 06:35:31 np0005531886.novalocal sudo[5311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:31 np0005531886.novalocal python3[5313]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:35:31 np0005531886.novalocal sudo[5311]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:32 np0005531886.novalocal sudo[5384]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smwocxmbexqlkognkhtxqjbbbafpzsqk ; /usr/bin/python3'
Nov 22 06:35:32 np0005531886.novalocal sudo[5384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:32 np0005531886.novalocal python3[5386]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793331.3167193-31-26751220527594/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:32 np0005531886.novalocal sudo[5384]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:33 np0005531886.novalocal python3[5434]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:33 np0005531886.novalocal python3[5458]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:33 np0005531886.novalocal python3[5482]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:33 np0005531886.novalocal python3[5506]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:34 np0005531886.novalocal python3[5530]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:34 np0005531886.novalocal python3[5554]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:34 np0005531886.novalocal python3[5578]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:34 np0005531886.novalocal python3[5602]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:35 np0005531886.novalocal python3[5626]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:35 np0005531886.novalocal python3[5650]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:35 np0005531886.novalocal python3[5674]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:36 np0005531886.novalocal python3[5698]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:36 np0005531886.novalocal python3[5722]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:36 np0005531886.novalocal python3[5746]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:36 np0005531886.novalocal python3[5770]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:37 np0005531886.novalocal python3[5794]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:37 np0005531886.novalocal python3[5818]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:37 np0005531886.novalocal python3[5842]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:38 np0005531886.novalocal python3[5866]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:38 np0005531886.novalocal python3[5890]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:38 np0005531886.novalocal python3[5914]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:38 np0005531886.novalocal python3[5938]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:39 np0005531886.novalocal python3[5962]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:39 np0005531886.novalocal python3[5986]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:39 np0005531886.novalocal python3[6010]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:40 np0005531886.novalocal python3[6034]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:35:42 np0005531886.novalocal sudo[6058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdqlwgxewfjgjcpniqjkzrbkkmqyhfnw ; /usr/bin/python3'
Nov 22 06:35:42 np0005531886.novalocal sudo[6058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:42 np0005531886.novalocal python3[6060]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 22 06:35:42 np0005531886.novalocal systemd[1]: Starting Time & Date Service...
Nov 22 06:35:42 np0005531886.novalocal systemd[1]: Started Time & Date Service.
Nov 22 06:35:42 np0005531886.novalocal systemd-timedated[6062]: Changed time zone to 'UTC' (UTC).
Nov 22 06:35:43 np0005531886.novalocal sudo[6058]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:44 np0005531886.novalocal sudo[6089]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkvkzulmhdlllepqzrbbqwqkbloiogkp ; /usr/bin/python3'
Nov 22 06:35:44 np0005531886.novalocal sudo[6089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:44 np0005531886.novalocal python3[6091]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:44 np0005531886.novalocal sudo[6089]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:45 np0005531886.novalocal python3[6167]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:35:45 np0005531886.novalocal python3[6238]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763793345.0816002-251-146015838858154/source _original_basename=tmpn8f2pvu0 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:46 np0005531886.novalocal python3[6338]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:35:46 np0005531886.novalocal python3[6409]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763793345.9411547-301-41949518680532/source _original_basename=tmp8u_sxph3 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:47 np0005531886.novalocal sudo[6509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzavgbjxmawvxvjwpduhybrdcqfsggiw ; /usr/bin/python3'
Nov 22 06:35:47 np0005531886.novalocal sudo[6509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:47 np0005531886.novalocal python3[6511]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:35:47 np0005531886.novalocal sudo[6509]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:47 np0005531886.novalocal sudo[6582]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrjxroekorneeezlunmdiscmbjxyhfdj ; /usr/bin/python3'
Nov 22 06:35:47 np0005531886.novalocal sudo[6582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:47 np0005531886.novalocal python3[6584]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763793347.0391786-381-121869229759425/source _original_basename=tmpgongosxy follow=False checksum=0eb5322b65ee02e6c6f1c11e26c1221ac339918b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:47 np0005531886.novalocal sudo[6582]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:48 np0005531886.novalocal python3[6632]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:35:48 np0005531886.novalocal python3[6658]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:35:48 np0005531886.novalocal sudo[6736]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvcdoxucmnveszqnzagdlqfgdeyvrqlr ; /usr/bin/python3'
Nov 22 06:35:48 np0005531886.novalocal sudo[6736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:48 np0005531886.novalocal python3[6738]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:35:48 np0005531886.novalocal sudo[6736]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:49 np0005531886.novalocal sudo[6809]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xohgjniijdrfvckoxyfgoqdwfwutttas ; /usr/bin/python3'
Nov 22 06:35:49 np0005531886.novalocal sudo[6809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:49 np0005531886.novalocal python3[6811]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793348.6500192-451-31827491506067/source _original_basename=tmpifgca4wc follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:35:49 np0005531886.novalocal sudo[6809]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:49 np0005531886.novalocal sudo[6860]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uixybwzarqqlbwqkzwudugzzybaiwnoy ; /usr/bin/python3'
Nov 22 06:35:49 np0005531886.novalocal sudo[6860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:35:50 np0005531886.novalocal python3[6862]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-2f73-06ac-00000000001f-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:35:50 np0005531886.novalocal sudo[6860]: pam_unix(sudo:session): session closed for user root
Nov 22 06:35:50 np0005531886.novalocal irqbalance[816]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 22 06:35:50 np0005531886.novalocal irqbalance[816]: IRQ 26 affinity is now unmanaged
Nov 22 06:35:50 np0005531886.novalocal python3[6890]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163e3b-3c83-2f73-06ac-000000000020-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 22 06:35:52 np0005531886.novalocal python3[6918]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:36:13 np0005531886.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 06:36:28 np0005531886.novalocal sudo[6945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cffiabexylgligjshduwhftchmsjjwnn ; /usr/bin/python3'
Nov 22 06:36:28 np0005531886.novalocal sudo[6945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:36:28 np0005531886.novalocal python3[6947]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:36:28 np0005531886.novalocal sudo[6945]: pam_unix(sudo:session): session closed for user root
Nov 22 06:37:12 np0005531886.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 22 06:37:12 np0005531886.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 22 06:37:12 np0005531886.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 22 06:37:12 np0005531886.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 22 06:37:12 np0005531886.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 22 06:37:12 np0005531886.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 22 06:37:13 np0005531886.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 22 06:37:13 np0005531886.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 22 06:37:13 np0005531886.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 22 06:37:13 np0005531886.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0441] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 06:37:13 np0005531886.novalocal systemd-udevd[6949]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0635] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0679] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0686] device (eth1): carrier: link connected
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0689] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0699] policy: auto-activating connection 'Wired connection 1' (404dc1b5-53f2-3896-aa44-c017e0bc287f)
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0706] device (eth1): Activation: starting connection 'Wired connection 1' (404dc1b5-53f2-3896-aa44-c017e0bc287f)
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0707] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0713] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0719] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 06:37:13 np0005531886.novalocal NetworkManager[858]: <info>  [1763793433.0726] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 06:37:13 np0005531886.novalocal systemd[4303]: Starting Mark boot as successful...
Nov 22 06:37:13 np0005531886.novalocal systemd[4303]: Finished Mark boot as successful.
Nov 22 06:37:13 np0005531886.novalocal python3[6976]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-e99d-39a2-000000000128-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:37:20 np0005531886.novalocal sudo[7054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhilxuiziyjlaubkfarrqkzksmbgjgjr ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 06:37:20 np0005531886.novalocal sudo[7054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:37:21 np0005531886.novalocal python3[7056]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:37:21 np0005531886.novalocal sudo[7054]: pam_unix(sudo:session): session closed for user root
Nov 22 06:37:21 np0005531886.novalocal sudo[7127]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpqyntiwevsbaahqdlgbephkwqvyxsfq ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 06:37:21 np0005531886.novalocal sudo[7127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:37:21 np0005531886.novalocal python3[7129]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763793440.7038898-104-135057197098652/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=353afb1c58f55cef2c93836e2eaa5777e8c6c70c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:37:21 np0005531886.novalocal sudo[7127]: pam_unix(sudo:session): session closed for user root
Nov 22 06:37:21 np0005531886.novalocal sudo[7177]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trwfwnrfgvsxqpixueyjmjnxlyfivunn ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 06:37:21 np0005531886.novalocal sudo[7177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:37:22 np0005531886.novalocal python3[7179]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Stopping Network Manager...
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[858]: <info>  [1763793442.2886] caught SIGTERM, shutting down normally.
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[858]: <info>  [1763793442.2901] dhcp4 (eth0): canceled DHCP transaction
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[858]: <info>  [1763793442.2902] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[858]: <info>  [1763793442.2902] dhcp4 (eth0): state changed no lease
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[858]: <info>  [1763793442.2907] manager: NetworkManager state is now CONNECTING
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[858]: <info>  [1763793442.3073] dhcp4 (eth1): canceled DHCP transaction
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[858]: <info>  [1763793442.3074] dhcp4 (eth1): state changed no lease
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[858]: <info>  [1763793442.3128] exiting (success)
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Stopped Network Manager.
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: NetworkManager.service: Consumed 1.413s CPU time, 9.9M memory peak.
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Starting Network Manager...
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.3541] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:a2036d99-efbc-42fa-8cf6-98ac98e337b2)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.3544] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.3596] manager[0x556b0b209070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Starting Hostname Service...
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Started Hostname Service.
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4444] hostname: hostname: using hostnamed
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4445] hostname: static hostname changed from (none) to "np0005531886.novalocal"
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4453] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4460] manager[0x556b0b209070]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4460] manager[0x556b0b209070]: rfkill: WWAN hardware radio set enabled
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4489] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4490] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4490] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4491] manager: Networking is enabled by state file
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4493] settings: Loaded settings plugin: keyfile (internal)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4497] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4522] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4533] dhcp: init: Using DHCP client 'internal'
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4536] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4541] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4546] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4555] device (lo): Activation: starting connection 'lo' (f5c5737f-5ac0-4563-b7a5-22075693542b)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4562] device (eth0): carrier: link connected
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4567] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4572] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4573] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4580] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4587] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4595] device (eth1): carrier: link connected
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4599] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4606] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (404dc1b5-53f2-3896-aa44-c017e0bc287f) (indicated)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4606] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4614] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4622] device (eth1): Activation: starting connection 'Wired connection 1' (404dc1b5-53f2-3896-aa44-c017e0bc287f)
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Started Network Manager.
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4629] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4634] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4636] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4638] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4640] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4643] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4646] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4648] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4649] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4656] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4658] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4664] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4666] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4695] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4697] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4701] device (lo): Activation: successful, device activated.
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4708] dhcp4 (eth0): state changed new lease, address=38.129.56.29
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4713] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 06:37:22 np0005531886.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4786] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4805] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4806] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4809] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4812] device (eth0): Activation: successful, device activated.
Nov 22 06:37:22 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793442.4817] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 06:37:22 np0005531886.novalocal sudo[7177]: pam_unix(sudo:session): session closed for user root
Nov 22 06:37:22 np0005531886.novalocal python3[7263]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-e99d-39a2-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:37:32 np0005531886.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 06:37:52 np0005531886.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3090] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 06:38:07 np0005531886.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 06:38:07 np0005531886.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3446] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3449] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3459] device (eth1): Activation: successful, device activated.
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3466] manager: startup complete
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3467] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <warn>  [1763793487.3471] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3479] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 22 06:38:07 np0005531886.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3543] dhcp4 (eth1): canceled DHCP transaction
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3543] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3543] dhcp4 (eth1): state changed no lease
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3558] policy: auto-activating connection 'ci-private-network' (cbb9ac2c-e3a8-55fd-9244-47a9ee91a7c3)
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3562] device (eth1): Activation: starting connection 'ci-private-network' (cbb9ac2c-e3a8-55fd-9244-47a9ee91a7c3)
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3563] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3564] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3570] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3577] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3919] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3921] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 06:38:07 np0005531886.novalocal NetworkManager[7183]: <info>  [1763793487.3926] device (eth1): Activation: successful, device activated.
Nov 22 06:38:17 np0005531886.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 06:38:22 np0005531886.novalocal sshd-session[4312]: Received disconnect from 38.102.83.114 port 56340:11: disconnected by user
Nov 22 06:38:22 np0005531886.novalocal sshd-session[4312]: Disconnected from user zuul 38.102.83.114 port 56340
Nov 22 06:38:22 np0005531886.novalocal sshd-session[4299]: pam_unix(sshd:session): session closed for user zuul
Nov 22 06:38:22 np0005531886.novalocal systemd-logind[821]: Session 1 logged out. Waiting for processes to exit.
Nov 22 06:39:16 np0005531886.novalocal sshd-session[7291]: Accepted publickey for zuul from 38.102.83.114 port 55254 ssh2: RSA SHA256:7oNZ/bEUHjK/egwXGXBg4W9ef54zFnfkETNnlGSYLiw
Nov 22 06:39:16 np0005531886.novalocal systemd-logind[821]: New session 3 of user zuul.
Nov 22 06:39:16 np0005531886.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 22 06:39:16 np0005531886.novalocal sshd-session[7291]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 06:39:16 np0005531886.novalocal sudo[7370]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lprupzubhoufmoxcahbmbaaxdrgmfmvw ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 06:39:16 np0005531886.novalocal sudo[7370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:39:16 np0005531886.novalocal python3[7372]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:39:16 np0005531886.novalocal sudo[7370]: pam_unix(sudo:session): session closed for user root
Nov 22 06:39:17 np0005531886.novalocal sudo[7443]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmrmuvxyjokmozrcsvclujfrzrohyefz ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 22 06:39:17 np0005531886.novalocal sudo[7443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:39:17 np0005531886.novalocal python3[7445]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793556.5775464-365-143125438183565/source _original_basename=tmpi1by2d0s follow=False checksum=ec5da2e3f9737eb58d2ca927fe651700c5f6760b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:39:17 np0005531886.novalocal sudo[7443]: pam_unix(sudo:session): session closed for user root
Nov 22 06:39:21 np0005531886.novalocal sshd-session[7294]: Connection closed by 38.102.83.114 port 55254
Nov 22 06:39:21 np0005531886.novalocal sshd-session[7291]: pam_unix(sshd:session): session closed for user zuul
Nov 22 06:39:21 np0005531886.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 22 06:39:21 np0005531886.novalocal systemd-logind[821]: Session 3 logged out. Waiting for processes to exit.
Nov 22 06:39:21 np0005531886.novalocal systemd-logind[821]: Removed session 3.
Nov 22 06:40:16 np0005531886.novalocal systemd[4303]: Created slice User Background Tasks Slice.
Nov 22 06:40:16 np0005531886.novalocal systemd[4303]: Starting Cleanup of User's Temporary Files and Directories...
Nov 22 06:40:16 np0005531886.novalocal systemd[4303]: Finished Cleanup of User's Temporary Files and Directories.
Nov 22 06:45:31 np0005531886.novalocal sshd-session[7476]: Accepted publickey for zuul from 38.102.83.114 port 42750 ssh2: RSA SHA256:7oNZ/bEUHjK/egwXGXBg4W9ef54zFnfkETNnlGSYLiw
Nov 22 06:45:31 np0005531886.novalocal systemd-logind[821]: New session 4 of user zuul.
Nov 22 06:45:32 np0005531886.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 22 06:45:32 np0005531886.novalocal sshd-session[7476]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 06:45:32 np0005531886.novalocal sudo[7503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkvdmlnbonaltxvpujlqnjssnhnlxxpm ; /usr/bin/python3'
Nov 22 06:45:32 np0005531886.novalocal sudo[7503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:32 np0005531886.novalocal python3[7505]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-eb85-1f76-000000000ca6-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:45:32 np0005531886.novalocal sudo[7503]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:32 np0005531886.novalocal sudo[7531]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqalvzcotykmesqidbmgtlaadntndnwl ; /usr/bin/python3'
Nov 22 06:45:32 np0005531886.novalocal sudo[7531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:32 np0005531886.novalocal python3[7533]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:45:32 np0005531886.novalocal sudo[7531]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:32 np0005531886.novalocal sudo[7558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnwmonkognlblrllctrxzgiezucoaowa ; /usr/bin/python3'
Nov 22 06:45:32 np0005531886.novalocal sudo[7558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:33 np0005531886.novalocal python3[7560]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:45:33 np0005531886.novalocal sudo[7558]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:33 np0005531886.novalocal sudo[7584]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wywninpwebglgzwlxohsyfsbgbacyxtr ; /usr/bin/python3'
Nov 22 06:45:33 np0005531886.novalocal sudo[7584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:33 np0005531886.novalocal python3[7586]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:45:33 np0005531886.novalocal sudo[7584]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:33 np0005531886.novalocal sudo[7610]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnebawhlvewtejfsnufmfdemjtmeowql ; /usr/bin/python3'
Nov 22 06:45:33 np0005531886.novalocal sudo[7610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:33 np0005531886.novalocal python3[7612]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:45:33 np0005531886.novalocal sudo[7610]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:33 np0005531886.novalocal sudo[7636]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xokcljzwsjzamzsnedmwrpfpbuqiznnf ; /usr/bin/python3'
Nov 22 06:45:33 np0005531886.novalocal sudo[7636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:33 np0005531886.novalocal python3[7638]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:45:33 np0005531886.novalocal sudo[7636]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:34 np0005531886.novalocal sudo[7714]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsirraaygswtrsdqhfpuubogjhgpeyfu ; /usr/bin/python3'
Nov 22 06:45:34 np0005531886.novalocal sudo[7714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:34 np0005531886.novalocal python3[7716]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:45:34 np0005531886.novalocal sudo[7714]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:34 np0005531886.novalocal sudo[7787]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zchrvuakdysvodfepncmcwwgdfgrtpip ; /usr/bin/python3'
Nov 22 06:45:34 np0005531886.novalocal sudo[7787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:34 np0005531886.novalocal python3[7789]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763793934.1337886-367-237569000603368/source _original_basename=tmpx4w6l3lz follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:45:34 np0005531886.novalocal sudo[7787]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:35 np0005531886.novalocal sudo[7837]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcoqooztetbduwptjpcyokjrijlqrxxy ; /usr/bin/python3'
Nov 22 06:45:35 np0005531886.novalocal sudo[7837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:35 np0005531886.novalocal python3[7839]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 06:45:35 np0005531886.novalocal systemd[1]: Reloading.
Nov 22 06:45:35 np0005531886.novalocal systemd-rc-local-generator[7862]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 06:45:36 np0005531886.novalocal sudo[7837]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:37 np0005531886.novalocal sudo[7893]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vniofftctuuogoxjyclsmxfarqlzkwxq ; /usr/bin/python3'
Nov 22 06:45:37 np0005531886.novalocal sudo[7893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:37 np0005531886.novalocal python3[7895]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 22 06:45:37 np0005531886.novalocal sudo[7893]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:37 np0005531886.novalocal sudo[7920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqbhcfnmrtfkgzuoxmtsafnwayxxhwdq ; /usr/bin/python3'
Nov 22 06:45:37 np0005531886.novalocal sudo[7920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:37 np0005531886.novalocal python3[7922]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:45:37 np0005531886.novalocal sudo[7920]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:38 np0005531886.novalocal sudo[7948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwejuxonjjbmedocaahgwrrvbhfzfkgv ; /usr/bin/python3'
Nov 22 06:45:38 np0005531886.novalocal sudo[7948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:38 np0005531886.novalocal python3[7950]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:45:38 np0005531886.novalocal sudo[7948]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:38 np0005531886.novalocal sudo[7976]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdujwpujzeaomhezcbxxdvwaojwxtwef ; /usr/bin/python3'
Nov 22 06:45:38 np0005531886.novalocal sudo[7976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:38 np0005531886.novalocal python3[7978]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:45:38 np0005531886.novalocal sudo[7976]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:38 np0005531886.novalocal sudo[8004]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihufpepvcoxhdvkkidoekhbisbyodvzn ; /usr/bin/python3'
Nov 22 06:45:38 np0005531886.novalocal sudo[8004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:38 np0005531886.novalocal python3[8006]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:45:38 np0005531886.novalocal sudo[8004]: pam_unix(sudo:session): session closed for user root
Nov 22 06:45:39 np0005531886.novalocal python3[8033]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163e3b-3c83-eb85-1f76-000000000cad-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:45:39 np0005531886.novalocal python3[8063]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 22 06:45:42 np0005531886.novalocal sshd-session[7479]: Connection closed by 38.102.83.114 port 42750
Nov 22 06:45:42 np0005531886.novalocal sshd-session[7476]: pam_unix(sshd:session): session closed for user zuul
Nov 22 06:45:42 np0005531886.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 22 06:45:42 np0005531886.novalocal systemd[1]: session-4.scope: Consumed 4.723s CPU time.
Nov 22 06:45:42 np0005531886.novalocal systemd-logind[821]: Session 4 logged out. Waiting for processes to exit.
Nov 22 06:45:42 np0005531886.novalocal systemd-logind[821]: Removed session 4.
Nov 22 06:45:44 np0005531886.novalocal sshd-session[8068]: Accepted publickey for zuul from 38.102.83.114 port 38074 ssh2: RSA SHA256:7oNZ/bEUHjK/egwXGXBg4W9ef54zFnfkETNnlGSYLiw
Nov 22 06:45:44 np0005531886.novalocal systemd-logind[821]: New session 5 of user zuul.
Nov 22 06:45:44 np0005531886.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 22 06:45:44 np0005531886.novalocal sshd-session[8068]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 06:45:44 np0005531886.novalocal sudo[8095]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weirzjgnxeebzcletedohrcgnrazkfug ; /usr/bin/python3'
Nov 22 06:45:44 np0005531886.novalocal sudo[8095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:45:44 np0005531886.novalocal python3[8097]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 22 06:46:02 np0005531886.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 22 06:46:02 np0005531886.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 06:46:02 np0005531886.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 22 06:46:02 np0005531886.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 06:46:02 np0005531886.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 22 06:46:02 np0005531886.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 06:46:02 np0005531886.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 06:46:02 np0005531886.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 06:46:14 np0005531886.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 22 06:46:14 np0005531886.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 06:46:14 np0005531886.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 22 06:46:14 np0005531886.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 06:46:14 np0005531886.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 22 06:46:14 np0005531886.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 06:46:14 np0005531886.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 06:46:14 np0005531886.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 06:46:23 np0005531886.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 22 06:46:23 np0005531886.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 06:46:23 np0005531886.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 22 06:46:23 np0005531886.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 06:46:23 np0005531886.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 22 06:46:23 np0005531886.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 06:46:23 np0005531886.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 06:46:23 np0005531886.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 06:46:24 np0005531886.novalocal setsebool[8165]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 22 06:46:24 np0005531886.novalocal setsebool[8165]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 22 06:46:46 np0005531886.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 22 06:46:46 np0005531886.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 06:46:46 np0005531886.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 22 06:46:46 np0005531886.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 06:46:46 np0005531886.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 22 06:46:46 np0005531886.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 06:46:46 np0005531886.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 06:46:46 np0005531886.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 06:47:08 np0005531886.novalocal dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 22 06:47:08 np0005531886.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 06:47:08 np0005531886.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 22 06:47:08 np0005531886.novalocal systemd[1]: Reloading.
Nov 22 06:47:08 np0005531886.novalocal systemd-rc-local-generator[8917]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 06:47:08 np0005531886.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 06:47:09 np0005531886.novalocal sudo[8095]: pam_unix(sudo:session): session closed for user root
Nov 22 06:47:20 np0005531886.novalocal irqbalance[816]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 22 06:47:20 np0005531886.novalocal irqbalance[816]: IRQ 27 affinity is now unmanaged
Nov 22 06:47:54 np0005531886.novalocal python3[24695]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163e3b-3c83-d633-179b-00000000000c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 06:47:55 np0005531886.novalocal kernel: evm: overlay not supported
Nov 22 06:47:56 np0005531886.novalocal systemd[4303]: Starting D-Bus User Message Bus...
Nov 22 06:47:56 np0005531886.novalocal dbus-broker-launch[25230]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 22 06:47:56 np0005531886.novalocal dbus-broker-launch[25230]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 22 06:47:56 np0005531886.novalocal systemd[4303]: Started D-Bus User Message Bus.
Nov 22 06:47:56 np0005531886.novalocal dbus-broker-lau[25230]: Ready
Nov 22 06:47:56 np0005531886.novalocal systemd[4303]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 22 06:47:56 np0005531886.novalocal systemd[4303]: Created slice Slice /user.
Nov 22 06:47:56 np0005531886.novalocal systemd[4303]: podman-25073.scope: unit configures an IP firewall, but not running as root.
Nov 22 06:47:56 np0005531886.novalocal systemd[4303]: (This warning is only shown for the first unit using IP firewalling.)
Nov 22 06:47:56 np0005531886.novalocal systemd[4303]: Started podman-25073.scope.
Nov 22 06:47:56 np0005531886.novalocal systemd[4303]: Started podman-pause-15fd778b.scope.
Nov 22 06:47:58 np0005531886.novalocal sudo[25929]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgfxkzzcekutjcmwvsafoubodqblboqm ; /usr/bin/python3'
Nov 22 06:47:58 np0005531886.novalocal sudo[25929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:47:58 np0005531886.novalocal python3[25939]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.155:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.155:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:47:58 np0005531886.novalocal python3[25939]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 22 06:47:58 np0005531886.novalocal sudo[25929]: pam_unix(sudo:session): session closed for user root
Nov 22 06:47:59 np0005531886.novalocal sshd-session[8071]: Connection closed by 38.102.83.114 port 38074
Nov 22 06:47:59 np0005531886.novalocal sshd-session[8068]: pam_unix(sshd:session): session closed for user zuul
Nov 22 06:47:59 np0005531886.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 22 06:47:59 np0005531886.novalocal systemd[1]: session-5.scope: Consumed 1min 6.460s CPU time.
Nov 22 06:47:59 np0005531886.novalocal systemd-logind[821]: Session 5 logged out. Waiting for processes to exit.
Nov 22 06:47:59 np0005531886.novalocal systemd-logind[821]: Removed session 5.
Nov 22 06:48:11 np0005531886.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 06:48:11 np0005531886.novalocal systemd[1]: Finished man-db-cache-update.service.
Nov 22 06:48:11 np0005531886.novalocal systemd[1]: man-db-cache-update.service: Consumed 1min 7.250s CPU time.
Nov 22 06:48:11 np0005531886.novalocal systemd[1]: run-r648cbdd64eb8495eb37f18beab60032d.service: Deactivated successfully.
Nov 22 06:48:23 np0005531886.novalocal sshd-session[29573]: Connection closed by 38.129.56.219 port 51478 [preauth]
Nov 22 06:48:23 np0005531886.novalocal sshd-session[29574]: Connection closed by 38.129.56.219 port 51490 [preauth]
Nov 22 06:48:24 np0005531886.novalocal sshd-session[29572]: Unable to negotiate with 38.129.56.219 port 51508: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 22 06:48:24 np0005531886.novalocal sshd-session[29575]: Unable to negotiate with 38.129.56.219 port 51500: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 22 06:48:24 np0005531886.novalocal sshd-session[29577]: Unable to negotiate with 38.129.56.219 port 51516: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 22 06:48:31 np0005531886.novalocal sshd-session[29582]: Accepted publickey for zuul from 38.102.83.114 port 48142 ssh2: RSA SHA256:7oNZ/bEUHjK/egwXGXBg4W9ef54zFnfkETNnlGSYLiw
Nov 22 06:48:31 np0005531886.novalocal systemd-logind[821]: New session 6 of user zuul.
Nov 22 06:48:31 np0005531886.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 22 06:48:31 np0005531886.novalocal sshd-session[29582]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 06:48:31 np0005531886.novalocal python3[29609]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:48:31 np0005531886.novalocal sudo[29633]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkhffbiqfeyydmrszypkipnpatpjouku ; /usr/bin/python3'
Nov 22 06:48:31 np0005531886.novalocal sudo[29633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:48:31 np0005531886.novalocal python3[29635]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:48:31 np0005531886.novalocal sudo[29633]: pam_unix(sudo:session): session closed for user root
Nov 22 06:48:32 np0005531886.novalocal sudo[29659]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkebfnpnlappopkyhbovvkfsfnlkqrpp ; /usr/bin/python3'
Nov 22 06:48:32 np0005531886.novalocal sudo[29659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:48:33 np0005531886.novalocal python3[29661]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005531886.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 22 06:48:33 np0005531886.novalocal useradd[29663]: new group: name=cloud-admin, GID=1002
Nov 22 06:48:33 np0005531886.novalocal useradd[29663]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 22 06:48:33 np0005531886.novalocal sudo[29659]: pam_unix(sudo:session): session closed for user root
Nov 22 06:48:33 np0005531886.novalocal sudo[29693]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewfyiipyxrubqfjqkoraeanjytadnzqk ; /usr/bin/python3'
Nov 22 06:48:33 np0005531886.novalocal sudo[29693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:48:34 np0005531886.novalocal python3[29695]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOpFAjaSiSlb3Z20Y0m04CgcPrwNFzNBBf5oLwBoYILNoMPdmUatHE9iyvTfqyXv8EDwL6ikMKNecwZLodb/nJI= zuul@np0005531885.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 22 06:48:34 np0005531886.novalocal sudo[29693]: pam_unix(sudo:session): session closed for user root
Nov 22 06:48:34 np0005531886.novalocal sudo[29771]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojmfszohpznjuoymbthvvsbyttxlpcej ; /usr/bin/python3'
Nov 22 06:48:34 np0005531886.novalocal sudo[29771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:48:34 np0005531886.novalocal python3[29773]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:48:34 np0005531886.novalocal sudo[29771]: pam_unix(sudo:session): session closed for user root
Nov 22 06:48:34 np0005531886.novalocal sudo[29844]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edfgknwqijvagcctwviemhgdgqjvolnp ; /usr/bin/python3'
Nov 22 06:48:34 np0005531886.novalocal sudo[29844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:48:35 np0005531886.novalocal python3[29846]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763794114.1793363-167-111018957570671/source _original_basename=tmpqw8kabs9 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:48:35 np0005531886.novalocal sudo[29844]: pam_unix(sudo:session): session closed for user root
Nov 22 06:48:35 np0005531886.novalocal sudo[29894]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akldfieissihvtsqfdjehwpruezmszoc ; /usr/bin/python3'
Nov 22 06:48:35 np0005531886.novalocal sudo[29894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:48:36 np0005531886.novalocal python3[29896]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 22 06:48:36 np0005531886.novalocal systemd[1]: Starting Hostname Service...
Nov 22 06:48:36 np0005531886.novalocal systemd[1]: Started Hostname Service.
Nov 22 06:48:36 np0005531886.novalocal systemd-hostnamed[29900]: Changed pretty hostname to 'compute-0'
Nov 22 06:48:36 compute-0 systemd-hostnamed[29900]: Hostname set to <compute-0> (static)
Nov 22 06:48:36 compute-0 NetworkManager[7183]: <info>  [1763794116.1525] hostname: static hostname changed from "np0005531886.novalocal" to "compute-0"
Nov 22 06:48:36 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 06:48:36 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 06:48:36 compute-0 sudo[29894]: pam_unix(sudo:session): session closed for user root
Nov 22 06:48:36 compute-0 sshd-session[29585]: Connection closed by 38.102.83.114 port 48142
Nov 22 06:48:36 compute-0 sshd-session[29582]: pam_unix(sshd:session): session closed for user zuul
Nov 22 06:48:36 compute-0 systemd-logind[821]: Session 6 logged out. Waiting for processes to exit.
Nov 22 06:48:36 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Nov 22 06:48:36 compute-0 systemd[1]: session-6.scope: Consumed 2.600s CPU time.
Nov 22 06:48:36 compute-0 systemd-logind[821]: Removed session 6.
Nov 22 06:48:46 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 06:49:06 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 06:50:06 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 22 06:50:06 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 22 06:50:06 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 22 06:50:06 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 22 06:55:33 compute-0 sshd-session[29924]: Accepted publickey for zuul from 38.129.56.219 port 56074 ssh2: RSA SHA256:7oNZ/bEUHjK/egwXGXBg4W9ef54zFnfkETNnlGSYLiw
Nov 22 06:55:33 compute-0 systemd-logind[821]: New session 7 of user zuul.
Nov 22 06:55:33 compute-0 systemd[1]: Started Session 7 of User zuul.
Nov 22 06:55:33 compute-0 sshd-session[29924]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 06:55:34 compute-0 python3[30000]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 06:55:36 compute-0 sudo[30114]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdmphmwsedauvbbflfvvvramibwdmlxh ; /usr/bin/python3'
Nov 22 06:55:36 compute-0 sudo[30114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:36 compute-0 python3[30116]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:55:36 compute-0 sudo[30114]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:36 compute-0 sudo[30187]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rguvspwzbzajimfhjdnxgoidybfwskid ; /usr/bin/python3'
Nov 22 06:55:36 compute-0 sudo[30187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:36 compute-0 python3[30189]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.942976-33952-149672968383414/source mode=0755 _original_basename=delorean.repo follow=False checksum=1830be8248976a7f714fb01ca8550e92dfc79ad2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:55:36 compute-0 sudo[30187]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:36 compute-0 sudo[30213]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjzojzthoucbokdksqbhtttkavfizgmi ; /usr/bin/python3'
Nov 22 06:55:36 compute-0 sudo[30213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:36 compute-0 python3[30215]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:55:36 compute-0 sudo[30213]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:37 compute-0 sudo[30286]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjymyxavnbdiuzvgqbubmhicrxxyowba ; /usr/bin/python3'
Nov 22 06:55:37 compute-0 sudo[30286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:37 compute-0 python3[30288]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.942976-33952-149672968383414/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:55:37 compute-0 sudo[30286]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:37 compute-0 sudo[30312]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wixajzmgmhybcyghfhuzabxbvvycnkkg ; /usr/bin/python3'
Nov 22 06:55:37 compute-0 sudo[30312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:37 compute-0 python3[30314]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:55:37 compute-0 sudo[30312]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:37 compute-0 sudo[30385]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfdsjiprecozvyoxtbfbzjypvcmskmsq ; /usr/bin/python3'
Nov 22 06:55:37 compute-0 sudo[30385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:37 compute-0 python3[30387]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.942976-33952-149672968383414/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:55:37 compute-0 sudo[30385]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:37 compute-0 sudo[30411]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifpsnxcsxesixjavllpguqxnxbvaxron ; /usr/bin/python3'
Nov 22 06:55:37 compute-0 sudo[30411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:38 compute-0 python3[30413]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:55:38 compute-0 sudo[30411]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:38 compute-0 sudo[30484]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbpcqgenahtgsjdcortnzhbicrezxggr ; /usr/bin/python3'
Nov 22 06:55:38 compute-0 sudo[30484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:38 compute-0 python3[30486]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.942976-33952-149672968383414/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:55:38 compute-0 sudo[30484]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:38 compute-0 sudo[30510]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evtgaphbchdlacmkefbmviimroivxejo ; /usr/bin/python3'
Nov 22 06:55:38 compute-0 sudo[30510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:38 compute-0 python3[30512]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:55:38 compute-0 sudo[30510]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:38 compute-0 sudo[30583]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwudnwrlnrbrmonztwqduwlsndqtsewg ; /usr/bin/python3'
Nov 22 06:55:38 compute-0 sudo[30583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:38 compute-0 python3[30585]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.942976-33952-149672968383414/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:55:38 compute-0 sudo[30583]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:39 compute-0 sudo[30609]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgggeayzwjjeazsypjrrikerpdgoohiv ; /usr/bin/python3'
Nov 22 06:55:39 compute-0 sudo[30609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:39 compute-0 python3[30611]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:55:39 compute-0 sudo[30609]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:39 compute-0 sudo[30682]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhhuraconcxshrpccseuqtsdaagcyask ; /usr/bin/python3'
Nov 22 06:55:39 compute-0 sudo[30682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:39 compute-0 python3[30684]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.942976-33952-149672968383414/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:55:39 compute-0 sudo[30682]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:39 compute-0 sudo[30708]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzdxnwaadkkzgemopcmcsgxitrqpmltm ; /usr/bin/python3'
Nov 22 06:55:39 compute-0 sudo[30708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:39 compute-0 python3[30710]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 22 06:55:39 compute-0 sudo[30708]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:39 compute-0 sudo[30781]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oekaxqawepampjluudjlredglowjiypa ; /usr/bin/python3'
Nov 22 06:55:39 compute-0 sudo[30781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 06:55:40 compute-0 python3[30783]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763794535.942976-33952-149672968383414/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 06:55:40 compute-0 sudo[30781]: pam_unix(sudo:session): session closed for user root
Nov 22 06:55:42 compute-0 sshd-session[30809]: Connection closed by 192.168.122.11 port 38510 [preauth]
Nov 22 06:55:42 compute-0 sshd-session[30808]: Connection closed by 192.168.122.11 port 38496 [preauth]
Nov 22 06:55:42 compute-0 sshd-session[30811]: Unable to negotiate with 192.168.122.11 port 38526: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 22 06:55:42 compute-0 sshd-session[30810]: Unable to negotiate with 192.168.122.11 port 38534: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 22 06:55:42 compute-0 sshd-session[30812]: Unable to negotiate with 192.168.122.11 port 38514: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 22 06:55:49 compute-0 python3[30841]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:00:48 compute-0 sshd-session[29927]: Received disconnect from 38.129.56.219 port 56074:11: disconnected by user
Nov 22 07:00:48 compute-0 sshd-session[29927]: Disconnected from user zuul 38.129.56.219 port 56074
Nov 22 07:00:48 compute-0 sshd-session[29924]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:00:48 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Nov 22 07:00:48 compute-0 systemd[1]: session-7.scope: Consumed 4.751s CPU time.
Nov 22 07:00:48 compute-0 systemd-logind[821]: Session 7 logged out. Waiting for processes to exit.
Nov 22 07:00:48 compute-0 systemd-logind[821]: Removed session 7.
Nov 22 07:01:01 compute-0 CROND[30847]: (root) CMD (run-parts /etc/cron.hourly)
Nov 22 07:01:01 compute-0 run-parts[30850]: (/etc/cron.hourly) starting 0anacron
Nov 22 07:01:01 compute-0 anacron[30858]: Anacron started on 2025-11-22
Nov 22 07:01:01 compute-0 anacron[30858]: Will run job `cron.daily' in 35 min.
Nov 22 07:01:01 compute-0 anacron[30858]: Will run job `cron.weekly' in 55 min.
Nov 22 07:01:01 compute-0 anacron[30858]: Will run job `cron.monthly' in 75 min.
Nov 22 07:01:01 compute-0 anacron[30858]: Jobs will be executed sequentially
Nov 22 07:01:01 compute-0 run-parts[30860]: (/etc/cron.hourly) finished 0anacron
Nov 22 07:01:01 compute-0 CROND[30846]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 22 07:11:49 compute-0 sshd-session[30866]: Accepted publickey for zuul from 192.168.122.30 port 37842 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:11:49 compute-0 systemd-logind[821]: New session 8 of user zuul.
Nov 22 07:11:49 compute-0 systemd[1]: Started Session 8 of User zuul.
Nov 22 07:11:49 compute-0 sshd-session[30866]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:11:50 compute-0 python3.9[31019]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:11:52 compute-0 sudo[31198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orlujnkhlengcpfrokzoxwizpvnxkllg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795511.6537094-61-45510618206308/AnsiballZ_command.py'
Nov 22 07:11:52 compute-0 sudo[31198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:11:52 compute-0 python3.9[31200]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:11:59 compute-0 sudo[31198]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:00 compute-0 sshd-session[30869]: Connection closed by 192.168.122.30 port 37842
Nov 22 07:12:00 compute-0 sshd-session[30866]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:12:00 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Nov 22 07:12:00 compute-0 systemd[1]: session-8.scope: Consumed 7.966s CPU time.
Nov 22 07:12:00 compute-0 systemd-logind[821]: Session 8 logged out. Waiting for processes to exit.
Nov 22 07:12:00 compute-0 systemd-logind[821]: Removed session 8.
Nov 22 07:12:16 compute-0 sshd-session[31258]: Accepted publickey for zuul from 192.168.122.30 port 39338 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:12:16 compute-0 systemd-logind[821]: New session 9 of user zuul.
Nov 22 07:12:16 compute-0 systemd[1]: Started Session 9 of User zuul.
Nov 22 07:12:16 compute-0 sshd-session[31258]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:12:16 compute-0 python3.9[31411]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 22 07:12:18 compute-0 python3.9[31585]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:12:18 compute-0 sudo[31735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daxjkbmwcbnzyidveczwvahppdlybenr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795538.5449893-98-111275920438514/AnsiballZ_command.py'
Nov 22 07:12:18 compute-0 sudo[31735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:19 compute-0 python3.9[31737]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:12:19 compute-0 sudo[31735]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:19 compute-0 sudo[31888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixircssrskqzbfiqjoxqtepzcsqfmprr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795539.553492-134-79614521977218/AnsiballZ_stat.py'
Nov 22 07:12:19 compute-0 sudo[31888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:20 compute-0 python3.9[31890]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:12:20 compute-0 sudo[31888]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:20 compute-0 sudo[32040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-httpqfrrpzceaurclbspyudvbocemnqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795540.3846214-158-181765873257797/AnsiballZ_file.py'
Nov 22 07:12:20 compute-0 sudo[32040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:21 compute-0 python3.9[32042]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:12:21 compute-0 sudo[32040]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:21 compute-0 sudo[32192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrfadcncovfyalqgjeaaefgkvpyjltcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795541.2711682-182-105018871547363/AnsiballZ_stat.py'
Nov 22 07:12:21 compute-0 sudo[32192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:21 compute-0 python3.9[32194]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:12:21 compute-0 sudo[32192]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:22 compute-0 sudo[32315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruqxzbwxxcyecpvfqlqwuqvnnugxqpri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795541.2711682-182-105018871547363/AnsiballZ_copy.py'
Nov 22 07:12:22 compute-0 sudo[32315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:22 compute-0 python3.9[32317]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795541.2711682-182-105018871547363/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:12:22 compute-0 sudo[32315]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:22 compute-0 sudo[32467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrqxxjvpntmqwpdhmvfqdhpfqxolfyxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795542.6075633-227-84980386747168/AnsiballZ_setup.py'
Nov 22 07:12:22 compute-0 sudo[32467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:23 compute-0 python3.9[32469]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:12:23 compute-0 sudo[32467]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:23 compute-0 sudo[32623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rntanjwecejtvtlahognuxanzkpzdceh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795543.6400108-251-101796940791224/AnsiballZ_file.py'
Nov 22 07:12:23 compute-0 sudo[32623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:24 compute-0 python3.9[32625]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:12:24 compute-0 sudo[32623]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:24 compute-0 sudo[32775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhocpffjryyqrlxpbeiypdqetsssgwey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795544.3725467-278-261363316228044/AnsiballZ_file.py'
Nov 22 07:12:24 compute-0 sudo[32775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:24 compute-0 python3.9[32777]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:12:24 compute-0 sudo[32775]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:26 compute-0 python3.9[32927]: ansible-ansible.builtin.service_facts Invoked
Nov 22 07:12:30 compute-0 python3.9[33180]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:12:31 compute-0 python3.9[33330]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:12:32 compute-0 python3.9[33484]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:12:33 compute-0 sudo[33640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yirdxzxohefhipxyouwklvtkkdxfyjqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795552.9612837-422-94373297468556/AnsiballZ_setup.py'
Nov 22 07:12:33 compute-0 sudo[33640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:33 compute-0 python3.9[33642]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:12:33 compute-0 sudo[33640]: pam_unix(sudo:session): session closed for user root
Nov 22 07:12:34 compute-0 sudo[33724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anqflarptkkdbvvachgbrguvrqhtqega ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795552.9612837-422-94373297468556/AnsiballZ_dnf.py'
Nov 22 07:12:34 compute-0 sudo[33724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:12:34 compute-0 python3.9[33726]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:13:29 compute-0 systemd[1]: Reloading.
Nov 22 07:13:29 compute-0 systemd-rc-local-generator[33924]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:13:29 compute-0 systemd[1]: Starting dnf makecache...
Nov 22 07:13:29 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 22 07:13:29 compute-0 dnf[33934]: Failed determining last makecache time.
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-openstack-barbican-42b4c41831408a8e323 140 kB/s | 3.0 kB     00:00
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 169 kB/s | 3.0 kB     00:00
Nov 22 07:13:29 compute-0 systemd[1]: Reloading.
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-openstack-cinder-1c00d6490d88e436f26ef 193 kB/s | 3.0 kB     00:00
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-python-stevedore-c4acc5639fd2329372142 179 kB/s | 3.0 kB     00:00
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-python-observabilityclient-2f31846d73c 166 kB/s | 3.0 kB     00:00
Nov 22 07:13:29 compute-0 systemd-rc-local-generator[33967]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-os-net-config-bbae2ed8a159b0435a473f38 141 kB/s | 3.0 kB     00:00
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 203 kB/s | 3.0 kB     00:00
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-python-designate-tests-tempest-347fdbc 191 kB/s | 3.0 kB     00:00
Nov 22 07:13:29 compute-0 dnf[33934]: delorean-openstack-glance-1fd12c29b339f30fe823e 170 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 170 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-openstack-manila-3c01b7181572c95dac462 171 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-python-whitebox-neutron-tests-tempest- 139 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-openstack-octavia-ba397f07a7331190208c 172 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-openstack-watcher-c014f81a8647287f6dcc 188 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-python-tcib-1124124ec06aadbac34f0d340b 169 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 180 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-openstack-swift-dc98a8463506ac520c469a 187 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 systemd[1]: Reloading.
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-python-tempestconf-8515371b7cceebd4282 187 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: delorean-openstack-heat-ui-013accbfd179753bc3f0 179 kB/s | 3.0 kB     00:00
Nov 22 07:13:30 compute-0 systemd-rc-local-generator[34022]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:13:30 compute-0 dnf[33934]: CentOS Stream 9 - BaseOS                         74 kB/s | 7.3 kB     00:00
Nov 22 07:13:30 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 22 07:13:30 compute-0 dnf[33934]: CentOS Stream 9 - AppStream                      84 kB/s | 7.4 kB     00:00
Nov 22 07:13:30 compute-0 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 07:13:30 compute-0 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 07:13:30 compute-0 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 07:13:30 compute-0 dnf[33934]: CentOS Stream 9 - CRB                            62 kB/s | 7.2 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: CentOS Stream 9 - Extras packages                27 kB/s | 8.3 kB     00:00
Nov 22 07:13:30 compute-0 dnf[33934]: dlrn-antelope-testing                           165 kB/s | 3.0 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: dlrn-antelope-build-deps                        183 kB/s | 3.0 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: centos9-rabbitmq                                132 kB/s | 3.0 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: centos9-storage                                 138 kB/s | 3.0 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: centos9-opstools                                128 kB/s | 3.0 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: NFV SIG OpenvSwitch                             140 kB/s | 3.0 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: repo-setup-centos-appstream                     206 kB/s | 4.4 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: repo-setup-centos-baseos                        177 kB/s | 3.9 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: repo-setup-centos-highavailability              175 kB/s | 3.9 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: repo-setup-centos-powertools                    194 kB/s | 4.3 kB     00:00
Nov 22 07:13:31 compute-0 dnf[33934]: Extra Packages for Enterprise Linux 9 - x86_64  252 kB/s |  33 kB     00:00
Nov 22 07:13:32 compute-0 dnf[33934]: Metadata cache created.
Nov 22 07:13:32 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 22 07:13:32 compute-0 systemd[1]: Finished dnf makecache.
Nov 22 07:13:32 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.766s CPU time.
Nov 22 07:14:40 compute-0 kernel: SELinux:  Converting 2718 SID table entries...
Nov 22 07:14:40 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 07:14:40 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 07:14:40 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 07:14:40 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 07:14:40 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 07:14:40 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 07:14:40 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 07:14:40 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 22 07:14:40 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 07:14:41 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 07:14:41 compute-0 systemd[1]: Reloading.
Nov 22 07:14:41 compute-0 systemd-rc-local-generator[34371]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:14:41 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 07:14:41 compute-0 sudo[33724]: pam_unix(sudo:session): session closed for user root
Nov 22 07:14:42 compute-0 sudo[35276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdtfvhllyyxiyyipezrfozlwmdytnruw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795681.895693-458-160190996813355/AnsiballZ_command.py'
Nov 22 07:14:42 compute-0 sudo[35276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:14:42 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 07:14:42 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 07:14:42 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.151s CPU time.
Nov 22 07:14:42 compute-0 systemd[1]: run-rfd1543bba8874e8c90d77674a98fca90.service: Deactivated successfully.
Nov 22 07:14:42 compute-0 python3.9[35278]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:14:43 compute-0 sudo[35276]: pam_unix(sudo:session): session closed for user root
Nov 22 07:14:44 compute-0 sudo[35558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njjyqqwshehlsdkfaxtmgiqrqefervar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795683.8460746-482-169865023302459/AnsiballZ_selinux.py'
Nov 22 07:14:44 compute-0 sudo[35558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:14:44 compute-0 python3.9[35560]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 22 07:14:44 compute-0 sudo[35558]: pam_unix(sudo:session): session closed for user root
Nov 22 07:14:45 compute-0 sudo[35711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrihofautndwzoipnbfvccimrjciskhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795685.2424333-515-272508825080491/AnsiballZ_command.py'
Nov 22 07:14:45 compute-0 sudo[35711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:14:45 compute-0 python3.9[35713]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 22 07:14:46 compute-0 sudo[35711]: pam_unix(sudo:session): session closed for user root
Nov 22 07:14:47 compute-0 sudo[35865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agvcrwcapdcrwikidjwocefnmffzddgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795687.4189062-539-247328019758438/AnsiballZ_file.py'
Nov 22 07:14:47 compute-0 sudo[35865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:14:53 compute-0 python3.9[35867]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:14:53 compute-0 sudo[35865]: pam_unix(sudo:session): session closed for user root
Nov 22 07:14:53 compute-0 sudo[36017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adkzjjlathysgzshoevpiibfzqgzumxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795693.4323308-563-153961794313604/AnsiballZ_mount.py'
Nov 22 07:14:53 compute-0 sudo[36017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:07 compute-0 python3.9[36019]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 22 07:15:07 compute-0 sudo[36017]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:11 compute-0 sudo[36169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imjgpiaxwsozywhkzroypqlrygzptcyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795710.727802-647-57644008582055/AnsiballZ_file.py'
Nov 22 07:15:11 compute-0 sudo[36169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:11 compute-0 python3.9[36171]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:15:11 compute-0 sudo[36169]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:11 compute-0 sudo[36321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-synocbmlwxxpyvoiiijlihuwcgqsdxeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795711.6139495-671-17467873176039/AnsiballZ_stat.py'
Nov 22 07:15:11 compute-0 sudo[36321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:12 compute-0 python3.9[36323]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:15:12 compute-0 sudo[36321]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:12 compute-0 sudo[36444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yizrtsjdzjijvodjorptlsmrgaezmlnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795711.6139495-671-17467873176039/AnsiballZ_copy.py'
Nov 22 07:15:12 compute-0 sudo[36444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:12 compute-0 python3.9[36446]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763795711.6139495-671-17467873176039/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:15:12 compute-0 sudo[36444]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:14 compute-0 sudo[36596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmoasgoqvfvekqrztuzpcapjjkswupcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795713.763312-743-142645283324723/AnsiballZ_stat.py'
Nov 22 07:15:14 compute-0 sudo[36596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:14 compute-0 python3.9[36598]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:15:14 compute-0 sudo[36596]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:14 compute-0 sudo[36748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdaurmlvaooztynudyjbjiyaousyoshe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795714.551722-767-71908027551293/AnsiballZ_command.py'
Nov 22 07:15:14 compute-0 sudo[36748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:15 compute-0 python3.9[36750]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:15:15 compute-0 sudo[36748]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:15 compute-0 sudo[36901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owshaeswisanommwyrsfwinxguethqgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795715.3204646-791-109536964698947/AnsiballZ_file.py'
Nov 22 07:15:15 compute-0 sudo[36901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:15 compute-0 python3.9[36903]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:15:15 compute-0 sudo[36901]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:16 compute-0 sudo[37053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxkqxknmhpkqspefvobqaathxfdvywia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795716.382749-824-263064505432786/AnsiballZ_getent.py'
Nov 22 07:15:16 compute-0 sudo[37053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:16 compute-0 python3.9[37055]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 22 07:15:17 compute-0 sudo[37053]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:17 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:15:17 compute-0 sudo[37207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trztevqpavgztrwpplrwyimfhvwvqwaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795717.2133589-848-41005256967259/AnsiballZ_group.py'
Nov 22 07:15:17 compute-0 sudo[37207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:17 compute-0 python3.9[37209]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 07:15:17 compute-0 groupadd[37210]: group added to /etc/group: name=qemu, GID=107
Nov 22 07:15:17 compute-0 groupadd[37210]: group added to /etc/gshadow: name=qemu
Nov 22 07:15:17 compute-0 groupadd[37210]: new group: name=qemu, GID=107
Nov 22 07:15:18 compute-0 sudo[37207]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:18 compute-0 sudo[37365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auozcklqdnyfgekbonilmuuaunmzjowu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795718.305726-872-204648165307370/AnsiballZ_user.py'
Nov 22 07:15:18 compute-0 sudo[37365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:19 compute-0 python3.9[37367]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 07:15:19 compute-0 useradd[37369]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 22 07:15:19 compute-0 sudo[37365]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:19 compute-0 sudo[37525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evxpnolrtyechntviavmkudjozwmauva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795719.6338537-896-272272932292896/AnsiballZ_getent.py'
Nov 22 07:15:19 compute-0 sudo[37525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:20 compute-0 python3.9[37527]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 22 07:15:20 compute-0 sudo[37525]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:20 compute-0 sudo[37678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cafmussmixxbvnzawodmmybzcyxsedxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795720.28506-920-18632199754805/AnsiballZ_group.py'
Nov 22 07:15:20 compute-0 sudo[37678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:20 compute-0 python3.9[37680]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 07:15:21 compute-0 groupadd[37681]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 22 07:15:21 compute-0 groupadd[37681]: group added to /etc/gshadow: name=hugetlbfs
Nov 22 07:15:21 compute-0 groupadd[37681]: new group: name=hugetlbfs, GID=42477
Nov 22 07:15:21 compute-0 sudo[37678]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:22 compute-0 sudo[37836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epstpqiscevzrdjmhhyaujkrwqgkgwkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795721.6355991-947-201296923438629/AnsiballZ_file.py'
Nov 22 07:15:22 compute-0 sudo[37836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:22 compute-0 python3.9[37838]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 22 07:15:22 compute-0 sudo[37836]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:23 compute-0 sudo[37988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxvvqpidjjkvshgrcgxhrvyifpmjlfny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795722.87409-980-23400588275773/AnsiballZ_dnf.py'
Nov 22 07:15:23 compute-0 sudo[37988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:23 compute-0 python3.9[37990]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:15:25 compute-0 sudo[37988]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:26 compute-0 sudo[38141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oruteoqubtlticukkaczdnrgkywqacnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795725.7212791-1004-104746873657808/AnsiballZ_file.py'
Nov 22 07:15:26 compute-0 sudo[38141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:26 compute-0 python3.9[38143]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:15:26 compute-0 sudo[38141]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:26 compute-0 sudo[38293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osqfohsrjzesipluologjimsgcfmfaor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795726.5749927-1028-182521683983647/AnsiballZ_stat.py'
Nov 22 07:15:26 compute-0 sudo[38293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:27 compute-0 python3.9[38295]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:15:27 compute-0 sudo[38293]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:27 compute-0 sudo[38416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghjprwywndpfndbtgaptpoxcgajmpdyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795726.5749927-1028-182521683983647/AnsiballZ_copy.py'
Nov 22 07:15:27 compute-0 sudo[38416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:27 compute-0 python3.9[38418]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795726.5749927-1028-182521683983647/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:15:27 compute-0 sudo[38416]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:28 compute-0 sudo[38568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjlonabaqfuebugcpjxfntmcuxcxleh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795727.8565984-1073-73532169029566/AnsiballZ_systemd.py'
Nov 22 07:15:28 compute-0 sudo[38568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:28 compute-0 python3.9[38570]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:15:28 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 22 07:15:28 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 22 07:15:28 compute-0 kernel: Bridge firewalling registered
Nov 22 07:15:28 compute-0 systemd-modules-load[38574]: Inserted module 'br_netfilter'
Nov 22 07:15:28 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 22 07:15:28 compute-0 sudo[38568]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:29 compute-0 sudo[38729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmewwkdniltzazaxdessdyieavyhxxjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795729.1834233-1097-223962670806200/AnsiballZ_stat.py'
Nov 22 07:15:29 compute-0 sudo[38729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:29 compute-0 python3.9[38731]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:15:29 compute-0 sudo[38729]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:30 compute-0 sudo[38852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcvjkgrlnkdvrugpljafklybhhimqydb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795729.1834233-1097-223962670806200/AnsiballZ_copy.py'
Nov 22 07:15:30 compute-0 sudo[38852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:30 compute-0 python3.9[38854]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795729.1834233-1097-223962670806200/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:15:30 compute-0 sudo[38852]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:30 compute-0 sudo[39004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckkbeezqhkaniynbbyrduovanzjbpoho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795730.7334652-1151-30958780172099/AnsiballZ_dnf.py'
Nov 22 07:15:30 compute-0 sudo[39004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:31 compute-0 python3.9[39006]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:15:35 compute-0 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 07:15:35 compute-0 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 07:15:36 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 07:15:36 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 07:15:36 compute-0 systemd[1]: Reloading.
Nov 22 07:15:36 compute-0 systemd-rc-local-generator[39070]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:15:36 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 07:15:37 compute-0 sudo[39004]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:38 compute-0 python3.9[40886]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:15:39 compute-0 python3.9[42014]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 22 07:15:39 compute-0 python3.9[42881]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:15:40 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 07:15:40 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 07:15:40 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.339s CPU time.
Nov 22 07:15:40 compute-0 systemd[1]: run-re6055b4963da441c9469e25a1eeed47a.service: Deactivated successfully.
Nov 22 07:15:40 compute-0 sudo[43164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxruurmrplshukahhnlqsngjgwdydbcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795740.3314018-1268-233365556201032/AnsiballZ_command.py'
Nov 22 07:15:40 compute-0 sudo[43164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:40 compute-0 python3.9[43166]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:15:40 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 22 07:15:41 compute-0 systemd[1]: Starting Authorization Manager...
Nov 22 07:15:41 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 22 07:15:41 compute-0 polkitd[43383]: Started polkitd version 0.117
Nov 22 07:15:41 compute-0 polkitd[43383]: Loading rules from directory /etc/polkit-1/rules.d
Nov 22 07:15:41 compute-0 polkitd[43383]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 22 07:15:41 compute-0 polkitd[43383]: Finished loading, compiling and executing 2 rules
Nov 22 07:15:41 compute-0 polkitd[43383]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 22 07:15:41 compute-0 systemd[1]: Started Authorization Manager.
Nov 22 07:15:41 compute-0 sudo[43164]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:42 compute-0 sudo[43551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfvakwyrtvhiafmxtpjxyyugcqdnnfzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795741.8622477-1295-142240878487804/AnsiballZ_systemd.py'
Nov 22 07:15:42 compute-0 sudo[43551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:42 compute-0 python3.9[43553]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:15:42 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 22 07:15:42 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Nov 22 07:15:42 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 22 07:15:42 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 22 07:15:42 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 22 07:15:42 compute-0 sudo[43551]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:43 compute-0 python3.9[43714]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 22 07:15:47 compute-0 sudo[43864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xntojklohypnlibmmcnubhbbkhmiqcfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795747.6648998-1466-134502998063375/AnsiballZ_systemd.py'
Nov 22 07:15:47 compute-0 sudo[43864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:48 compute-0 python3.9[43866]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:15:48 compute-0 systemd[1]: Reloading.
Nov 22 07:15:48 compute-0 systemd-rc-local-generator[43892]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:15:48 compute-0 sudo[43864]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:48 compute-0 sudo[44053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjjofqvdktvnjvwobwvhjkvoaihpdkzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795748.6347563-1466-219624917523097/AnsiballZ_systemd.py'
Nov 22 07:15:48 compute-0 sudo[44053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:49 compute-0 python3.9[44055]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:15:49 compute-0 systemd[1]: Reloading.
Nov 22 07:15:49 compute-0 systemd-rc-local-generator[44085]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:15:49 compute-0 sudo[44053]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:50 compute-0 sudo[44242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erzjexiqojgfguxszmvdnyceduvuxdbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795749.7798536-1514-183615610211186/AnsiballZ_command.py'
Nov 22 07:15:50 compute-0 sudo[44242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:50 compute-0 python3.9[44244]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:15:50 compute-0 sudo[44242]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:50 compute-0 sudo[44395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruepxjnkuibudcecofqbtyoqffglmptw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795750.5942988-1538-275628875711883/AnsiballZ_command.py'
Nov 22 07:15:50 compute-0 sudo[44395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:51 compute-0 python3.9[44397]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:15:51 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 22 07:15:51 compute-0 sudo[44395]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:51 compute-0 sudo[44548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cthmaycvsffldyzwnfhlhwmsiptfkmeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795751.368668-1562-30234758643053/AnsiballZ_command.py'
Nov 22 07:15:51 compute-0 sudo[44548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:51 compute-0 python3.9[44550]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:15:53 compute-0 sudo[44548]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:53 compute-0 sudo[44710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbsgvolxxradbgrvjvgvlorrissrbhhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795753.549679-1586-56287305222037/AnsiballZ_command.py'
Nov 22 07:15:53 compute-0 sudo[44710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:54 compute-0 python3.9[44712]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:15:54 compute-0 sudo[44710]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:54 compute-0 sudo[44863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjtlxhzkasygxwadmysgjcqfgypxdger ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795754.3889012-1610-14116159286630/AnsiballZ_systemd.py'
Nov 22 07:15:54 compute-0 sudo[44863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:15:54 compute-0 python3.9[44865]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:15:55 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 22 07:15:55 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Nov 22 07:15:55 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Nov 22 07:15:55 compute-0 systemd[1]: Starting Apply Kernel Variables...
Nov 22 07:15:55 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 22 07:15:55 compute-0 systemd[1]: Finished Apply Kernel Variables.
Nov 22 07:15:55 compute-0 sudo[44863]: pam_unix(sudo:session): session closed for user root
Nov 22 07:15:55 compute-0 sshd-session[31261]: Connection closed by 192.168.122.30 port 39338
Nov 22 07:15:55 compute-0 sshd-session[31258]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:15:55 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Nov 22 07:15:55 compute-0 systemd[1]: session-9.scope: Consumed 2min 13.184s CPU time.
Nov 22 07:15:55 compute-0 systemd-logind[821]: Session 9 logged out. Waiting for processes to exit.
Nov 22 07:15:55 compute-0 systemd-logind[821]: Removed session 9.
Nov 22 07:16:01 compute-0 sshd-session[44896]: Accepted publickey for zuul from 192.168.122.30 port 56112 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:16:01 compute-0 systemd-logind[821]: New session 10 of user zuul.
Nov 22 07:16:01 compute-0 systemd[1]: Started Session 10 of User zuul.
Nov 22 07:16:01 compute-0 sshd-session[44896]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:16:02 compute-0 python3.9[45049]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:16:03 compute-0 python3.9[45203]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:16:04 compute-0 sudo[45357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkbgqdwafeoaoggcwjccdklszzwvbigl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795764.4681282-115-222419642197851/AnsiballZ_command.py'
Nov 22 07:16:04 compute-0 sudo[45357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:05 compute-0 python3.9[45359]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:16:05 compute-0 sudo[45357]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:06 compute-0 python3.9[45510]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:16:07 compute-0 sudo[45664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxqvqrhaurgpmszjpddkrxvgkqzkwxtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795766.728302-175-32473245176224/AnsiballZ_setup.py'
Nov 22 07:16:07 compute-0 sudo[45664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:07 compute-0 python3.9[45666]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:16:07 compute-0 sudo[45664]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:08 compute-0 sudo[45748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzovawvippdrfrynqgfxsxxdonyyttol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795766.728302-175-32473245176224/AnsiballZ_dnf.py'
Nov 22 07:16:08 compute-0 sudo[45748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:08 compute-0 python3.9[45750]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:16:09 compute-0 sudo[45748]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:10 compute-0 sudo[45901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtoxyiofwevmzslncvzlfyztmsiyrdtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795769.887366-211-31630728050878/AnsiballZ_setup.py'
Nov 22 07:16:10 compute-0 sudo[45901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:10 compute-0 python3.9[45903]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:16:10 compute-0 sudo[45901]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:11 compute-0 sudo[46072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqyuyzbayuhurqplkzexjvwazdoenhzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795771.0607626-244-183198200463221/AnsiballZ_file.py'
Nov 22 07:16:11 compute-0 sudo[46072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:11 compute-0 python3.9[46074]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:16:11 compute-0 sudo[46072]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:12 compute-0 sudo[46224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmjricousgqjyunusohdsnrtqcnmbncl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795771.974308-268-194361052215595/AnsiballZ_command.py'
Nov 22 07:16:12 compute-0 sudo[46224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:12 compute-0 python3.9[46226]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:16:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat297070450-merged.mount: Deactivated successfully.
Nov 22 07:16:12 compute-0 podman[46227]: 2025-11-22 07:16:12.622385685 +0000 UTC m=+0.167145777 system refresh
Nov 22 07:16:12 compute-0 sudo[46224]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:13 compute-0 sudo[46386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnxfcoafoegqsqgbriquvmyevuvrlfob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795772.8989434-292-279573264005660/AnsiballZ_stat.py'
Nov 22 07:16:13 compute-0 sudo[46386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:13 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:16:13 compute-0 python3.9[46388]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:16:13 compute-0 sudo[46386]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:14 compute-0 sudo[46509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoskjggtpdcjvdqyvxlhsfwskamwfkkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795772.8989434-292-279573264005660/AnsiballZ_copy.py'
Nov 22 07:16:14 compute-0 sudo[46509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:14 compute-0 python3.9[46511]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763795772.8989434-292-279573264005660/.source.json follow=False _original_basename=podman_network_config.j2 checksum=bf029c725592abb6fcc564a1c2593b3ec0d2b4a2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:16:14 compute-0 sudo[46509]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:15 compute-0 sudo[46661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrnfamjqnguhzspbcanzwvlndbjqqrqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795774.7506652-337-233805431192439/AnsiballZ_stat.py'
Nov 22 07:16:15 compute-0 sudo[46661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:15 compute-0 python3.9[46663]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:16:15 compute-0 sudo[46661]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:15 compute-0 sudo[46784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydvzniubdgzagiwdqvbfakndmxqbwkvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795774.7506652-337-233805431192439/AnsiballZ_copy.py'
Nov 22 07:16:15 compute-0 sudo[46784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:15 compute-0 python3.9[46786]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763795774.7506652-337-233805431192439/.source.conf follow=False _original_basename=registries.conf.j2 checksum=193e1b13ee9dd51d1fc7c456c46399ca66d3b9c7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:16:15 compute-0 sudo[46784]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:16 compute-0 sudo[46936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjdymktoelemsbjyyjwwegsbffzzbdys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795776.2005029-385-126179449235864/AnsiballZ_ini_file.py'
Nov 22 07:16:16 compute-0 sudo[46936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:16 compute-0 python3.9[46938]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:16:16 compute-0 sudo[46936]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:17 compute-0 sudo[47088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntgljvjejzhasghhidynsywjiczzvrhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795777.0792372-385-18508902700216/AnsiballZ_ini_file.py'
Nov 22 07:16:17 compute-0 sudo[47088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:17 compute-0 python3.9[47090]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:16:17 compute-0 sudo[47088]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:18 compute-0 sudo[47240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxnlqxjrvkoudedoongcdefhgsconpkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795777.7341094-385-32827593249862/AnsiballZ_ini_file.py'
Nov 22 07:16:18 compute-0 sudo[47240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:18 compute-0 python3.9[47242]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:16:18 compute-0 sudo[47240]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:18 compute-0 sudo[47392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvneoxfjyhfknhymkuqxemmtuzlnyuej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795778.368256-385-97003900210382/AnsiballZ_ini_file.py'
Nov 22 07:16:18 compute-0 sudo[47392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:18 compute-0 python3.9[47394]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:16:18 compute-0 sudo[47392]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:20 compute-0 python3.9[47544]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:16:20 compute-0 sudo[47696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flqnurveiklbmmsnvznzqmstqoczsoll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795780.3050892-505-242245545261431/AnsiballZ_dnf.py'
Nov 22 07:16:20 compute-0 sudo[47696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:20 compute-0 python3.9[47698]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:16:22 compute-0 sudo[47696]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:22 compute-0 sudo[47849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwxvnukdkphbesrubjmocvierxqikdts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795782.460758-529-1054335417305/AnsiballZ_dnf.py'
Nov 22 07:16:22 compute-0 sudo[47849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:22 compute-0 python3.9[47851]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:16:24 compute-0 sudo[47849]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:26 compute-0 sudo[48009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afkzjoruppjmdmazcvtwufypmhqaninr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795785.958906-559-261823611165720/AnsiballZ_dnf.py'
Nov 22 07:16:26 compute-0 sudo[48009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:26 compute-0 python3.9[48011]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:16:27 compute-0 sudo[48009]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:29 compute-0 sudo[48162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqfbbnpbdnrcebxijwpnpjdcldhzigkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795788.7137978-586-47500913944903/AnsiballZ_dnf.py'
Nov 22 07:16:29 compute-0 sudo[48162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:29 compute-0 python3.9[48164]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:16:30 compute-0 sudo[48162]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:31 compute-0 sudo[48315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkanifsogrvwhettwnowqxpxvpuoyygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795791.5315518-619-19405322642951/AnsiballZ_dnf.py'
Nov 22 07:16:31 compute-0 sudo[48315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:32 compute-0 python3.9[48317]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:16:33 compute-0 sudo[48315]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:36 compute-0 sudo[48471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krstgzgnjlnkufqjtlchexpkwhrrzjxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795796.3273144-643-220512714714214/AnsiballZ_dnf.py'
Nov 22 07:16:36 compute-0 sudo[48471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:36 compute-0 python3.9[48473]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:16:42 compute-0 sudo[48471]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:44 compute-0 sudo[48640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkrtvwhiyyanrvstsnfwmjloftrrfhfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795804.2526138-670-96759453945634/AnsiballZ_dnf.py'
Nov 22 07:16:44 compute-0 sudo[48640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:44 compute-0 python3.9[48642]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:16:46 compute-0 sudo[48640]: pam_unix(sudo:session): session closed for user root
Nov 22 07:16:47 compute-0 sudo[48793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jblqcccomqksvawejvfdakhanomuulcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795806.7287593-697-183663299064934/AnsiballZ_dnf.py'
Nov 22 07:16:47 compute-0 sudo[48793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:16:47 compute-0 python3.9[48795]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:17:08 compute-0 sudo[48793]: pam_unix(sudo:session): session closed for user root
Nov 22 07:17:09 compute-0 sudo[49129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnzxdefdsiionjdidvcwqqwsgqgkdrzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795829.5784347-724-66702609390648/AnsiballZ_dnf.py'
Nov 22 07:17:09 compute-0 sudo[49129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:17:10 compute-0 python3.9[49131]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:17:11 compute-0 sudo[49129]: pam_unix(sudo:session): session closed for user root
Nov 22 07:17:12 compute-0 sudo[49285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfigjhkvhfafndmbkmgbrkqfdhamdsrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795832.206299-757-89933980361588/AnsiballZ_file.py'
Nov 22 07:17:12 compute-0 sudo[49285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:17:12 compute-0 python3.9[49287]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:17:12 compute-0 sudo[49285]: pam_unix(sudo:session): session closed for user root
Nov 22 07:17:13 compute-0 sudo[49460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxavjipbysepehfnrmhmndjdmnbunihc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795832.863561-781-221677965944085/AnsiballZ_stat.py'
Nov 22 07:17:13 compute-0 sudo[49460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:17:13 compute-0 python3.9[49462]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:17:13 compute-0 sudo[49460]: pam_unix(sudo:session): session closed for user root
Nov 22 07:17:13 compute-0 sudo[49583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bezxxtimasvixrheimllujdqzsdnujmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795832.863561-781-221677965944085/AnsiballZ_copy.py'
Nov 22 07:17:13 compute-0 sudo[49583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:17:13 compute-0 python3.9[49585]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763795832.863561-781-221677965944085/.source.json _original_basename=.p1xav_39 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:17:13 compute-0 sudo[49583]: pam_unix(sudo:session): session closed for user root
Nov 22 07:17:14 compute-0 sudo[49735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npihadmjtdazbasxarrbbiubglvcylcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795834.3260002-835-239999653147036/AnsiballZ_podman_image.py'
Nov 22 07:17:14 compute-0 sudo[49735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:17:14 compute-0 python3.9[49737]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 07:17:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2748969426-lower\x2dmapped.mount: Deactivated successfully.
Nov 22 07:17:23 compute-0 podman[49749]: 2025-11-22 07:17:23.235964736 +0000 UTC m=+8.198222849 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 07:17:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:23 compute-0 sudo[49735]: pam_unix(sudo:session): session closed for user root
Nov 22 07:17:24 compute-0 sudo[50046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wflwodvvxkeeeqgyukbjdwuapivaveek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795844.1965373-874-169220690409457/AnsiballZ_podman_image.py'
Nov 22 07:17:24 compute-0 sudo[50046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:17:24 compute-0 python3.9[50048]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 07:17:24 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:27 compute-0 podman[50060]: 2025-11-22 07:17:27.629660449 +0000 UTC m=+2.870908589 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 07:17:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:27 compute-0 sudo[50046]: pam_unix(sudo:session): session closed for user root
Nov 22 07:17:28 compute-0 sudo[50290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aawemurmqucvobvefceghwrcwfxgvqxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795848.7227104-901-43622954509100/AnsiballZ_podman_image.py'
Nov 22 07:17:28 compute-0 sudo[50290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:17:29 compute-0 python3.9[50292]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 07:17:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:56 compute-0 podman[50304]: 2025-11-22 07:17:56.36211445 +0000 UTC m=+27.064291790 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 07:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:17:56 compute-0 sudo[50290]: pam_unix(sudo:session): session closed for user root
Nov 22 07:17:58 compute-0 sudo[50601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zycbigzjstobdfsvnvkpuzenwbbmxpln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795878.4087496-934-142956942427318/AnsiballZ_podman_image.py'
Nov 22 07:17:58 compute-0 sudo[50601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:17:58 compute-0 python3.9[50603]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 07:17:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:18:07 compute-0 podman[50615]: 2025-11-22 07:18:07.896899086 +0000 UTC m=+8.973974156 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 22 07:18:07 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:18:07 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:18:07 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:18:08 compute-0 sudo[50601]: pam_unix(sudo:session): session closed for user root
Nov 22 07:18:08 compute-0 sudo[50868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfxybhjgthajvqlbojjhawwjgsjmeuij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795888.2217221-934-224290846019423/AnsiballZ_podman_image.py'
Nov 22 07:18:08 compute-0 sudo[50868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:18:08 compute-0 python3.9[50870]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 22 07:18:12 compute-0 podman[50883]: 2025-11-22 07:18:12.913472505 +0000 UTC m=+4.127604089 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 22 07:18:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:18:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:18:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:18:13 compute-0 sudo[50868]: pam_unix(sudo:session): session closed for user root
Nov 22 07:18:18 compute-0 sshd-session[44899]: Connection closed by 192.168.122.30 port 56112
Nov 22 07:18:18 compute-0 sshd-session[44896]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:18:18 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Nov 22 07:18:18 compute-0 systemd[1]: session-10.scope: Consumed 1min 35.740s CPU time.
Nov 22 07:18:18 compute-0 systemd-logind[821]: Session 10 logged out. Waiting for processes to exit.
Nov 22 07:18:18 compute-0 systemd-logind[821]: Removed session 10.
Nov 22 07:18:24 compute-0 sshd-session[51030]: Accepted publickey for zuul from 192.168.122.30 port 40978 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:18:24 compute-0 systemd-logind[821]: New session 11 of user zuul.
Nov 22 07:18:24 compute-0 systemd[1]: Started Session 11 of User zuul.
Nov 22 07:18:24 compute-0 sshd-session[51030]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:18:25 compute-0 python3.9[51183]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:18:29 compute-0 sudo[51337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oarythxreaxnrklcdnnnkoifcxyiikwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795909.4809337-73-170443436920851/AnsiballZ_getent.py'
Nov 22 07:18:29 compute-0 sudo[51337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:18:30 compute-0 python3.9[51339]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 22 07:18:30 compute-0 sudo[51337]: pam_unix(sudo:session): session closed for user root
Nov 22 07:18:31 compute-0 sudo[51490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiwnmaotkcrjngmvcohkwnoaxtntswhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795911.002098-97-251685913459800/AnsiballZ_group.py'
Nov 22 07:18:31 compute-0 sudo[51490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:18:31 compute-0 python3.9[51492]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 07:18:31 compute-0 groupadd[51493]: group added to /etc/group: name=openvswitch, GID=42476
Nov 22 07:18:31 compute-0 groupadd[51493]: group added to /etc/gshadow: name=openvswitch
Nov 22 07:18:31 compute-0 groupadd[51493]: new group: name=openvswitch, GID=42476
Nov 22 07:18:31 compute-0 sudo[51490]: pam_unix(sudo:session): session closed for user root
Nov 22 07:18:32 compute-0 sudo[51648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzurkhpsjrvntsnyjbrceuxszjhnapyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795911.9190772-121-272427318024825/AnsiballZ_user.py'
Nov 22 07:18:32 compute-0 sudo[51648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:18:32 compute-0 python3.9[51650]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 07:18:32 compute-0 useradd[51652]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 22 07:18:32 compute-0 useradd[51652]: add 'openvswitch' to group 'hugetlbfs'
Nov 22 07:18:32 compute-0 useradd[51652]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 22 07:18:32 compute-0 sudo[51648]: pam_unix(sudo:session): session closed for user root
Nov 22 07:18:33 compute-0 sudo[51808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmkyvzdocgqvycdlbsqmcialohnteckw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795913.0685773-151-92461310215870/AnsiballZ_setup.py'
Nov 22 07:18:33 compute-0 sudo[51808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:18:33 compute-0 python3.9[51810]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:18:33 compute-0 sudo[51808]: pam_unix(sudo:session): session closed for user root
Nov 22 07:18:34 compute-0 sudo[51892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgsmnvybzoipnbbglmlphslqyxibvmjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795913.0685773-151-92461310215870/AnsiballZ_dnf.py'
Nov 22 07:18:34 compute-0 sudo[51892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:18:34 compute-0 python3.9[51894]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:18:37 compute-0 sudo[51892]: pam_unix(sudo:session): session closed for user root
Nov 22 07:18:38 compute-0 sudo[52055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnzpiiomdttorqfpitjtaksremrsoxbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795917.9466164-193-193250338377913/AnsiballZ_dnf.py'
Nov 22 07:18:38 compute-0 sudo[52055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:18:38 compute-0 python3.9[52057]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:18:53 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Nov 22 07:18:53 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 07:18:53 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 07:18:53 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 07:18:53 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 07:18:53 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 07:18:53 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 07:18:53 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 07:18:53 compute-0 groupadd[52080]: group added to /etc/group: name=unbound, GID=993
Nov 22 07:18:53 compute-0 groupadd[52080]: group added to /etc/gshadow: name=unbound
Nov 22 07:18:53 compute-0 groupadd[52080]: new group: name=unbound, GID=993
Nov 22 07:18:53 compute-0 useradd[52087]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 22 07:18:54 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 22 07:18:54 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 22 07:18:55 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 07:18:55 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 07:18:55 compute-0 systemd[1]: Reloading.
Nov 22 07:18:55 compute-0 systemd-rc-local-generator[52585]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:18:55 compute-0 systemd-sysv-generator[52588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:18:56 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 07:18:56 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 07:18:56 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 07:18:56 compute-0 systemd[1]: run-r944aa94f25034038bf4da6a40d0d8d0c.service: Deactivated successfully.
Nov 22 07:18:56 compute-0 sudo[52055]: pam_unix(sudo:session): session closed for user root
Nov 22 07:18:57 compute-0 sudo[53153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilfpezvqygigmivmfuhowmjkhfcfeabx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795937.1399198-217-157270635237606/AnsiballZ_systemd.py'
Nov 22 07:18:57 compute-0 sudo[53153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:18:58 compute-0 python3.9[53155]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 07:18:58 compute-0 systemd[1]: Reloading.
Nov 22 07:18:58 compute-0 systemd-rc-local-generator[53186]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:18:58 compute-0 systemd-sysv-generator[53189]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:18:58 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Nov 22 07:18:58 compute-0 chown[53197]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 22 07:18:58 compute-0 ovs-ctl[53202]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 22 07:18:58 compute-0 ovs-ctl[53202]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 22 07:18:58 compute-0 ovs-ctl[53202]: Starting ovsdb-server [  OK  ]
Nov 22 07:18:58 compute-0 ovs-vsctl[53251]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 22 07:18:58 compute-0 ovs-vsctl[53271]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"df09844c-c111-44b4-9c36-d4950a55a590\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 22 07:18:58 compute-0 ovs-ctl[53202]: Configuring Open vSwitch system IDs [  OK  ]
Nov 22 07:18:58 compute-0 ovs-ctl[53202]: Enabling remote OVSDB managers [  OK  ]
Nov 22 07:18:58 compute-0 ovs-vsctl[53277]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 22 07:18:58 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Nov 22 07:18:58 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 22 07:18:58 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 22 07:18:58 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 22 07:18:58 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Nov 22 07:18:58 compute-0 ovs-ctl[53322]: Inserting openvswitch module [  OK  ]
Nov 22 07:18:59 compute-0 ovs-ctl[53291]: Starting ovs-vswitchd [  OK  ]
Nov 22 07:18:59 compute-0 ovs-vsctl[53339]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 22 07:18:59 compute-0 ovs-ctl[53291]: Enabling remote OVSDB managers [  OK  ]
Nov 22 07:18:59 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 22 07:18:59 compute-0 systemd[1]: Starting Open vSwitch...
Nov 22 07:18:59 compute-0 systemd[1]: Finished Open vSwitch.
Nov 22 07:18:59 compute-0 sudo[53153]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:00 compute-0 python3.9[53491]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:19:01 compute-0 sudo[53641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xecbsrolaicbadqrbuuxwdtvjndfaoxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795940.6550987-271-92824189023628/AnsiballZ_sefcontext.py'
Nov 22 07:19:01 compute-0 sudo[53641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:01 compute-0 python3.9[53643]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 22 07:19:04 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Nov 22 07:19:04 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 07:19:04 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 07:19:04 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 07:19:04 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 07:19:04 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 07:19:04 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 07:19:04 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 07:19:04 compute-0 sudo[53641]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:05 compute-0 python3.9[53798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:19:06 compute-0 sudo[53954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqouuawgfzknbeovcnsraatzdejmygii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795946.477598-325-261199471015588/AnsiballZ_dnf.py'
Nov 22 07:19:06 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 22 07:19:06 compute-0 sudo[53954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:07 compute-0 python3.9[53956]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:19:08 compute-0 sudo[53954]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:09 compute-0 sudo[54107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktywpxitzeibxqkkutxxcnbtpqtjysvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795948.8056703-349-144061355776949/AnsiballZ_command.py'
Nov 22 07:19:09 compute-0 sudo[54107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:09 compute-0 python3.9[54109]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:19:10 compute-0 sudo[54107]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:10 compute-0 sudo[54394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kezyllkxvneftpckkdpzzjhncgofyjdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795950.4427066-373-147886967976737/AnsiballZ_file.py'
Nov 22 07:19:10 compute-0 sudo[54394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:11 compute-0 python3.9[54396]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 07:19:11 compute-0 sudo[54394]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:11 compute-0 python3.9[54546]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:19:12 compute-0 sudo[54698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxpzbhcqsufcdvrkatvqcsybowbdbwhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795952.234462-421-270218037495174/AnsiballZ_dnf.py'
Nov 22 07:19:12 compute-0 sudo[54698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:12 compute-0 python3.9[54700]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:19:14 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 07:19:14 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 07:19:14 compute-0 systemd[1]: Reloading.
Nov 22 07:19:14 compute-0 systemd-rc-local-generator[54739]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:19:14 compute-0 systemd-sysv-generator[54743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:19:14 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 07:19:15 compute-0 sudo[54698]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:15 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 07:19:15 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 07:19:15 compute-0 systemd[1]: run-r937f9d70d5254e7ea3502de3649d7b36.service: Deactivated successfully.
Nov 22 07:19:16 compute-0 sudo[55016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utzmuiirbttjwysewvyggncslrxbjwwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795956.0567818-445-219978781550624/AnsiballZ_systemd.py'
Nov 22 07:19:16 compute-0 sudo[55016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:16 compute-0 python3.9[55018]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:19:16 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 22 07:19:16 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Nov 22 07:19:16 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Nov 22 07:19:16 compute-0 systemd[1]: Stopping Network Manager...
Nov 22 07:19:16 compute-0 NetworkManager[7183]: <info>  [1763795956.7482] caught SIGTERM, shutting down normally.
Nov 22 07:19:16 compute-0 NetworkManager[7183]: <info>  [1763795956.7496] dhcp4 (eth0): canceled DHCP transaction
Nov 22 07:19:16 compute-0 NetworkManager[7183]: <info>  [1763795956.7496] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 07:19:16 compute-0 NetworkManager[7183]: <info>  [1763795956.7496] dhcp4 (eth0): state changed no lease
Nov 22 07:19:16 compute-0 NetworkManager[7183]: <info>  [1763795956.7499] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 07:19:16 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 07:19:16 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 07:19:16 compute-0 NetworkManager[7183]: <info>  [1763795956.7852] exiting (success)
Nov 22 07:19:16 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 22 07:19:16 compute-0 systemd[1]: Stopped Network Manager.
Nov 22 07:19:16 compute-0 systemd[1]: NetworkManager.service: Consumed 19.800s CPU time, 4.1M memory peak, read 0B from disk, written 31.0K to disk.
Nov 22 07:19:16 compute-0 systemd[1]: Starting Network Manager...
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.8531] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:a2036d99-efbc-42fa-8cf6-98ac98e337b2)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.8535] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.8605] manager[0x5650402ee090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 22 07:19:16 compute-0 systemd[1]: Starting Hostname Service...
Nov 22 07:19:16 compute-0 systemd[1]: Started Hostname Service.
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9486] hostname: hostname: using hostnamed
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9486] hostname: static hostname changed from (none) to "compute-0"
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9491] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9495] manager[0x5650402ee090]: rfkill: Wi-Fi hardware radio set enabled
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9496] manager[0x5650402ee090]: rfkill: WWAN hardware radio set enabled
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9516] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9526] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9526] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9527] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9527] manager: Networking is enabled by state file
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9529] settings: Loaded settings plugin: keyfile (internal)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9532] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9556] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9565] dhcp: init: Using DHCP client 'internal'
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9567] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9572] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9577] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9583] device (lo): Activation: starting connection 'lo' (f5c5737f-5ac0-4563-b7a5-22075693542b)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9588] device (eth0): carrier: link connected
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9591] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9596] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9596] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9601] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9607] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9612] device (eth1): carrier: link connected
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9615] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9619] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (cbb9ac2c-e3a8-55fd-9244-47a9ee91a7c3) (indicated)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9619] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9624] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9629] device (eth1): Activation: starting connection 'ci-private-network' (cbb9ac2c-e3a8-55fd-9244-47a9ee91a7c3)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9643] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9650] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 22 07:19:16 compute-0 systemd[1]: Started Network Manager.
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9653] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9654] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9657] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9659] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9661] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9663] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9666] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9672] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9675] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9682] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9691] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9706] dhcp4 (eth0): state changed new lease, address=38.129.56.29
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9710] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9776] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9781] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9782] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9784] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9789] device (lo): Activation: successful, device activated.
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9793] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9796] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9798] device (eth1): Activation: successful, device activated.
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9805] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9806] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9809] manager: NetworkManager state is now CONNECTED_SITE
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9811] device (eth0): Activation: successful, device activated.
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9815] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 22 07:19:16 compute-0 NetworkManager[55036]: <info>  [1763795956.9817] manager: startup complete
Nov 22 07:19:16 compute-0 systemd[1]: Starting Network Manager Wait Online...
Nov 22 07:19:17 compute-0 systemd[1]: Finished Network Manager Wait Online.
Nov 22 07:19:17 compute-0 sudo[55016]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:17 compute-0 sudo[55242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyktnvqnnltemdmocrhhlkxqehuyvcez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795957.2202272-469-38321075122025/AnsiballZ_dnf.py'
Nov 22 07:19:17 compute-0 sudo[55242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:18 compute-0 python3.9[55244]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:19:23 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 07:19:23 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 07:19:23 compute-0 systemd[1]: Reloading.
Nov 22 07:19:23 compute-0 systemd-rc-local-generator[55299]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:19:23 compute-0 systemd-sysv-generator[55302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:19:23 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 07:19:26 compute-0 sudo[55242]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:27 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 07:19:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 07:19:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 07:19:27 compute-0 systemd[1]: run-rb6cff2d6e159495b826bfca3988774da.service: Deactivated successfully.
Nov 22 07:19:28 compute-0 sudo[55703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwuvahfaloksrizysylnjqcldhfhqhhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795967.8996944-505-226620818694998/AnsiballZ_stat.py'
Nov 22 07:19:28 compute-0 sudo[55703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:28 compute-0 python3.9[55705]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:19:28 compute-0 sudo[55703]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:29 compute-0 sudo[55855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btiudasusnpfrehyrpohzcyycxxkgogi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795968.6448572-532-198831073959689/AnsiballZ_ini_file.py'
Nov 22 07:19:29 compute-0 sudo[55855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:29 compute-0 python3.9[55857]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:29 compute-0 sudo[55855]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:30 compute-0 sudo[56009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izlvdvzcorwlolkejtwqlxkogdswkkks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795969.783621-562-238728406994068/AnsiballZ_ini_file.py'
Nov 22 07:19:30 compute-0 sudo[56009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:30 compute-0 python3.9[56011]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:30 compute-0 sudo[56009]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:30 compute-0 sudo[56161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsgjkeqfnobddwrgaemsghokhujgbvbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795970.560534-562-174215168629675/AnsiballZ_ini_file.py'
Nov 22 07:19:30 compute-0 sudo[56161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:31 compute-0 python3.9[56163]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:31 compute-0 sudo[56161]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:31 compute-0 sudo[56313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whkrhvrcdmmfmpzuboqmceivjyzvwmgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795971.4290679-607-155330531545974/AnsiballZ_ini_file.py'
Nov 22 07:19:31 compute-0 sudo[56313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:32 compute-0 python3.9[56315]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:32 compute-0 sudo[56313]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:32 compute-0 sudo[56465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chqbjqusbksgilcpfqbjzssjxiexcgrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795972.3166254-607-96990080363027/AnsiballZ_ini_file.py'
Nov 22 07:19:32 compute-0 sudo[56465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:32 compute-0 python3.9[56467]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:32 compute-0 sudo[56465]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:33 compute-0 sudo[56617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcccrtnbwbztecfrfhxivkhgmcbgnhfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795973.0699766-652-16505228360983/AnsiballZ_stat.py'
Nov 22 07:19:33 compute-0 sudo[56617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:33 compute-0 python3.9[56619]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:19:33 compute-0 sudo[56617]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:34 compute-0 sudo[56740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtorokgedvermbnwjxjjuxjrkbuebvgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795973.0699766-652-16505228360983/AnsiballZ_copy.py'
Nov 22 07:19:34 compute-0 sudo[56740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:34 compute-0 python3.9[56742]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795973.0699766-652-16505228360983/.source _original_basename=.7pe0qtri follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:34 compute-0 sudo[56740]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:35 compute-0 sudo[56892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omknxttrlcyrqgrcszqiccegarqokvkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795975.2775104-697-259133126648041/AnsiballZ_file.py'
Nov 22 07:19:35 compute-0 sudo[56892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:35 compute-0 python3.9[56894]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:35 compute-0 sudo[56892]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:36 compute-0 sudo[57044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hciaqgkovgmdhnifklodhpnnrqojtcib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795976.003635-721-261309286444632/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 22 07:19:36 compute-0 sudo[57044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:36 compute-0 python3.9[57046]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 22 07:19:36 compute-0 sudo[57044]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:37 compute-0 sudo[57196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qunohevfpotlvqiqbfvmixcapuwctcbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795977.1209505-748-216420455464809/AnsiballZ_file.py'
Nov 22 07:19:37 compute-0 sudo[57196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:37 compute-0 python3.9[57198]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:37 compute-0 sudo[57196]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:38 compute-0 sudo[57348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zugtxavupaldxxhkgvwfwlejlkwtdknb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795978.0896125-778-43884787830692/AnsiballZ_stat.py'
Nov 22 07:19:38 compute-0 sudo[57348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:38 compute-0 sudo[57348]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:39 compute-0 sudo[57471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrvufllyzsuoevotropdfwdaqjuokwcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795978.0896125-778-43884787830692/AnsiballZ_copy.py'
Nov 22 07:19:39 compute-0 sudo[57471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:39 compute-0 sudo[57471]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:39 compute-0 sudo[57623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xibgnlojpgiykuobxemeegaseydhrmbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795979.4693725-823-25491689733871/AnsiballZ_slurp.py'
Nov 22 07:19:39 compute-0 sudo[57623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:40 compute-0 python3.9[57625]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 22 07:19:40 compute-0 sudo[57623]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:41 compute-0 sudo[57798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpzctzwvpvuggebdmbebilgqnvgjflkm ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795980.414823-850-29260244140222/async_wrapper.py j850372044044 300 /home/zuul/.ansible/tmp/ansible-tmp-1763795980.414823-850-29260244140222/AnsiballZ_edpm_os_net_config.py _'
Nov 22 07:19:41 compute-0 sudo[57798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:41 compute-0 ansible-async_wrapper.py[57800]: Invoked with j850372044044 300 /home/zuul/.ansible/tmp/ansible-tmp-1763795980.414823-850-29260244140222/AnsiballZ_edpm_os_net_config.py _
Nov 22 07:19:41 compute-0 ansible-async_wrapper.py[57803]: Starting module and watcher
Nov 22 07:19:41 compute-0 ansible-async_wrapper.py[57803]: Start watching 57804 (300)
Nov 22 07:19:41 compute-0 ansible-async_wrapper.py[57804]: Start module (57804)
Nov 22 07:19:41 compute-0 ansible-async_wrapper.py[57800]: Return async_wrapper task started.
Nov 22 07:19:41 compute-0 sudo[57798]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:41 compute-0 python3.9[57805]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 22 07:19:42 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 22 07:19:42 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 22 07:19:42 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 22 07:19:42 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 22 07:19:42 compute-0 kernel: cfg80211: failed to load regulatory.db
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.3751] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.3772] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4267] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4271] audit: op="connection-add" uuid="73ddbb53-dda3-4093-a404-69c3ef1f2f7b" name="br-ex-br" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4286] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4287] audit: op="connection-add" uuid="7710ad96-1aa9-42e9-b688-9d92b47fae09" name="br-ex-port" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4298] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4300] audit: op="connection-add" uuid="0fad31d5-f7ce-4072-927a-d5f613ebcfa8" name="eth1-port" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4310] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4312] audit: op="connection-add" uuid="24c982cb-c82d-4f40-a97e-03c02def930d" name="vlan20-port" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4322] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4323] audit: op="connection-add" uuid="973ce131-f264-41e7-8b0d-e84177dc2b35" name="vlan21-port" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4333] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4334] audit: op="connection-add" uuid="488f949a-b66c-4218-b7ac-4c225a310301" name="vlan22-port" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4352] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4365] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.4366] audit: op="connection-add" uuid="d553c188-8a9c-4ff5-9470-78bbd9fe7239" name="br-ex-if" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5248] audit: op="connection-update" uuid="cbb9ac2c-e3a8-55fd-9244-47a9ee91a7c3" name="ci-private-network" args="ovs-external-ids.data,ipv4.dns,ipv4.routing-rules,ipv4.never-default,ipv4.method,ipv4.addresses,ipv4.routes,connection.master,connection.controller,connection.port-type,connection.slave-type,connection.timestamp,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,ipv6.method,ipv6.addresses,ipv6.routes,ovs-interface.type" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5271] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5274] audit: op="connection-add" uuid="29564bc1-d460-4d6e-9885-554f86114355" name="vlan20-if" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5293] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5295] audit: op="connection-add" uuid="2f282fd6-72dd-4de8-afc5-9f13ac1270d6" name="vlan21-if" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5317] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5320] audit: op="connection-add" uuid="de3aa99a-28f3-4762-aaa3-d27685d08bcb" name="vlan22-if" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5345] audit: op="connection-delete" uuid="404dc1b5-53f2-3896-aa44-c017e0bc287f" name="Wired connection 1" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5367] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5380] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5384] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (73ddbb53-dda3-4093-a404-69c3ef1f2f7b)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5385] audit: op="connection-activate" uuid="73ddbb53-dda3-4093-a404-69c3ef1f2f7b" name="br-ex-br" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5388] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5394] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5399] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (7710ad96-1aa9-42e9-b688-9d92b47fae09)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5401] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5407] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5413] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (0fad31d5-f7ce-4072-927a-d5f613ebcfa8)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5415] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5422] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5428] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (24c982cb-c82d-4f40-a97e-03c02def930d)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5430] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5438] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5442] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (973ce131-f264-41e7-8b0d-e84177dc2b35)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5445] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5453] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5458] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (488f949a-b66c-4218-b7ac-4c225a310301)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5460] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5464] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5466] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5474] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5479] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5484] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (d553c188-8a9c-4ff5-9470-78bbd9fe7239)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5486] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5489] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5491] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5493] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5495] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5505] device (eth1): disconnecting for new activation request.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5514] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5519] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5521] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5523] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5526] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5531] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5536] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (29564bc1-d460-4d6e-9885-554f86114355)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5537] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5540] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5541] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5542] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5545] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5549] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5553] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (2f282fd6-72dd-4de8-afc5-9f13ac1270d6)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5554] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5557] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5559] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5561] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5563] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5568] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5573] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (de3aa99a-28f3-4762-aaa3-d27685d08bcb)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5574] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5577] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5579] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5581] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5582] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5596] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.addr-gen-mode,ipv6.method" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5599] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5602] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5604] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5620] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5638] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 kernel: ovs-system: entered promiscuous mode
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5645] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5648] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5651] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5656] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5660] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5663] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 kernel: Timeout policy base is empty
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5665] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5671] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5675] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5679] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 systemd-udevd[57812]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5681] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5689] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5694] dhcp4 (eth0): canceled DHCP transaction
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5694] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5694] dhcp4 (eth0): state changed no lease
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5695] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5706] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.5709] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57806 uid=0 result="fail" reason="Device is not activated"
Nov 22 07:19:43 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 22 07:19:43 compute-0 kernel: br-ex: entered promiscuous mode
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6213] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6217] dhcp4 (eth0): state changed new lease, address=38.129.56.29
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6224] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 22 07:19:43 compute-0 kernel: vlan20: entered promiscuous mode
Nov 22 07:19:43 compute-0 systemd-udevd[57811]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:19:43 compute-0 kernel: vlan22: entered promiscuous mode
Nov 22 07:19:43 compute-0 kernel: vlan21: entered promiscuous mode
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6759] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6860] device (eth1): Activation: starting connection 'ci-private-network' (cbb9ac2c-e3a8-55fd-9244-47a9ee91a7c3)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6865] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6866] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6868] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6870] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6871] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6872] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6880] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6897] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6903] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6905] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6908] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6913] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6918] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6924] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6927] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6930] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6934] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6937] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6940] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6943] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6945] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6948] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6951] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6957] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6957] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6959] device (eth1): released from controller device eth1
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6964] device (eth1): disconnecting for new activation request.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6965] audit: op="connection-activate" uuid="cbb9ac2c-e3a8-55fd-9244-47a9ee91a7c3" name="ci-private-network" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6967] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6987] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.6993] device (eth1): Activation: starting connection 'ci-private-network' (cbb9ac2c-e3a8-55fd-9244-47a9ee91a7c3)
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7003] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7020] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7023] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7026] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57806 uid=0 result="success"
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7028] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7034] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7041] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7050] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7055] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7058] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7063] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7067] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7068] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7069] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7072] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7074] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7079] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7084] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7089] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7096] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7103] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.7111] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.8553] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.8556] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.8564] device (eth1): Activation: successful, device activated.
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.9889] checkpoint[0x5650402c4950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 22 07:19:43 compute-0 NetworkManager[55036]: <info>  [1763795983.9891] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57806 uid=0 result="success"
Nov 22 07:19:44 compute-0 NetworkManager[55036]: <info>  [1763795984.2367] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57806 uid=0 result="success"
Nov 22 07:19:44 compute-0 NetworkManager[55036]: <info>  [1763795984.2376] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57806 uid=0 result="success"
Nov 22 07:19:44 compute-0 NetworkManager[55036]: <info>  [1763795984.7812] audit: op="networking-control" arg="global-dns-configuration" pid=57806 uid=0 result="success"
Nov 22 07:19:44 compute-0 NetworkManager[55036]: <info>  [1763795984.8287] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 22 07:19:44 compute-0 NetworkManager[55036]: <info>  [1763795984.9168] audit: op="networking-control" arg="global-dns-configuration" pid=57806 uid=0 result="success"
Nov 22 07:19:44 compute-0 NetworkManager[55036]: <info>  [1763795984.9191] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57806 uid=0 result="success"
Nov 22 07:19:45 compute-0 NetworkManager[55036]: <info>  [1763795985.0394] checkpoint[0x5650402c4a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 22 07:19:45 compute-0 NetworkManager[55036]: <info>  [1763795985.0396] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57806 uid=0 result="success"
Nov 22 07:19:45 compute-0 ansible-async_wrapper.py[57804]: Module complete (57804)
Nov 22 07:19:45 compute-0 sudo[58147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfdgrnnjgxdcyugpylqwaqwsfagnwygi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795984.5616865-850-154170431833058/AnsiballZ_async_status.py'
Nov 22 07:19:45 compute-0 sudo[58147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:45 compute-0 python3.9[58149]: ansible-ansible.legacy.async_status Invoked with jid=j850372044044.57800 mode=status _async_dir=/root/.ansible_async
Nov 22 07:19:45 compute-0 sudo[58147]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:45 compute-0 sudo[58247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyihxoowrrjqxynohsfwyuouzftiljcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795984.5616865-850-154170431833058/AnsiballZ_async_status.py'
Nov 22 07:19:45 compute-0 sudo[58247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:45 compute-0 python3.9[58249]: ansible-ansible.legacy.async_status Invoked with jid=j850372044044.57800 mode=cleanup _async_dir=/root/.ansible_async
Nov 22 07:19:45 compute-0 sudo[58247]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:46 compute-0 ansible-async_wrapper.py[57803]: Done in kid B.
Nov 22 07:19:46 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 07:19:49 compute-0 sudo[58403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwizdxnreodgraxbttbugjdbctktnxoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795989.7521148-926-40600418216877/AnsiballZ_stat.py'
Nov 22 07:19:49 compute-0 sudo[58403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:50 compute-0 python3.9[58405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:19:50 compute-0 sudo[58403]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:50 compute-0 sudo[58526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qobfstkdkcxkjfphxrlghyttreyfsnbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795989.7521148-926-40600418216877/AnsiballZ_copy.py'
Nov 22 07:19:50 compute-0 sudo[58526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:50 compute-0 python3.9[58528]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795989.7521148-926-40600418216877/.source.returncode _original_basename=.ilp4dqn6 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:50 compute-0 sudo[58526]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:51 compute-0 sudo[58678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adcizoowuodmooozdnwoldbhcjwprvza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795991.1871984-974-204134389857533/AnsiballZ_stat.py'
Nov 22 07:19:51 compute-0 sudo[58678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:51 compute-0 python3.9[58680]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:19:51 compute-0 sudo[58678]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:52 compute-0 sudo[58802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pegbbvetoqquzychxegaqxgkfdyckslm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795991.1871984-974-204134389857533/AnsiballZ_copy.py'
Nov 22 07:19:52 compute-0 sudo[58802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:52 compute-0 python3.9[58804]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763795991.1871984-974-204134389857533/.source.cfg _original_basename=.pwgaeg05 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:19:52 compute-0 sudo[58802]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:52 compute-0 sudo[58954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tguvjbinkbztrgptiacyokgolqwmqbho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763795992.6242273-1019-56036012846441/AnsiballZ_systemd.py'
Nov 22 07:19:52 compute-0 sudo[58954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:19:53 compute-0 python3.9[58956]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:19:53 compute-0 systemd[1]: Reloading Network Manager...
Nov 22 07:19:53 compute-0 NetworkManager[55036]: <info>  [1763795993.3437] audit: op="reload" arg="0" pid=58960 uid=0 result="success"
Nov 22 07:19:53 compute-0 NetworkManager[55036]: <info>  [1763795993.3444] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 22 07:19:53 compute-0 systemd[1]: Reloaded Network Manager.
Nov 22 07:19:53 compute-0 sudo[58954]: pam_unix(sudo:session): session closed for user root
Nov 22 07:19:53 compute-0 sshd-session[51033]: Connection closed by 192.168.122.30 port 40978
Nov 22 07:19:53 compute-0 sshd-session[51030]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:19:53 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Nov 22 07:19:53 compute-0 systemd[1]: session-11.scope: Consumed 52.996s CPU time.
Nov 22 07:19:53 compute-0 systemd-logind[821]: Session 11 logged out. Waiting for processes to exit.
Nov 22 07:19:53 compute-0 systemd-logind[821]: Removed session 11.
Nov 22 07:20:00 compute-0 sshd-session[58991]: Accepted publickey for zuul from 192.168.122.30 port 56060 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:20:00 compute-0 systemd-logind[821]: New session 12 of user zuul.
Nov 22 07:20:00 compute-0 systemd[1]: Started Session 12 of User zuul.
Nov 22 07:20:00 compute-0 sshd-session[58991]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:20:01 compute-0 python3.9[59144]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:20:02 compute-0 python3.9[59299]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:20:03 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 22 07:20:03 compute-0 python3.9[59489]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:20:04 compute-0 sshd-session[58994]: Connection closed by 192.168.122.30 port 56060
Nov 22 07:20:04 compute-0 sshd-session[58991]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:20:04 compute-0 systemd-logind[821]: Session 12 logged out. Waiting for processes to exit.
Nov 22 07:20:04 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Nov 22 07:20:04 compute-0 systemd[1]: session-12.scope: Consumed 2.236s CPU time.
Nov 22 07:20:04 compute-0 systemd-logind[821]: Removed session 12.
Nov 22 07:20:10 compute-0 sshd-session[59517]: Accepted publickey for zuul from 192.168.122.30 port 39214 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:20:10 compute-0 systemd-logind[821]: New session 13 of user zuul.
Nov 22 07:20:10 compute-0 systemd[1]: Started Session 13 of User zuul.
Nov 22 07:20:10 compute-0 sshd-session[59517]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:20:11 compute-0 python3.9[59671]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:20:12 compute-0 python3.9[59825]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:20:13 compute-0 sudo[59979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-macmzzkucwwgywskhnsyhugmcwdgqbze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796013.5448647-85-40824757043666/AnsiballZ_setup.py'
Nov 22 07:20:13 compute-0 sudo[59979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:14 compute-0 python3.9[59981]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:20:14 compute-0 sudo[59979]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:14 compute-0 sudo[60064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-strrjgyutlybpxzeytbhuaynmvwzwewe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796013.5448647-85-40824757043666/AnsiballZ_dnf.py'
Nov 22 07:20:14 compute-0 sudo[60064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:15 compute-0 python3.9[60066]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:20:16 compute-0 sudo[60064]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:16 compute-0 sudo[60217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxogqchmeihurteknvkdqhihtdcysqdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796016.7235575-121-219850256439710/AnsiballZ_setup.py'
Nov 22 07:20:16 compute-0 sudo[60217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:17 compute-0 python3.9[60219]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:20:17 compute-0 sudo[60217]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:18 compute-0 sudo[60408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfhnkxditfmefnyzjwdmucfusgvnqoks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796017.9717548-154-262027038686165/AnsiballZ_file.py'
Nov 22 07:20:18 compute-0 sudo[60408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:18 compute-0 python3.9[60410]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:20:18 compute-0 sudo[60408]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:19 compute-0 sudo[60560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhlgikbvxtuzmqlxzgbpjwydukaixrxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796018.85446-178-103134104104587/AnsiballZ_command.py'
Nov 22 07:20:19 compute-0 sudo[60560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:19 compute-0 python3.9[60562]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:20:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:20:19 compute-0 sudo[60560]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:20 compute-0 sudo[60722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xelbjyvtdrfqtvhbthwalreszmzjqktd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796019.749853-202-73974888936282/AnsiballZ_stat.py'
Nov 22 07:20:20 compute-0 sudo[60722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:20 compute-0 python3.9[60724]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:20:20 compute-0 sudo[60722]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:20 compute-0 sudo[60800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqyzxlyrdvbtygeeznvpkedparopupwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796019.749853-202-73974888936282/AnsiballZ_file.py'
Nov 22 07:20:20 compute-0 sudo[60800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:20 compute-0 python3.9[60802]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:20:20 compute-0 sudo[60800]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:21 compute-0 sudo[60952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbphehwckyfqvqgbxnyqcrpxtlyilewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796021.0359209-238-61486372150092/AnsiballZ_stat.py'
Nov 22 07:20:21 compute-0 sudo[60952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:21 compute-0 python3.9[60954]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:20:21 compute-0 sudo[60952]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:21 compute-0 sudo[61030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktxqvjcuirhgywgckxypbbczvhdutoxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796021.0359209-238-61486372150092/AnsiballZ_file.py'
Nov 22 07:20:21 compute-0 sudo[61030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:21 compute-0 python3.9[61032]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:20:22 compute-0 sudo[61030]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:22 compute-0 sudo[61183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgairwhddsdpriktkgffeweynoqsckmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796022.2923007-277-236148553033453/AnsiballZ_ini_file.py'
Nov 22 07:20:22 compute-0 sudo[61183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:22 compute-0 python3.9[61185]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:20:22 compute-0 sudo[61183]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:23 compute-0 sudo[61335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laucowtmsgdvgtafxygqujowdocslnru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796023.0967495-277-203442480463763/AnsiballZ_ini_file.py'
Nov 22 07:20:23 compute-0 sudo[61335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:23 compute-0 python3.9[61337]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:20:23 compute-0 sudo[61335]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:23 compute-0 sudo[61487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnryeilqnspzobyaecnogiyjecfweuvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796023.727294-277-249054051210547/AnsiballZ_ini_file.py'
Nov 22 07:20:23 compute-0 sudo[61487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:24 compute-0 python3.9[61489]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:20:24 compute-0 sudo[61487]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:24 compute-0 sudo[61639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whtgcgyqywjeyedeaebkbancibdclplh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796024.3345082-277-75164360529980/AnsiballZ_ini_file.py'
Nov 22 07:20:24 compute-0 sudo[61639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:24 compute-0 python3.9[61641]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:20:24 compute-0 sudo[61639]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:25 compute-0 sudo[61791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grgtseekuzkkvzowocskiyhosjzyuezd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796025.2418027-370-203739557060567/AnsiballZ_dnf.py'
Nov 22 07:20:25 compute-0 sudo[61791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:25 compute-0 python3.9[61793]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:20:27 compute-0 sudo[61791]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:28 compute-0 sudo[61944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btdjcdimwptjvfjuqtzrvichflebyucb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796027.8414128-403-58028951726337/AnsiballZ_setup.py'
Nov 22 07:20:28 compute-0 sudo[61944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:28 compute-0 python3.9[61946]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:20:28 compute-0 sudo[61944]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:28 compute-0 sudo[62098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhhvgspjfnylarbenoosnlavqbgkllve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796028.6980655-427-169050323780602/AnsiballZ_stat.py'
Nov 22 07:20:28 compute-0 sudo[62098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:29 compute-0 python3.9[62100]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:20:29 compute-0 sudo[62098]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:29 compute-0 sudo[62250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcxdpnbaaswxpptasmnsvfkmknbabvdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796029.5510108-454-42395398233036/AnsiballZ_stat.py'
Nov 22 07:20:29 compute-0 sudo[62250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:30 compute-0 python3.9[62252]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:20:30 compute-0 sudo[62250]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:30 compute-0 sudo[62402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcczwfuejalwyhvidkiiyidowfeyucwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796030.420412-484-100729184154659/AnsiballZ_command.py'
Nov 22 07:20:30 compute-0 sudo[62402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:30 compute-0 python3.9[62404]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:20:30 compute-0 sudo[62402]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:31 compute-0 sudo[62555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdkzhlkprecqqqibovihznjgzwlxwusg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796031.2281163-514-47608281403667/AnsiballZ_service_facts.py'
Nov 22 07:20:31 compute-0 sudo[62555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:31 compute-0 python3.9[62557]: ansible-service_facts Invoked
Nov 22 07:20:31 compute-0 network[62574]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 07:20:31 compute-0 network[62575]: 'network-scripts' will be removed from distribution in near future.
Nov 22 07:20:31 compute-0 network[62576]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 07:20:34 compute-0 sudo[62555]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:35 compute-0 sudo[62859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyxzojgznaqgwfgjkkrdzdvvhrybqkrn ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1763796035.528089-559-114141412125017/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1763796035.528089-559-114141412125017/args'
Nov 22 07:20:35 compute-0 sudo[62859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:35 compute-0 sudo[62859]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:36 compute-0 sudo[63026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctpewjdtbudjdvhzcgcnzdozabkpukji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796036.457118-592-175744880611099/AnsiballZ_dnf.py'
Nov 22 07:20:36 compute-0 sudo[63026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:36 compute-0 python3.9[63028]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:20:38 compute-0 sudo[63026]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:39 compute-0 sudo[63179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgwazuodezoheazvwloxmiaafbxibqrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796038.9753087-631-200798131579705/AnsiballZ_package_facts.py'
Nov 22 07:20:39 compute-0 sudo[63179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:39 compute-0 python3.9[63181]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 22 07:20:40 compute-0 sudo[63179]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:41 compute-0 sudo[63331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgiixoccdwjaztajcjrqmobgtntunmvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796040.7535691-661-128525106522856/AnsiballZ_stat.py'
Nov 22 07:20:41 compute-0 sudo[63331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:41 compute-0 python3.9[63333]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:20:41 compute-0 sudo[63331]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:41 compute-0 sudo[63456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdiwxfutxukssglmjaeeokxarorindjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796040.7535691-661-128525106522856/AnsiballZ_copy.py'
Nov 22 07:20:41 compute-0 sudo[63456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:41 compute-0 python3.9[63458]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796040.7535691-661-128525106522856/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:20:41 compute-0 sudo[63456]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:42 compute-0 sudo[63610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnsdzwtdwjzefqicnvheuepqrczmeewy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796042.18347-706-115767562892930/AnsiballZ_stat.py'
Nov 22 07:20:42 compute-0 sudo[63610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:42 compute-0 python3.9[63612]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:20:42 compute-0 sudo[63610]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:43 compute-0 sudo[63735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyhvduamfvfwkzdeduruqakiwwawjxmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796042.18347-706-115767562892930/AnsiballZ_copy.py'
Nov 22 07:20:43 compute-0 sudo[63735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:43 compute-0 python3.9[63737]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796042.18347-706-115767562892930/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:20:43 compute-0 sudo[63735]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:45 compute-0 sudo[63889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wopnzwuxzjdfdptkqwkaeqznksyixnst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796044.7253416-769-181612237335398/AnsiballZ_lineinfile.py'
Nov 22 07:20:45 compute-0 sudo[63889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:45 compute-0 python3.9[63891]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:20:45 compute-0 sudo[63889]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:46 compute-0 sudo[64043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lehheowzrrgjgcckizyiagfpqonvvdud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796046.485836-814-184529864114727/AnsiballZ_setup.py'
Nov 22 07:20:46 compute-0 sudo[64043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:47 compute-0 python3.9[64045]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:20:47 compute-0 sudo[64043]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:47 compute-0 sudo[64128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drsluuoxsyeksatcmukqiwdddzyrwpum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796046.485836-814-184529864114727/AnsiballZ_systemd.py'
Nov 22 07:20:47 compute-0 sudo[64128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:48 compute-0 python3.9[64130]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:20:48 compute-0 sudo[64128]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:49 compute-0 sudo[64282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srqcigkgtqbfrpujcqmzxjlykktqfgtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796049.1408088-862-279606429372338/AnsiballZ_setup.py'
Nov 22 07:20:49 compute-0 sudo[64282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:49 compute-0 python3.9[64284]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:20:49 compute-0 sudo[64282]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:50 compute-0 sudo[64366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lknonxsrsujdrifwzibkwhmxabhseogg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796049.1408088-862-279606429372338/AnsiballZ_systemd.py'
Nov 22 07:20:50 compute-0 sudo[64366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:50 compute-0 python3.9[64368]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:20:50 compute-0 chronyd[828]: chronyd exiting
Nov 22 07:20:50 compute-0 systemd[1]: Stopping NTP client/server...
Nov 22 07:20:50 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Nov 22 07:20:50 compute-0 systemd[1]: Stopped NTP client/server.
Nov 22 07:20:50 compute-0 systemd[1]: Starting NTP client/server...
Nov 22 07:20:50 compute-0 chronyd[64376]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 22 07:20:50 compute-0 chronyd[64376]: Frequency -23.619 +/- 0.171 ppm read from /var/lib/chrony/drift
Nov 22 07:20:50 compute-0 chronyd[64376]: Loaded seccomp filter (level 2)
Nov 22 07:20:50 compute-0 systemd[1]: Started NTP client/server.
Nov 22 07:20:50 compute-0 sudo[64366]: pam_unix(sudo:session): session closed for user root
Nov 22 07:20:51 compute-0 sshd-session[59520]: Connection closed by 192.168.122.30 port 39214
Nov 22 07:20:51 compute-0 sshd-session[59517]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:20:51 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Nov 22 07:20:51 compute-0 systemd[1]: session-13.scope: Consumed 24.982s CPU time.
Nov 22 07:20:51 compute-0 systemd-logind[821]: Session 13 logged out. Waiting for processes to exit.
Nov 22 07:20:51 compute-0 systemd-logind[821]: Removed session 13.
Nov 22 07:20:57 compute-0 sshd-session[64402]: Accepted publickey for zuul from 192.168.122.30 port 39834 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:20:57 compute-0 systemd-logind[821]: New session 14 of user zuul.
Nov 22 07:20:57 compute-0 systemd[1]: Started Session 14 of User zuul.
Nov 22 07:20:57 compute-0 sshd-session[64402]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:20:58 compute-0 python3.9[64555]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:20:59 compute-0 sudo[64709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmsbzkysyowddvdksdahknctorwpbiug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796058.892969-64-11190657810412/AnsiballZ_file.py'
Nov 22 07:20:59 compute-0 sudo[64709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:20:59 compute-0 python3.9[64711]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:20:59 compute-0 sudo[64709]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:00 compute-0 sudo[64884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewqebpsjipnsrysfjqiuaujyixatqhxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796059.819114-88-265058529772968/AnsiballZ_stat.py'
Nov 22 07:21:00 compute-0 sudo[64884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:00 compute-0 python3.9[64886]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:00 compute-0 sudo[64884]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:00 compute-0 sudo[64962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufdyhdgjmlkgcjrjeqtfjdolehshqvug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796059.819114-88-265058529772968/AnsiballZ_file.py'
Nov 22 07:21:00 compute-0 sudo[64962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:01 compute-0 python3.9[64964]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.4utvqomr recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:01 compute-0 sudo[64962]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:01 compute-0 sudo[65114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmaigddisubzjaeqmzdceyjzfablkngn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796061.6637127-148-238058406042878/AnsiballZ_stat.py'
Nov 22 07:21:01 compute-0 sudo[65114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:02 compute-0 python3.9[65116]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:02 compute-0 sudo[65114]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:02 compute-0 sudo[65237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzndqlukpskuzalsihxnbktmdvlklsqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796061.6637127-148-238058406042878/AnsiballZ_copy.py'
Nov 22 07:21:02 compute-0 sudo[65237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:02 compute-0 python3.9[65239]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796061.6637127-148-238058406042878/.source _original_basename=.34tw8_mc follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:02 compute-0 sudo[65237]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:03 compute-0 sudo[65389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaucynwexuuexuezlqojlcwexbsvxqbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796063.4360776-196-245944253068496/AnsiballZ_file.py'
Nov 22 07:21:03 compute-0 sudo[65389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:03 compute-0 python3.9[65391]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:21:03 compute-0 sudo[65389]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:04 compute-0 sudo[65541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwgpnbqysiuawlfnyqqgyyxtbzjbfmvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796064.0907576-220-45878616426119/AnsiballZ_stat.py'
Nov 22 07:21:04 compute-0 sudo[65541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:04 compute-0 python3.9[65543]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:04 compute-0 sudo[65541]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:04 compute-0 sudo[65664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkauybghusdhedezpurnljfvdwpvpaqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796064.0907576-220-45878616426119/AnsiballZ_copy.py'
Nov 22 07:21:04 compute-0 sudo[65664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:05 compute-0 python3.9[65666]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796064.0907576-220-45878616426119/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:21:05 compute-0 sudo[65664]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:05 compute-0 sudo[65816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iolbpvuvgzaqjfvojkeuzcnbgqprywxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796065.199882-220-98621826935506/AnsiballZ_stat.py'
Nov 22 07:21:05 compute-0 sudo[65816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:05 compute-0 python3.9[65818]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:05 compute-0 sudo[65816]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:06 compute-0 sudo[65939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyiyjwfdiumicujhqrpbjoqpusobpdkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796065.199882-220-98621826935506/AnsiballZ_copy.py'
Nov 22 07:21:06 compute-0 sudo[65939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:06 compute-0 python3.9[65941]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796065.199882-220-98621826935506/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:21:06 compute-0 sudo[65939]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:06 compute-0 sudo[66091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yijmtrrwkiaomocfbdztkiskyvurrdgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796066.6102076-307-195050450069390/AnsiballZ_file.py'
Nov 22 07:21:06 compute-0 sudo[66091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:07 compute-0 python3.9[66093]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:07 compute-0 sudo[66091]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:07 compute-0 sudo[66243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhadkfzcwfjkjchrrqkphmfmdbqrorxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796067.303211-331-222974843767584/AnsiballZ_stat.py'
Nov 22 07:21:07 compute-0 sudo[66243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:07 compute-0 python3.9[66245]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:07 compute-0 sudo[66243]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:08 compute-0 sudo[66366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzdsotkwnxgixnjibbdiyoznetaichzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796067.303211-331-222974843767584/AnsiballZ_copy.py'
Nov 22 07:21:08 compute-0 sudo[66366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:08 compute-0 python3.9[66368]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796067.303211-331-222974843767584/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:08 compute-0 sudo[66366]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:08 compute-0 sudo[66518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjjdthtnkfwybfmufcypxmavbhycxyeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796068.7372355-376-232078378906978/AnsiballZ_stat.py'
Nov 22 07:21:08 compute-0 sudo[66518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:09 compute-0 python3.9[66520]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:09 compute-0 sudo[66518]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:09 compute-0 sudo[66641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqzdejvnsblbhmifadulavqlowywjtna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796068.7372355-376-232078378906978/AnsiballZ_copy.py'
Nov 22 07:21:09 compute-0 sudo[66641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:09 compute-0 python3.9[66643]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796068.7372355-376-232078378906978/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:09 compute-0 sudo[66641]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:10 compute-0 sudo[66793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npftcubfcjfhfrvqzdvpxcptiafrdugu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796070.0321991-421-31066666429774/AnsiballZ_systemd.py'
Nov 22 07:21:10 compute-0 sudo[66793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:11 compute-0 python3.9[66795]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:21:11 compute-0 systemd[1]: Reloading.
Nov 22 07:21:11 compute-0 systemd-rc-local-generator[66819]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:21:11 compute-0 systemd-sysv-generator[66823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:21:11 compute-0 systemd[1]: Reloading.
Nov 22 07:21:11 compute-0 systemd-sysv-generator[66862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:21:11 compute-0 systemd-rc-local-generator[66859]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:21:11 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Nov 22 07:21:11 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Nov 22 07:21:11 compute-0 sudo[66793]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:12 compute-0 sudo[67019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxkhphhlwkrjkiccxtoxvorqoxzayaqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796071.824533-445-210306530620088/AnsiballZ_stat.py'
Nov 22 07:21:12 compute-0 sudo[67019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:12 compute-0 python3.9[67021]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:12 compute-0 sudo[67019]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:12 compute-0 sudo[67142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poskirvbkaxeapfrkxfedroxxxlecuel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796071.824533-445-210306530620088/AnsiballZ_copy.py'
Nov 22 07:21:12 compute-0 sudo[67142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:12 compute-0 python3.9[67144]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796071.824533-445-210306530620088/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:12 compute-0 sudo[67142]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:13 compute-0 sudo[67294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlohipmqtmjddbkcfcighmwafykjcgdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796073.0847564-490-264776571730474/AnsiballZ_stat.py'
Nov 22 07:21:13 compute-0 sudo[67294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:13 compute-0 python3.9[67296]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:13 compute-0 sudo[67294]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:13 compute-0 sudo[67417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npijrnuuvykzygezmreraolokqbiyncr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796073.0847564-490-264776571730474/AnsiballZ_copy.py'
Nov 22 07:21:13 compute-0 sudo[67417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:14 compute-0 python3.9[67419]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796073.0847564-490-264776571730474/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:14 compute-0 sudo[67417]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:14 compute-0 sudo[67569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erednumefjkcqgcytrlneudkmgngnthl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796074.3832624-535-248155615057015/AnsiballZ_systemd.py'
Nov 22 07:21:14 compute-0 sudo[67569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:14 compute-0 python3.9[67571]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:21:14 compute-0 systemd[1]: Reloading.
Nov 22 07:21:15 compute-0 systemd-rc-local-generator[67598]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:21:15 compute-0 systemd-sysv-generator[67602]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:21:15 compute-0 systemd[1]: Reloading.
Nov 22 07:21:15 compute-0 systemd-sysv-generator[67641]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:21:15 compute-0 systemd-rc-local-generator[67638]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:21:15 compute-0 systemd[1]: Starting Create netns directory...
Nov 22 07:21:15 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 07:21:15 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 07:21:15 compute-0 systemd[1]: Finished Create netns directory.
Nov 22 07:21:15 compute-0 sudo[67569]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:16 compute-0 python3.9[67797]: ansible-ansible.builtin.service_facts Invoked
Nov 22 07:21:16 compute-0 network[67814]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 07:21:16 compute-0 network[67815]: 'network-scripts' will be removed from distribution in near future.
Nov 22 07:21:16 compute-0 network[67816]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 07:21:20 compute-0 sudo[68076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhmveuqpgbynqsdushmnbczchweabaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796080.3177724-583-194939339851913/AnsiballZ_systemd.py'
Nov 22 07:21:20 compute-0 sudo[68076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:20 compute-0 python3.9[68078]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:21:20 compute-0 systemd[1]: Reloading.
Nov 22 07:21:21 compute-0 systemd-sysv-generator[68112]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:21:21 compute-0 systemd-rc-local-generator[68109]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:21:21 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 22 07:21:21 compute-0 iptables.init[68119]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 22 07:21:21 compute-0 iptables.init[68119]: iptables: Flushing firewall rules: [  OK  ]
Nov 22 07:21:21 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Nov 22 07:21:21 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 22 07:21:21 compute-0 sudo[68076]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:21 compute-0 sudo[68313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjgjiomrcygvfzxfjaycbqitquflstrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796081.7124345-583-191635470725724/AnsiballZ_systemd.py'
Nov 22 07:21:21 compute-0 sudo[68313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:22 compute-0 python3.9[68315]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:21:22 compute-0 sudo[68313]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:23 compute-0 sudo[68467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taygpqonflpsjgjrezmybyebqilihxsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796082.7783544-631-277899468815019/AnsiballZ_systemd.py'
Nov 22 07:21:23 compute-0 sudo[68467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:23 compute-0 python3.9[68469]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:21:23 compute-0 systemd[1]: Reloading.
Nov 22 07:21:23 compute-0 systemd-rc-local-generator[68499]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:21:23 compute-0 systemd-sysv-generator[68502]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:21:23 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 22 07:21:23 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 22 07:21:23 compute-0 sudo[68467]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:24 compute-0 sudo[68659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvvhdwuwqgnpycxnekiuyqsonfmvwuza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796083.9483495-655-73615735389368/AnsiballZ_command.py'
Nov 22 07:21:24 compute-0 sudo[68659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:24 compute-0 python3.9[68661]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:21:24 compute-0 sudo[68659]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:25 compute-0 sudo[68812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qehlxlywqegvidyemdhtepvzkqajztzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796085.2485049-697-104961087717879/AnsiballZ_stat.py'
Nov 22 07:21:25 compute-0 sudo[68812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:25 compute-0 python3.9[68814]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:25 compute-0 sudo[68812]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:26 compute-0 sudo[68937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-razynbwsxwuptduvgtykrzdqmrnvdkdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796085.2485049-697-104961087717879/AnsiballZ_copy.py'
Nov 22 07:21:26 compute-0 sudo[68937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:26 compute-0 python3.9[68939]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796085.2485049-697-104961087717879/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:26 compute-0 sudo[68937]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:26 compute-0 sudo[69090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejrsisxxhskrruzyfbsxelpxrxnkawqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796086.5934384-742-55607077669956/AnsiballZ_systemd.py'
Nov 22 07:21:26 compute-0 sudo[69090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:27 compute-0 python3.9[69092]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:21:27 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Nov 22 07:21:27 compute-0 sshd[1009]: Received SIGHUP; restarting.
Nov 22 07:21:27 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Nov 22 07:21:27 compute-0 sshd[1009]: Server listening on 0.0.0.0 port 22.
Nov 22 07:21:27 compute-0 sshd[1009]: Server listening on :: port 22.
Nov 22 07:21:27 compute-0 sudo[69090]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:27 compute-0 sudo[69246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vslgfffqwranprtceyrpjiersznnyjab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796087.59658-766-82612136722730/AnsiballZ_file.py'
Nov 22 07:21:27 compute-0 sudo[69246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:28 compute-0 python3.9[69248]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:28 compute-0 sudo[69246]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:28 compute-0 sudo[69398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-decldfuplhrnilgvbwqgsmxzrradzodv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796088.3036375-790-137578492057337/AnsiballZ_stat.py'
Nov 22 07:21:28 compute-0 sudo[69398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:28 compute-0 python3.9[69400]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:28 compute-0 sudo[69398]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:29 compute-0 sudo[69521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udkirjjsminaxxouuaueudfgoifiuxfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796088.3036375-790-137578492057337/AnsiballZ_copy.py'
Nov 22 07:21:29 compute-0 sudo[69521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:29 compute-0 python3.9[69523]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796088.3036375-790-137578492057337/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:29 compute-0 sudo[69521]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:30 compute-0 sudo[69673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uitjhheytuqhmzrnxhozlkqbkgzjanyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796089.954294-844-123989079875805/AnsiballZ_timezone.py'
Nov 22 07:21:30 compute-0 sudo[69673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:30 compute-0 python3.9[69675]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 22 07:21:30 compute-0 systemd[1]: Starting Time & Date Service...
Nov 22 07:21:30 compute-0 systemd[1]: Started Time & Date Service.
Nov 22 07:21:30 compute-0 sudo[69673]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:31 compute-0 sudo[69829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrxfnbyqrogzzinjuqgzzzsqstexxzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796091.1480107-871-39137727510770/AnsiballZ_file.py'
Nov 22 07:21:31 compute-0 sudo[69829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:31 compute-0 python3.9[69831]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:31 compute-0 sudo[69829]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:32 compute-0 sudo[69981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxybawmgmlapfjbdsmsgvszegzzuhmpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796091.8807018-895-200946763699825/AnsiballZ_stat.py'
Nov 22 07:21:32 compute-0 sudo[69981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:32 compute-0 python3.9[69983]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:32 compute-0 sudo[69981]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:32 compute-0 sudo[70104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwjzqdllyqkeeueexkhomoedrermjjwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796091.8807018-895-200946763699825/AnsiballZ_copy.py'
Nov 22 07:21:32 compute-0 sudo[70104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:32 compute-0 python3.9[70106]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796091.8807018-895-200946763699825/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:32 compute-0 sudo[70104]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:33 compute-0 sudo[70256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllxefwxmmbnpyzagzbgksrdfwhsench ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796093.2022529-940-280230136792865/AnsiballZ_stat.py'
Nov 22 07:21:33 compute-0 sudo[70256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:33 compute-0 python3.9[70258]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:33 compute-0 sudo[70256]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:34 compute-0 sudo[70379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iouocrgjalogbwfgjoxmraxobwdpdnvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796093.2022529-940-280230136792865/AnsiballZ_copy.py'
Nov 22 07:21:34 compute-0 sudo[70379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:34 compute-0 python3.9[70381]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796093.2022529-940-280230136792865/.source.yaml _original_basename=.0d6n8meg follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:34 compute-0 sudo[70379]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:35 compute-0 sudo[70531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brdsszhisxwtcgwjkfhsjxmsxxslzxit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796095.1125968-985-120706099892755/AnsiballZ_stat.py'
Nov 22 07:21:35 compute-0 sudo[70531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:35 compute-0 python3.9[70533]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:35 compute-0 sudo[70531]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:35 compute-0 sudo[70654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bshoflszxsrqmbgrkrotkplhnrbsyxiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796095.1125968-985-120706099892755/AnsiballZ_copy.py'
Nov 22 07:21:35 compute-0 sudo[70654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:36 compute-0 python3.9[70656]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796095.1125968-985-120706099892755/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:36 compute-0 sudo[70654]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:36 compute-0 sudo[70806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mksditosldkudkkkfheedazyrmptcbcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796096.4445686-1030-135677918658769/AnsiballZ_command.py'
Nov 22 07:21:36 compute-0 sudo[70806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:36 compute-0 python3.9[70808]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:21:36 compute-0 sudo[70806]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:37 compute-0 sudo[70959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gchkrpawuozkejvabpfbnzoabufucalq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796097.1842403-1054-46916110161227/AnsiballZ_command.py'
Nov 22 07:21:37 compute-0 sudo[70959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:37 compute-0 python3.9[70961]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:21:37 compute-0 sudo[70959]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:38 compute-0 sudo[71112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vclxkxboalgkelhocagampfqkzvqacjw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796097.8954666-1078-202706250067696/AnsiballZ_edpm_nftables_from_files.py'
Nov 22 07:21:38 compute-0 sudo[71112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:38 compute-0 python3[71114]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 07:21:38 compute-0 sudo[71112]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:39 compute-0 sudo[71264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usxwmdbkoaoqzlnrrzsnmzufypjibwik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796098.777973-1102-120328657095897/AnsiballZ_stat.py'
Nov 22 07:21:39 compute-0 sudo[71264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:39 compute-0 python3.9[71266]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:39 compute-0 sudo[71264]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:39 compute-0 sudo[71387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hguvfpmnrjtydjyrosnuoigpevtnhadh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796098.777973-1102-120328657095897/AnsiballZ_copy.py'
Nov 22 07:21:39 compute-0 sudo[71387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:39 compute-0 python3.9[71389]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796098.777973-1102-120328657095897/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:39 compute-0 sudo[71387]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:40 compute-0 sudo[71539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cniprwkbsnjkvibbbybjactiijdcjbwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796100.1648653-1147-89645835231244/AnsiballZ_stat.py'
Nov 22 07:21:40 compute-0 sudo[71539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:40 compute-0 python3.9[71541]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:40 compute-0 sudo[71539]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:41 compute-0 sudo[71662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzbgqhkgmtsossamxxovyjmfrsiqtjxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796100.1648653-1147-89645835231244/AnsiballZ_copy.py'
Nov 22 07:21:41 compute-0 sudo[71662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:41 compute-0 python3.9[71664]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796100.1648653-1147-89645835231244/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:41 compute-0 sudo[71662]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:41 compute-0 sudo[71814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llcouvljeatxqcpntvjxoouawqorazid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796101.5439894-1192-24225707788589/AnsiballZ_stat.py'
Nov 22 07:21:41 compute-0 sudo[71814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:42 compute-0 python3.9[71816]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:42 compute-0 sudo[71814]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:42 compute-0 sudo[71937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmpoaiqjnhomhknfjvepvnlkmtvkdwnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796101.5439894-1192-24225707788589/AnsiballZ_copy.py'
Nov 22 07:21:42 compute-0 sudo[71937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:42 compute-0 python3.9[71939]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796101.5439894-1192-24225707788589/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:42 compute-0 sudo[71937]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:43 compute-0 sudo[72089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlplksnkdybgwfhqrejlhnwztqrgavsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796103.33634-1237-274641562346239/AnsiballZ_stat.py'
Nov 22 07:21:43 compute-0 sudo[72089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:43 compute-0 python3.9[72091]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:43 compute-0 sudo[72089]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:44 compute-0 sudo[72212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-almxwmgfgquqaastqvqpftgwrgjebsbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796103.33634-1237-274641562346239/AnsiballZ_copy.py'
Nov 22 07:21:44 compute-0 sudo[72212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:44 compute-0 python3.9[72214]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796103.33634-1237-274641562346239/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:44 compute-0 sudo[72212]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:45 compute-0 sudo[72364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbghoqoicwoeauvpbbkltrwtezaolefw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796104.60397-1282-168056235169096/AnsiballZ_stat.py'
Nov 22 07:21:45 compute-0 sudo[72364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:45 compute-0 python3.9[72366]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:21:45 compute-0 sudo[72364]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:45 compute-0 sudo[72487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujmlzeoldztvoozjfaazvhrntdtsyhxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796104.60397-1282-168056235169096/AnsiballZ_copy.py'
Nov 22 07:21:45 compute-0 sudo[72487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:45 compute-0 python3.9[72489]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796104.60397-1282-168056235169096/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:45 compute-0 sudo[72487]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:46 compute-0 sudo[72639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdeughrplxuaoxckwloexrphexrdlijq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796106.1096237-1327-197484231863060/AnsiballZ_file.py'
Nov 22 07:21:46 compute-0 sudo[72639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:46 compute-0 python3.9[72641]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:46 compute-0 sudo[72639]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:47 compute-0 sudo[72791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owcyhzhcchlemlzlodmbvspyzxtzpjaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796106.8534298-1351-43499315356928/AnsiballZ_command.py'
Nov 22 07:21:47 compute-0 sudo[72791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:47 compute-0 python3.9[72793]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:21:47 compute-0 sudo[72791]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:48 compute-0 sudo[72950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbhibschfejidswduwxddtcbkgseeipl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796107.633736-1375-112866559183028/AnsiballZ_blockinfile.py'
Nov 22 07:21:48 compute-0 sudo[72950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:48 compute-0 python3.9[72952]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:48 compute-0 sudo[72950]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:49 compute-0 sudo[73103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwrjfmaylizjyvpkpvilhfpwerqjnavp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796109.1617565-1402-63861835404191/AnsiballZ_file.py'
Nov 22 07:21:49 compute-0 sudo[73103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:49 compute-0 python3.9[73105]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:49 compute-0 sudo[73103]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:50 compute-0 sudo[73255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrfqoxbdwdmlsqgjmmplkkazqmsymaqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796109.8020105-1402-109472053724730/AnsiballZ_file.py'
Nov 22 07:21:50 compute-0 sudo[73255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:50 compute-0 python3.9[73257]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:21:50 compute-0 sudo[73255]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:50 compute-0 sudo[73407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsyfczvbdsvrsfaxljwprbizzukujfiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796110.498328-1447-154114122237143/AnsiballZ_mount.py'
Nov 22 07:21:50 compute-0 sudo[73407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:51 compute-0 python3.9[73409]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 22 07:21:51 compute-0 sudo[73407]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:51 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:21:51 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:21:51 compute-0 sudo[73561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crxeruciwnzbyaxlgpwrgnmkllynmybh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796111.3673909-1447-18541519248199/AnsiballZ_mount.py'
Nov 22 07:21:51 compute-0 sudo[73561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:51 compute-0 python3.9[73563]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 22 07:21:51 compute-0 sudo[73561]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:52 compute-0 sshd-session[64405]: Connection closed by 192.168.122.30 port 39834
Nov 22 07:21:52 compute-0 sshd-session[64402]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:21:52 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Nov 22 07:21:52 compute-0 systemd[1]: session-14.scope: Consumed 33.922s CPU time.
Nov 22 07:21:52 compute-0 systemd-logind[821]: Session 14 logged out. Waiting for processes to exit.
Nov 22 07:21:52 compute-0 systemd-logind[821]: Removed session 14.
Nov 22 07:21:57 compute-0 sshd-session[73589]: Accepted publickey for zuul from 192.168.122.30 port 37756 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:21:57 compute-0 systemd-logind[821]: New session 15 of user zuul.
Nov 22 07:21:58 compute-0 systemd[1]: Started Session 15 of User zuul.
Nov 22 07:21:58 compute-0 sshd-session[73589]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:21:58 compute-0 sudo[73742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmruuvirhgjackeigrvkmrfudriegyxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796118.1113021-23-96444631869626/AnsiballZ_tempfile.py'
Nov 22 07:21:58 compute-0 sudo[73742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:58 compute-0 python3.9[73744]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 22 07:21:58 compute-0 sudo[73742]: pam_unix(sudo:session): session closed for user root
Nov 22 07:21:59 compute-0 sudo[73894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arvwaglkjorabrpcobhxuvclmmbggfuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796118.995717-59-67979155332141/AnsiballZ_stat.py'
Nov 22 07:21:59 compute-0 sudo[73894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:21:59 compute-0 python3.9[73896]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:21:59 compute-0 sudo[73894]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:00 compute-0 sudo[74046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntzlpldjrzqyxnfthbjjuiavrkvkwhcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796119.8570538-89-149883566647100/AnsiballZ_setup.py'
Nov 22 07:22:00 compute-0 sudo[74046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:00 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 07:22:00 compute-0 python3.9[74048]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:22:00 compute-0 sudo[74046]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:01 compute-0 sudo[74200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtmrsecvstxviobunxhljclwxemletqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796121.0341444-114-140370198560351/AnsiballZ_blockinfile.py'
Nov 22 07:22:01 compute-0 sudo[74200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:01 compute-0 python3.9[74202]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmnpAkzBA+/P5ygAqTpHSo/yxyshcDXOqGY2sZ6+LmKpfF/U/3puURRCYPFHLvU6Fe2oRGY6GwNjK7ej5idUzOOTgf6eMc2MfuxlwwYk9lQWXXYu3BIFbZTa/Jz2j3Jd5KpxE11/bi7aYfn5u+oXd0Q+EgbyaX14S6EGKPujybZZbWbPUjXyNBIpHDRP3QOvtmf0oXpNj7FZ/+eQ5okb2AzQeflovexeLh5/TrUuMpBgxJC+IT5bDgtr3scwyEN7Su9iQQos2qnNIIzuFTAJrbao4uS5RsC+rRO10O4Z+2p8nWhQuSG2tQ63gvUhaXg8h1KFhHYfclNow/Nzxq1rSASWv2iNeUsoDWgxH7Yq3GPbGEofld095ADvo32HdVYHmdYEaD9GLY7WKHW6ilz14vUYQ6cN6XZoli1rdTt1Z/UWpQSy64npnbT3IGeztmD2KPGZP2laTkFkxzTh7m0Dz2sBx1rbfhQV8SjNw0ZkeSV+G3sqXWqozNXMvk1k7Ma/8=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPqC7967YjYmXjy5Y1Atr1idIuJEqYVlUbJ/ivnjEtjv
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJYSJRTKIRJHGorxvpDox6ZIDiNrie6EQnECMuD5IFEY7kEn/cP5JLTUpe4kf0aZt1r5R4WnwY6StRedSzkyRk0=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCowOVwCxoeDeu/gjiGxT6DsxjadsI6OsklC9oYH1562wrbeZDtXV2FXgAB8clz6v5hIpHsJBOPMHniNRFVQwlu7A3igwl6rkisIR380P+Ttep7r2pEz15KdpK86MS3svcPZn5qKpfnr+3JUxX9Kt71rH4jGzpDSQCFB/vJPmodINZL7o8vaTg1Gz0vkf+zJlmQjq3fUKVrInLbL6hPyuV8pXqtw3q+JYCIrXJlHDFPOngM4PsGnOJL6j9PaOEdRXK30tQNQlzko6lfntblufy8mAb/o9Sn1ulCIbI1nIJqkTVm9aK31C4nWSPumTQ9GLdi0dvultCwMbw0ym7pzFAWlxrsx4V9GRz2yqAPLbNwEFoaA42ScSLnQpq+Y3747tGiT5jdKz2AyCBa6sN43tUXKR/mtjBpXXoOsCvgvzvnlul+TRmjoju2jFsL05dlNImskQ1UwAn5iIr+7TvzDF03jeQYani/6aykV0z4KJyt0VneL9fYnSlSZ7dnpTfkYgc=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII6yEJELXMtPQkFh9QeTL6LtFdllgEtcCx/vTvD4VQRD
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBENm8i1piornPA+bu3lQ1gnDuQ+S2zp/iE9MvAGxNHKKvA3MMS333GJWFx+BV6kDFZ9hDTDj/kimvzvpu8lziNQ=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDQtcPDTu5W9vsjUBaPvvVOkaIA7MPqUmOieWOqa7ySB03c8aREkaNDH3Zlp68jXwdC1Qpw0/2EGQ83sfaSlvG6XSE4QBwVDoOMe7GzTY8agr/ZZOIedAz8v04HH0OpnsD0tqLQlZZ0nuBJ7UM7iP5PTbc7O1Z2n35+F+XTqiKfSmsCSxhnwJhgyZBKS0HJUIsvQoVw1N699OnanMSweTImsEURAEEsL3zrVM9Qa/uw2XH5LTuU9kXzfqKNgy/5VXcEbamLe+cPbFPKDc8ei1sCASL3xDbyGriLdNKOiSjytc5GTcG0eg5aHmBxz1/KWYAf9JCs/xAGk0Nifft+xlfC1OwkiPBCHsUlfVWzERxno/lVQQGvrNTgMZ1G/lJwuhRYCWlScgfADkcZSitTszGd/qlunDx3biSKRE1RnACqaNsF0Hum0S7m6d8wxj6TNoD618+lN82HqRhRMhrVQ+hxQySpHEXSWTdhfVLND+2neEL9hyT0cCzB33FvD+YacRk=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIBd7JSklzBfXUPIvKiAxXVL//OQf5r0dI648cExbdgs
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAE41g4YqzC3bfy1t/lRYP5p85+7h3wD8DzLbz0LtdbkROkWg/OHzC73WNbkqdHKqwacHfch6fbycv9mIDE73cM=
                                             create=True mode=0644 path=/tmp/ansible.2bnlpq54 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:01 compute-0 sudo[74200]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:02 compute-0 sudo[74352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmlugksyhmnwczbskcblqfxowgtzllwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796121.7843971-138-169674817450389/AnsiballZ_command.py'
Nov 22 07:22:02 compute-0 sudo[74352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:02 compute-0 python3.9[74354]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.2bnlpq54' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:22:02 compute-0 sudo[74352]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:03 compute-0 sudo[74506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgsejqmvfzdkasingaskjjknpxydkcie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796122.5980384-162-268827584263194/AnsiballZ_file.py'
Nov 22 07:22:03 compute-0 sudo[74506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:03 compute-0 python3.9[74508]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.2bnlpq54 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:03 compute-0 sudo[74506]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:03 compute-0 sshd-session[73592]: Connection closed by 192.168.122.30 port 37756
Nov 22 07:22:03 compute-0 sshd-session[73589]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:22:03 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Nov 22 07:22:03 compute-0 systemd[1]: session-15.scope: Consumed 3.199s CPU time.
Nov 22 07:22:03 compute-0 systemd-logind[821]: Session 15 logged out. Waiting for processes to exit.
Nov 22 07:22:03 compute-0 systemd-logind[821]: Removed session 15.
Nov 22 07:22:09 compute-0 sshd-session[74533]: Accepted publickey for zuul from 192.168.122.30 port 34044 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:22:09 compute-0 systemd-logind[821]: New session 16 of user zuul.
Nov 22 07:22:09 compute-0 systemd[1]: Started Session 16 of User zuul.
Nov 22 07:22:09 compute-0 sshd-session[74533]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:22:10 compute-0 python3.9[74686]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:22:11 compute-0 sudo[74840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jywujuwvllntgakpyraobgygtromequw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796130.9179451-61-81353441991333/AnsiballZ_systemd.py'
Nov 22 07:22:11 compute-0 sudo[74840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:11 compute-0 python3.9[74842]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 22 07:22:11 compute-0 sudo[74840]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:12 compute-0 sudo[74994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnpojwvsxsikgrbaduznpensgcgftave ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796132.125826-85-174128758360128/AnsiballZ_systemd.py'
Nov 22 07:22:12 compute-0 sudo[74994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:12 compute-0 python3.9[74996]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:22:12 compute-0 sudo[74994]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:13 compute-0 sudo[75147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egywfwkyxhyjufnirripeqspibscwjtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796133.0401692-112-268240652015545/AnsiballZ_command.py'
Nov 22 07:22:13 compute-0 sudo[75147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:13 compute-0 python3.9[75149]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:22:13 compute-0 sudo[75147]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:14 compute-0 sudo[75300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwlpuntvndtklellmebaahrnjiowpcgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796133.8646128-136-18139503573088/AnsiballZ_stat.py'
Nov 22 07:22:14 compute-0 sudo[75300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:14 compute-0 python3.9[75302]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:22:14 compute-0 sudo[75300]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:14 compute-0 sudo[75454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uviuttknqcaajjrwuxzebethyimffhxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796134.7199302-160-203082707198100/AnsiballZ_command.py'
Nov 22 07:22:14 compute-0 sudo[75454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:15 compute-0 python3.9[75456]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:22:15 compute-0 sudo[75454]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:15 compute-0 sudo[75609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzglttifzvfqctclsfpwsqvbcnaoqfpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796135.4508574-184-83118021302060/AnsiballZ_file.py'
Nov 22 07:22:15 compute-0 sudo[75609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:16 compute-0 python3.9[75611]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:16 compute-0 sudo[75609]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:16 compute-0 sshd-session[74536]: Connection closed by 192.168.122.30 port 34044
Nov 22 07:22:16 compute-0 sshd-session[74533]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:22:16 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Nov 22 07:22:16 compute-0 systemd[1]: session-16.scope: Consumed 4.199s CPU time.
Nov 22 07:22:16 compute-0 systemd-logind[821]: Session 16 logged out. Waiting for processes to exit.
Nov 22 07:22:16 compute-0 systemd-logind[821]: Removed session 16.
Nov 22 07:22:21 compute-0 sshd-session[75636]: Accepted publickey for zuul from 192.168.122.30 port 51078 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:22:21 compute-0 systemd-logind[821]: New session 17 of user zuul.
Nov 22 07:22:21 compute-0 systemd[1]: Started Session 17 of User zuul.
Nov 22 07:22:21 compute-0 sshd-session[75636]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:22:22 compute-0 python3.9[75789]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:22:23 compute-0 sudo[75943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngvtrydgqwpnmtkxfjmijilubvmlvkvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796143.4566584-67-91939101231193/AnsiballZ_setup.py'
Nov 22 07:22:23 compute-0 sudo[75943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:24 compute-0 python3.9[75945]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:22:24 compute-0 sudo[75943]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:24 compute-0 sudo[76027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwlzlafifunmrlfkgmrzbrmhzgdxgzga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796143.4566584-67-91939101231193/AnsiballZ_dnf.py'
Nov 22 07:22:24 compute-0 sudo[76027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:24 compute-0 python3.9[76029]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 22 07:22:26 compute-0 sudo[76027]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:27 compute-0 python3.9[76180]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:22:28 compute-0 python3.9[76331]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 07:22:29 compute-0 python3.9[76481]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:22:30 compute-0 python3.9[76631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:22:30 compute-0 sshd-session[75639]: Connection closed by 192.168.122.30 port 51078
Nov 22 07:22:30 compute-0 sshd-session[75636]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:22:30 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Nov 22 07:22:30 compute-0 systemd[1]: session-17.scope: Consumed 5.695s CPU time.
Nov 22 07:22:30 compute-0 systemd-logind[821]: Session 17 logged out. Waiting for processes to exit.
Nov 22 07:22:30 compute-0 systemd-logind[821]: Removed session 17.
Nov 22 07:22:36 compute-0 sshd-session[76656]: Accepted publickey for zuul from 192.168.122.30 port 60106 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:22:36 compute-0 systemd-logind[821]: New session 18 of user zuul.
Nov 22 07:22:36 compute-0 systemd[1]: Started Session 18 of User zuul.
Nov 22 07:22:36 compute-0 sshd-session[76656]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:22:37 compute-0 python3.9[76809]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:22:39 compute-0 sudo[76963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnethhqvgmazniycwrxerwjhivnhirri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796159.0890508-115-156631266716173/AnsiballZ_file.py'
Nov 22 07:22:39 compute-0 sudo[76963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:39 compute-0 python3.9[76965]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:22:39 compute-0 sudo[76963]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:40 compute-0 sudo[77115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtpeipmzqdajwwgfvhhdesscsdxbvvgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796159.93972-115-59607621018483/AnsiballZ_file.py'
Nov 22 07:22:40 compute-0 sudo[77115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:40 compute-0 python3.9[77117]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:22:40 compute-0 sudo[77115]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:41 compute-0 sudo[77267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdrxdzbrjmiwmrkckqeipesywucvuamb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796160.5654254-159-130375552078610/AnsiballZ_stat.py'
Nov 22 07:22:41 compute-0 sudo[77267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:41 compute-0 python3.9[77269]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:41 compute-0 sudo[77267]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:41 compute-0 sudo[77390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzfcocdmnofhqzfiniqoicjbolnxmgbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796160.5654254-159-130375552078610/AnsiballZ_copy.py'
Nov 22 07:22:41 compute-0 sudo[77390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:41 compute-0 python3.9[77392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796160.5654254-159-130375552078610/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=908d24b1d4f07146150f149c1d59fda4c9079c90 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:41 compute-0 sudo[77390]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:42 compute-0 sudo[77542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbcatjjtqpheeadiuvatqqgiffvfvsfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796162.0859306-159-267049304186269/AnsiballZ_stat.py'
Nov 22 07:22:42 compute-0 sudo[77542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:42 compute-0 python3.9[77544]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:42 compute-0 sudo[77542]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:42 compute-0 sudo[77665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpqwlzstuzaooeempwtrtdehbuboaqjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796162.0859306-159-267049304186269/AnsiballZ_copy.py'
Nov 22 07:22:42 compute-0 sudo[77665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:43 compute-0 python3.9[77667]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796162.0859306-159-267049304186269/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=16bd60fbc24423e3fc1bbc9e201827083d9b5e39 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:43 compute-0 sudo[77665]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:43 compute-0 sudo[77817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqgvwvclcskodecltgasuhoxqeoyrbjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796163.1704159-159-241656614399500/AnsiballZ_stat.py'
Nov 22 07:22:43 compute-0 sudo[77817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:43 compute-0 python3.9[77819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:43 compute-0 sudo[77817]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:43 compute-0 sudo[77940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaodjtqqsnfbxewcokeybrucqhnqqnzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796163.1704159-159-241656614399500/AnsiballZ_copy.py'
Nov 22 07:22:43 compute-0 sudo[77940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:44 compute-0 python3.9[77942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796163.1704159-159-241656614399500/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=a6ec447d1121e4c48ab5cd09adee01977c931bd0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:44 compute-0 sshd-session[77943]: Connection closed by 180.163.31.56 port 23104
Nov 22 07:22:44 compute-0 sudo[77940]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:44 compute-0 sudo[78094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auusydpvixxmxmqopcffgnisbuizyidf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796164.372685-288-197424306200105/AnsiballZ_file.py'
Nov 22 07:22:44 compute-0 sudo[78094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:44 compute-0 python3.9[78096]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:22:44 compute-0 sudo[78094]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:45 compute-0 sudo[78246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyekpqvxpaorgtcwfanvtrcfgyceyhoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796164.9723504-288-60205791622327/AnsiballZ_file.py'
Nov 22 07:22:45 compute-0 sudo[78246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:45 compute-0 python3.9[78248]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:22:45 compute-0 sudo[78246]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:45 compute-0 sudo[78399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcfpgeyvkpsacnrtalscmdefqzjlrkix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796165.656553-334-213258132614523/AnsiballZ_stat.py'
Nov 22 07:22:45 compute-0 sudo[78399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:46 compute-0 python3.9[78401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:46 compute-0 sudo[78399]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:46 compute-0 sudo[78522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiglwrdijmbnmmfzngwlcyziductqess ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796165.656553-334-213258132614523/AnsiballZ_copy.py'
Nov 22 07:22:46 compute-0 sudo[78522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:47 compute-0 python3.9[78524]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796165.656553-334-213258132614523/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=fc71d3a5346f8a84de8225733157f4ce336ba6cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:47 compute-0 sudo[78522]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:47 compute-0 sudo[78674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzdvqeiqbpazafgmjqtfccwltvvmzgxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796167.1703918-334-222226443268031/AnsiballZ_stat.py'
Nov 22 07:22:47 compute-0 sudo[78674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:47 compute-0 python3.9[78676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:47 compute-0 sudo[78674]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:47 compute-0 sudo[78797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oadgfvbhwxfurbvrsetkjtwgadamclhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796167.1703918-334-222226443268031/AnsiballZ_copy.py'
Nov 22 07:22:47 compute-0 sudo[78797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:48 compute-0 python3.9[78799]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796167.1703918-334-222226443268031/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8ad07d9f15fb881d541cc871f705c812e1318a58 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:48 compute-0 sudo[78797]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:48 compute-0 sudo[78949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvmcwrpsjzxxxdsitkfdikcggkmffcuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796168.3436193-334-2392822071804/AnsiballZ_stat.py'
Nov 22 07:22:48 compute-0 sudo[78949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:48 compute-0 python3.9[78951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:48 compute-0 sudo[78949]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:49 compute-0 sudo[79072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncqnniopiqqojilmjsbvmcwbhmuyrypq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796168.3436193-334-2392822071804/AnsiballZ_copy.py'
Nov 22 07:22:49 compute-0 sudo[79072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:49 compute-0 python3.9[79074]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796168.3436193-334-2392822071804/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=1feb348d0a4ed90120bc5574dc5e63d6a2af2a82 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:49 compute-0 sshd-session[77992]: Invalid user a from 180.163.31.56 port 23118
Nov 22 07:22:49 compute-0 sudo[79072]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:49 compute-0 sshd-session[77992]: Connection closed by invalid user a 180.163.31.56 port 23118 [preauth]
Nov 22 07:22:49 compute-0 sudo[79224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-accayjjvumtmmjnxzvqyqmfuqdhnngbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796169.6360118-474-280277158011328/AnsiballZ_file.py'
Nov 22 07:22:49 compute-0 sudo[79224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:50 compute-0 python3.9[79226]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:22:50 compute-0 sudo[79224]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:50 compute-0 sudo[79376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgjnuszavvlnkuhwdxnuwmemqdhxbret ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796170.2273507-474-189561299505172/AnsiballZ_file.py'
Nov 22 07:22:50 compute-0 sudo[79376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:50 compute-0 python3.9[79378]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:22:50 compute-0 sudo[79376]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:51 compute-0 sudo[79528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yntrqltlclrfcwzfddptsyjycrhbugmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796170.8497474-518-236881696420690/AnsiballZ_stat.py'
Nov 22 07:22:51 compute-0 sudo[79528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:51 compute-0 python3.9[79530]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:51 compute-0 sudo[79528]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:51 compute-0 sudo[79651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzykfiaczqucietupwecganpdmjiblnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796170.8497474-518-236881696420690/AnsiballZ_copy.py'
Nov 22 07:22:51 compute-0 sudo[79651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:51 compute-0 python3.9[79653]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796170.8497474-518-236881696420690/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=b7226ad2e92f08c99ce16005cce8df66824e9b63 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:51 compute-0 sudo[79651]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:52 compute-0 sudo[79803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prbzrwzasydalvlnshshsvujgiihcnal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796171.9620554-518-264505880789917/AnsiballZ_stat.py'
Nov 22 07:22:52 compute-0 sudo[79803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:52 compute-0 python3.9[79805]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:52 compute-0 sudo[79803]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:52 compute-0 sudo[79926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlvphwiayrzdaruwyfuhhzplryepktqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796171.9620554-518-264505880789917/AnsiballZ_copy.py'
Nov 22 07:22:52 compute-0 sudo[79926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:52 compute-0 python3.9[79928]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796171.9620554-518-264505880789917/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=ab92e79a33a6e2fca5144cd0532be918fe14e6b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:53 compute-0 sudo[79926]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:53 compute-0 sudo[80078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcprrxqulgptbkogceegleiekajktbcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796173.1402729-518-106623309734425/AnsiballZ_stat.py'
Nov 22 07:22:53 compute-0 sudo[80078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:53 compute-0 python3.9[80080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:53 compute-0 sudo[80078]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:54 compute-0 sudo[80201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzdoqhmarmzqotyukrfrnzuuxfngerau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796173.1402729-518-106623309734425/AnsiballZ_copy.py'
Nov 22 07:22:54 compute-0 sudo[80201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:54 compute-0 python3.9[80203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796173.1402729-518-106623309734425/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=4fdb886e8e59bb6279a99d33ac2fe1fdfdb39a04 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:54 compute-0 sudo[80201]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:54 compute-0 sudo[80353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nffxfmedziodemixivnzppceabfjuzml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796174.4324903-646-129859320956154/AnsiballZ_file.py'
Nov 22 07:22:54 compute-0 sudo[80353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:54 compute-0 python3.9[80355]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:22:54 compute-0 sudo[80353]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:55 compute-0 sudo[80505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxvrkyofbslmqxnodkssbpfjqyrdmnbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796175.0872772-646-280890885040947/AnsiballZ_file.py'
Nov 22 07:22:55 compute-0 sudo[80505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:55 compute-0 python3.9[80507]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:22:55 compute-0 sudo[80505]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:56 compute-0 sudo[80657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqngdrjkuwrlqkckfqfjfqnmuhasabar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796175.7906137-691-263080741955538/AnsiballZ_stat.py'
Nov 22 07:22:56 compute-0 sudo[80657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:56 compute-0 python3.9[80659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:56 compute-0 sudo[80657]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:56 compute-0 sudo[80780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpylnvwtfujghmhegmcpufdchiqprbck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796175.7906137-691-263080741955538/AnsiballZ_copy.py'
Nov 22 07:22:56 compute-0 sudo[80780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:56 compute-0 python3.9[80782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796175.7906137-691-263080741955538/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=b386ae136714f709366a053290bc77a1cf5e74ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:56 compute-0 sudo[80780]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:57 compute-0 sudo[80932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noxhhztwjrpwznhefiuhqgptjtiwomjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796177.0142107-691-153908986430595/AnsiballZ_stat.py'
Nov 22 07:22:57 compute-0 sudo[80932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:57 compute-0 python3.9[80934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:57 compute-0 sudo[80932]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:57 compute-0 sudo[81055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkkznzyrljjijsleefglcefvjnszfbmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796177.0142107-691-153908986430595/AnsiballZ_copy.py'
Nov 22 07:22:57 compute-0 sudo[81055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:58 compute-0 python3.9[81057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796177.0142107-691-153908986430595/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=ab92e79a33a6e2fca5144cd0532be918fe14e6b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:58 compute-0 sudo[81055]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:58 compute-0 sudo[81207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jblanriptuboqdkoqffdahdioyvrjvzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796178.1536102-691-120613314053007/AnsiballZ_stat.py'
Nov 22 07:22:58 compute-0 sudo[81207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:58 compute-0 python3.9[81209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:22:58 compute-0 sudo[81207]: pam_unix(sudo:session): session closed for user root
Nov 22 07:22:58 compute-0 sudo[81330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajylmvdigcgknevimgfdcqhlynlsensv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796178.1536102-691-120613314053007/AnsiballZ_copy.py'
Nov 22 07:22:58 compute-0 sudo[81330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:22:59 compute-0 python3.9[81332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796178.1536102-691-120613314053007/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=b28f1c9576d6bd3f9e551bf4469fc4dbbb536af9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:22:59 compute-0 sudo[81330]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:00 compute-0 sudo[81482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axsugyfusfhsyrmejzuuwivmrjenrfge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796180.1187143-862-162760102740323/AnsiballZ_file.py'
Nov 22 07:23:00 compute-0 sudo[81482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:00 compute-0 python3.9[81484]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:00 compute-0 sudo[81482]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:00 compute-0 chronyd[64376]: Selected source 216.128.178.20 (pool.ntp.org)
Nov 22 07:23:01 compute-0 sudo[81634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubvmcdlacdqupecplqmmctcruxlhcmsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796180.8062031-899-135052765786569/AnsiballZ_stat.py'
Nov 22 07:23:01 compute-0 sudo[81634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:01 compute-0 python3.9[81636]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:01 compute-0 sudo[81634]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:01 compute-0 sudo[81757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okvimskiyhlwipgrmdbkedisrfnvayww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796180.8062031-899-135052765786569/AnsiballZ_copy.py'
Nov 22 07:23:01 compute-0 sudo[81757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:01 compute-0 python3.9[81759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796180.8062031-899-135052765786569/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:01 compute-0 sudo[81757]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:02 compute-0 sudo[81909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwjqtyhmkovxghkyevlqqegwdyvqqcxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796182.1258159-946-14940893854214/AnsiballZ_file.py'
Nov 22 07:23:02 compute-0 sudo[81909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:02 compute-0 python3.9[81911]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:02 compute-0 sudo[81909]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:03 compute-0 sudo[82061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymalfpphmwqjbsnlcybkkxzazhwbfvsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796182.8090038-972-32342013145094/AnsiballZ_stat.py'
Nov 22 07:23:03 compute-0 sudo[82061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:03 compute-0 python3.9[82063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:03 compute-0 sudo[82061]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:03 compute-0 sudo[82184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwkzbxcxkzqesmbkmealhspewgflijzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796182.8090038-972-32342013145094/AnsiballZ_copy.py'
Nov 22 07:23:03 compute-0 sudo[82184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:03 compute-0 python3.9[82186]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796182.8090038-972-32342013145094/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:03 compute-0 sudo[82184]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:04 compute-0 sudo[82336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yktjziruddhnzcwcvsvfuxcicjwsnxrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796184.0150795-1019-31965881614401/AnsiballZ_file.py'
Nov 22 07:23:04 compute-0 sudo[82336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:04 compute-0 python3.9[82338]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:04 compute-0 sudo[82336]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:04 compute-0 sudo[82488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfgmqwnrmchszqlvpfaeqyenbupphjtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796184.6580741-1041-71998928117344/AnsiballZ_stat.py'
Nov 22 07:23:04 compute-0 sudo[82488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:05 compute-0 python3.9[82490]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:05 compute-0 sudo[82488]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:05 compute-0 sudo[82611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otpamjdxqhrdopartibdwydoqbaexscl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796184.6580741-1041-71998928117344/AnsiballZ_copy.py'
Nov 22 07:23:05 compute-0 sudo[82611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:05 compute-0 python3.9[82613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796184.6580741-1041-71998928117344/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:05 compute-0 sudo[82611]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:06 compute-0 sudo[82763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzfvflxchvbfzqagblwrexzsfoxfshub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796185.9186795-1091-31855217191021/AnsiballZ_file.py'
Nov 22 07:23:06 compute-0 sudo[82763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:06 compute-0 python3.9[82765]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:06 compute-0 sudo[82763]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:06 compute-0 sudo[82915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cstyzjeoubxhsqgmrwwqffmzlohgxtnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796186.6094694-1115-101441550925027/AnsiballZ_stat.py'
Nov 22 07:23:06 compute-0 sudo[82915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:07 compute-0 python3.9[82917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:07 compute-0 sudo[82915]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:07 compute-0 sudo[83038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhstykswnyjbxtvswkidyrulptqkhkzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796186.6094694-1115-101441550925027/AnsiballZ_copy.py'
Nov 22 07:23:07 compute-0 sudo[83038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:07 compute-0 python3.9[83040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796186.6094694-1115-101441550925027/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:07 compute-0 sudo[83038]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:08 compute-0 sudo[83190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knejcmewgxkxdokvdosjtrkbvmjedxnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796188.0936837-1165-274147105674497/AnsiballZ_file.py'
Nov 22 07:23:08 compute-0 sudo[83190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:08 compute-0 python3.9[83192]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:08 compute-0 sudo[83190]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:09 compute-0 sudo[83342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfvrrlwgklyvenrbywlyhjbuhgjljdsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796188.7859838-1188-33256195503544/AnsiballZ_stat.py'
Nov 22 07:23:09 compute-0 sudo[83342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:09 compute-0 python3.9[83344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:09 compute-0 sudo[83342]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:09 compute-0 sudo[83465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmyoftmoqcihomiztpbbhspqvqdexojv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796188.7859838-1188-33256195503544/AnsiballZ_copy.py'
Nov 22 07:23:09 compute-0 sudo[83465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:09 compute-0 python3.9[83467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796188.7859838-1188-33256195503544/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:09 compute-0 sudo[83465]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:10 compute-0 sudo[83617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsyqcfpepuxtejjesnoorbvwvvhattnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796190.0708592-1239-196455681977141/AnsiballZ_file.py'
Nov 22 07:23:10 compute-0 sudo[83617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:10 compute-0 python3.9[83619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:10 compute-0 sudo[83617]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:11 compute-0 sudo[83769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrvapttpgxepekszymmvtvvwhgpeiibm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796190.7504115-1262-103603966203418/AnsiballZ_stat.py'
Nov 22 07:23:11 compute-0 sudo[83769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:11 compute-0 python3.9[83771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:11 compute-0 sudo[83769]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:11 compute-0 sudo[83892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kekqslmadrkbisctnfidaaladrrtcozp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796190.7504115-1262-103603966203418/AnsiballZ_copy.py'
Nov 22 07:23:11 compute-0 sudo[83892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:11 compute-0 python3.9[83894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796190.7504115-1262-103603966203418/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:11 compute-0 sudo[83892]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:12 compute-0 sudo[84044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xansruzvrjeydenbjzmydmjbtyntsvjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796191.965516-1309-112656625280639/AnsiballZ_file.py'
Nov 22 07:23:12 compute-0 sudo[84044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:12 compute-0 python3.9[84046]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:12 compute-0 sudo[84044]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:12 compute-0 sudo[84196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isvwqnmikcgixxsczzotvnlcfzgqnmnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796192.5537655-1331-70289195939992/AnsiballZ_stat.py'
Nov 22 07:23:12 compute-0 sudo[84196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:13 compute-0 python3.9[84198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:13 compute-0 sudo[84196]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:13 compute-0 sudo[84319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mebedzaimgevywxofxptvafeehedpwqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796192.5537655-1331-70289195939992/AnsiballZ_copy.py'
Nov 22 07:23:13 compute-0 sudo[84319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:13 compute-0 python3.9[84321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796192.5537655-1331-70289195939992/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=ba56832c35f23d00035eec09df0d02bc867796ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:13 compute-0 sudo[84319]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:14 compute-0 sshd-session[76659]: Connection closed by 192.168.122.30 port 60106
Nov 22 07:23:14 compute-0 sshd-session[76656]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:23:14 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Nov 22 07:23:14 compute-0 systemd[1]: session-18.scope: Consumed 28.343s CPU time.
Nov 22 07:23:14 compute-0 systemd-logind[821]: Session 18 logged out. Waiting for processes to exit.
Nov 22 07:23:14 compute-0 systemd-logind[821]: Removed session 18.
Nov 22 07:23:20 compute-0 sshd-session[84346]: Accepted publickey for zuul from 192.168.122.30 port 55776 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:23:20 compute-0 systemd-logind[821]: New session 19 of user zuul.
Nov 22 07:23:20 compute-0 systemd[1]: Started Session 19 of User zuul.
Nov 22 07:23:20 compute-0 sshd-session[84346]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:23:21 compute-0 python3.9[84499]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:23:22 compute-0 sudo[84653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvyxlabtfxooohcwcummenyexyxtczgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796202.0146408-67-105852672861139/AnsiballZ_file.py'
Nov 22 07:23:22 compute-0 sudo[84653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:22 compute-0 python3.9[84655]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:22 compute-0 sudo[84653]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:23 compute-0 sudo[84805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edsioesswpixeekeewfgwtceymhpzvia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796202.7757525-67-213872473705108/AnsiballZ_file.py'
Nov 22 07:23:23 compute-0 sudo[84805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:23 compute-0 python3.9[84807]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:23 compute-0 sudo[84805]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:24 compute-0 python3.9[84957]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:23:24 compute-0 sudo[85107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zatvwsuyrxippopruslxftyxqotybthr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796204.2551987-136-216931099601557/AnsiballZ_seboolean.py'
Nov 22 07:23:24 compute-0 sudo[85107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:24 compute-0 python3.9[85109]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 22 07:23:26 compute-0 sudo[85107]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:26 compute-0 sudo[85263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waczxzwayouxyrxdujdlccnaowgsruzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796206.6628377-166-267671499383383/AnsiballZ_setup.py'
Nov 22 07:23:26 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 22 07:23:26 compute-0 sudo[85263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:27 compute-0 python3.9[85265]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:23:27 compute-0 sudo[85263]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:27 compute-0 sudo[85347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yztdfannmkyvpyrbkrvkevfoefnmikvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796206.6628377-166-267671499383383/AnsiballZ_dnf.py'
Nov 22 07:23:27 compute-0 sudo[85347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:28 compute-0 python3.9[85349]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:23:29 compute-0 sudo[85347]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:30 compute-0 sudo[85500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oshbehzdsrqqvqokawjnqczspsmznzgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796209.6746492-202-70477910666591/AnsiballZ_systemd.py'
Nov 22 07:23:30 compute-0 sudo[85500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:30 compute-0 python3.9[85502]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 07:23:30 compute-0 sudo[85500]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:31 compute-0 sudo[85655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naebuomythzkebksaqpmgbavsmecygya ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796210.8832788-226-145959627914022/AnsiballZ_edpm_nftables_snippet.py'
Nov 22 07:23:31 compute-0 sudo[85655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:31 compute-0 python3[85657]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 22 07:23:31 compute-0 sudo[85655]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:32 compute-0 sudo[85807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmpemtgrlxstupbiflucxurnuqqhnete ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796211.881178-253-194016413950041/AnsiballZ_file.py'
Nov 22 07:23:32 compute-0 sudo[85807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:32 compute-0 python3.9[85809]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:32 compute-0 sudo[85807]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:32 compute-0 sudo[85959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiuvadwydpzkxyqfkolaxodbgdbkzjfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796212.5851407-277-8856344165173/AnsiballZ_stat.py'
Nov 22 07:23:32 compute-0 sudo[85959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:33 compute-0 python3.9[85961]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:33 compute-0 sudo[85959]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:33 compute-0 sudo[86037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pogbucriucrymkmpoecwhebrpvpdtryl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796212.5851407-277-8856344165173/AnsiballZ_file.py'
Nov 22 07:23:33 compute-0 sudo[86037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:33 compute-0 python3.9[86039]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:33 compute-0 sudo[86037]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:34 compute-0 sudo[86189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htrwzjocosecepgasdfiaxaexjfyvddj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796213.9235637-313-123043099445581/AnsiballZ_stat.py'
Nov 22 07:23:34 compute-0 sudo[86189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:34 compute-0 python3.9[86191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:34 compute-0 sudo[86189]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:34 compute-0 sudo[86267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qniipsmeiwynnbfwdmwcpdrsjdvsrlhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796213.9235637-313-123043099445581/AnsiballZ_file.py'
Nov 22 07:23:34 compute-0 sudo[86267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:34 compute-0 python3.9[86269]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7fue68ws recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:34 compute-0 sudo[86267]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:35 compute-0 sudo[86419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhtbuaxgppnbypxtoolbszwrquitadfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796215.0491757-349-28354497219355/AnsiballZ_stat.py'
Nov 22 07:23:35 compute-0 sudo[86419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:35 compute-0 python3.9[86421]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:35 compute-0 sudo[86419]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:35 compute-0 sudo[86497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhcahyhetxiqvfoqltepptpkztiswmgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796215.0491757-349-28354497219355/AnsiballZ_file.py'
Nov 22 07:23:35 compute-0 sudo[86497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:35 compute-0 python3.9[86499]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:35 compute-0 sudo[86497]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:36 compute-0 sudo[86649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyeytdvnorfrysqszrkfwycoaaakriqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796216.2138627-388-80687818438436/AnsiballZ_command.py'
Nov 22 07:23:36 compute-0 sudo[86649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:36 compute-0 python3.9[86651]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:23:36 compute-0 sudo[86649]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:37 compute-0 sudo[86802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znlwqoetnkhylqejngludxyxmkkskoiz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796217.0720758-412-17004925819699/AnsiballZ_edpm_nftables_from_files.py'
Nov 22 07:23:37 compute-0 sudo[86802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:37 compute-0 python3[86804]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 07:23:37 compute-0 sudo[86802]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:38 compute-0 sudo[86954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qounrhnljjilcdztsigwkiujlemmcubb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796217.9370563-436-154298932792663/AnsiballZ_stat.py'
Nov 22 07:23:38 compute-0 sudo[86954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:38 compute-0 python3.9[86956]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:38 compute-0 sudo[86954]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:38 compute-0 sudo[87079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrqtltvakgoojsqpwmsqrpwpznrvyos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796217.9370563-436-154298932792663/AnsiballZ_copy.py'
Nov 22 07:23:38 compute-0 sudo[87079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:39 compute-0 python3.9[87081]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796217.9370563-436-154298932792663/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:39 compute-0 sudo[87079]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:39 compute-0 sudo[87231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htcwicooouprmitgxnhbsszodlefdows ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796219.358684-481-272728868732130/AnsiballZ_stat.py'
Nov 22 07:23:39 compute-0 sudo[87231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:39 compute-0 python3.9[87233]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:39 compute-0 sudo[87231]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:40 compute-0 sudo[87356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gglfqqotyajyclwbjebtknoscbnbcqhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796219.358684-481-272728868732130/AnsiballZ_copy.py'
Nov 22 07:23:40 compute-0 sudo[87356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:40 compute-0 python3.9[87358]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796219.358684-481-272728868732130/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:40 compute-0 sudo[87356]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:40 compute-0 sudo[87508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upsbvszabffhjroxszwdrtgdrzuavtjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796220.6388085-526-150538107161856/AnsiballZ_stat.py'
Nov 22 07:23:40 compute-0 sudo[87508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:41 compute-0 python3.9[87510]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:41 compute-0 sudo[87508]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:41 compute-0 sudo[87633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooenjvmhspkkihsevseixlbrvxbbidgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796220.6388085-526-150538107161856/AnsiballZ_copy.py'
Nov 22 07:23:41 compute-0 sudo[87633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:41 compute-0 python3.9[87635]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796220.6388085-526-150538107161856/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:41 compute-0 sudo[87633]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:42 compute-0 sudo[87785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcmdifaqahkndzbavfckjqiqevfieyea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796222.091389-571-10802988890634/AnsiballZ_stat.py'
Nov 22 07:23:42 compute-0 sudo[87785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:42 compute-0 python3.9[87787]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:42 compute-0 sudo[87785]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:42 compute-0 sudo[87910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyyctdqbrnciltrghcakimulqmfcgddj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796222.091389-571-10802988890634/AnsiballZ_copy.py'
Nov 22 07:23:42 compute-0 sudo[87910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:43 compute-0 python3.9[87912]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796222.091389-571-10802988890634/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:43 compute-0 sudo[87910]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:43 compute-0 sudo[88062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbpkisfcixzmbpnecxpaehkankecnxvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796223.4308248-616-62886722912911/AnsiballZ_stat.py'
Nov 22 07:23:43 compute-0 sudo[88062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:43 compute-0 python3.9[88064]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:44 compute-0 sudo[88062]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:44 compute-0 sudo[88187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pucqzxouyhbvtfppspxriasjtzhdsrkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796223.4308248-616-62886722912911/AnsiballZ_copy.py'
Nov 22 07:23:44 compute-0 sudo[88187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:44 compute-0 python3.9[88189]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796223.4308248-616-62886722912911/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:44 compute-0 sudo[88187]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:45 compute-0 sudo[88339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-napyygfuvfgvwbowtmrmldkbmqcotigt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796224.7884533-661-61000033773817/AnsiballZ_file.py'
Nov 22 07:23:45 compute-0 sudo[88339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:45 compute-0 python3.9[88341]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:45 compute-0 sudo[88339]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:45 compute-0 sudo[88491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqqutofjdcovjgtebgcjuwivdwzxvkiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796225.478185-685-267280447044159/AnsiballZ_command.py'
Nov 22 07:23:45 compute-0 sudo[88491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:45 compute-0 python3.9[88493]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:23:45 compute-0 sudo[88491]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:46 compute-0 sudo[88646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uthzgxybztvxwpwwvfhcvjamdgqlqyhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796226.26072-709-56947625241793/AnsiballZ_blockinfile.py'
Nov 22 07:23:46 compute-0 sudo[88646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:46 compute-0 python3.9[88648]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:46 compute-0 sudo[88646]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:47 compute-0 sudo[88798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frlytbzwzxipudlrwifjtpiofgkzsyty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796227.1485443-736-131232547008435/AnsiballZ_command.py'
Nov 22 07:23:47 compute-0 sudo[88798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:47 compute-0 python3.9[88800]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:23:47 compute-0 sudo[88798]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:48 compute-0 sudo[88951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eusodqiiqrciojbxpfrkxpgqswmhwakx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796227.8601053-760-247181927666529/AnsiballZ_stat.py'
Nov 22 07:23:48 compute-0 sudo[88951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:48 compute-0 python3.9[88953]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:23:48 compute-0 sudo[88951]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:48 compute-0 sudo[89105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bribtzfffwlilelcfithzourwwdqyncf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796228.565481-784-239333006342716/AnsiballZ_command.py'
Nov 22 07:23:48 compute-0 sudo[89105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:49 compute-0 python3.9[89107]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:23:49 compute-0 sudo[89105]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:49 compute-0 sudo[89260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpducpqouetxbnvpatxfuhgtyfuvdsgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796229.3639982-808-204875005764215/AnsiballZ_file.py'
Nov 22 07:23:49 compute-0 sudo[89260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:49 compute-0 python3.9[89262]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:49 compute-0 sudo[89260]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:51 compute-0 python3.9[89412]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:23:52 compute-0 sudo[89563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asqrqhkpizqthcucswgjlxrhohiofcjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796232.3009055-928-171180309931649/AnsiballZ_command.py'
Nov 22 07:23:52 compute-0 sudo[89563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:52 compute-0 python3.9[89565]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:23:52 compute-0 ovs-vsctl[89566]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 22 07:23:52 compute-0 sudo[89563]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:53 compute-0 sudo[89716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evotldjhoaqojajgttvbgcqwqhigfjyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796233.0659502-955-54449045458392/AnsiballZ_command.py'
Nov 22 07:23:53 compute-0 sudo[89716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:53 compute-0 python3.9[89718]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:23:53 compute-0 sudo[89716]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:54 compute-0 sudo[89871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-welzhqdjbkruotcfaadvzrwkxgcnujix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796233.7941837-979-131185658200039/AnsiballZ_command.py'
Nov 22 07:23:54 compute-0 sudo[89871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:54 compute-0 python3.9[89873]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:23:54 compute-0 ovs-vsctl[89874]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 22 07:23:54 compute-0 sudo[89871]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:54 compute-0 python3.9[90024]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:23:55 compute-0 sudo[90176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzynehyxhfeabfhzzywpqrccchtprfjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796235.337484-1030-9121701139015/AnsiballZ_file.py'
Nov 22 07:23:55 compute-0 sudo[90176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:55 compute-0 python3.9[90178]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:55 compute-0 sudo[90176]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:56 compute-0 sudo[90328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juuouqjhyadgoownuqnpqyfusqorldgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796236.086105-1054-176490589109778/AnsiballZ_stat.py'
Nov 22 07:23:56 compute-0 sudo[90328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:56 compute-0 python3.9[90330]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:56 compute-0 sudo[90328]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:56 compute-0 sudo[90406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsbiuxskcqoqtwjtbpuovupuxghaewwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796236.086105-1054-176490589109778/AnsiballZ_file.py'
Nov 22 07:23:56 compute-0 sudo[90406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:57 compute-0 python3.9[90408]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:57 compute-0 sudo[90406]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:57 compute-0 sudo[90558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzrpfetfntwgeojzjpyehgxvjbaqqhzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796237.3480937-1054-62530844970250/AnsiballZ_stat.py'
Nov 22 07:23:57 compute-0 sudo[90558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:57 compute-0 python3.9[90560]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:57 compute-0 sudo[90558]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:58 compute-0 sudo[90636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exqprwsjwszsevwwgjkqbssxacxjelcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796237.3480937-1054-62530844970250/AnsiballZ_file.py'
Nov 22 07:23:58 compute-0 sudo[90636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:58 compute-0 python3.9[90638]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:23:58 compute-0 sudo[90636]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:58 compute-0 sudo[90788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuwecmrelcsfulxgkriaqqryolooslxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796238.4699998-1123-113491887151774/AnsiballZ_file.py'
Nov 22 07:23:58 compute-0 sudo[90788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:59 compute-0 python3.9[90790]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:23:59 compute-0 sudo[90788]: pam_unix(sudo:session): session closed for user root
Nov 22 07:23:59 compute-0 sudo[90940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uctduyviirxpkjxamccuvgytrnkyvlcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796239.247679-1147-249537206320357/AnsiballZ_stat.py'
Nov 22 07:23:59 compute-0 sudo[90940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:23:59 compute-0 python3.9[90942]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:23:59 compute-0 sudo[90940]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:00 compute-0 sudo[91018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niuvwdkqzvpltihqcthbfuayjxvgjtkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796239.247679-1147-249537206320357/AnsiballZ_file.py'
Nov 22 07:24:00 compute-0 sudo[91018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:00 compute-0 python3.9[91020]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:24:00 compute-0 sudo[91018]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:00 compute-0 sudo[91170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkmbkqprnexzgyjqtvdhzktujfqtywbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796240.449058-1183-54521878392317/AnsiballZ_stat.py'
Nov 22 07:24:00 compute-0 sudo[91170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:00 compute-0 python3.9[91172]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:00 compute-0 sudo[91170]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:01 compute-0 sudo[91248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcrhpoobwjgilegropddbbgasvxxdkxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796240.449058-1183-54521878392317/AnsiballZ_file.py'
Nov 22 07:24:01 compute-0 sudo[91248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:01 compute-0 python3.9[91250]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:24:01 compute-0 sudo[91248]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:02 compute-0 sudo[91400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnceaxxkprxifzsphbdkhswaerxrzonb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796241.6636653-1219-22618528126069/AnsiballZ_systemd.py'
Nov 22 07:24:02 compute-0 sudo[91400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:02 compute-0 python3.9[91402]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:24:02 compute-0 systemd[1]: Reloading.
Nov 22 07:24:02 compute-0 systemd-rc-local-generator[91427]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:24:02 compute-0 systemd-sysv-generator[91432]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:24:02 compute-0 sudo[91400]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:03 compute-0 sudo[91588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxnguennkazxoczgjlkxauexvwinabyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796242.9713235-1243-190153821058551/AnsiballZ_stat.py'
Nov 22 07:24:03 compute-0 sudo[91588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:03 compute-0 python3.9[91590]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:03 compute-0 sudo[91588]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:03 compute-0 sudo[91666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueathdbsiazgtxcyfvpghdxjlwrumhub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796242.9713235-1243-190153821058551/AnsiballZ_file.py'
Nov 22 07:24:03 compute-0 sudo[91666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:03 compute-0 python3.9[91668]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:24:03 compute-0 sudo[91666]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:04 compute-0 sudo[91818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrwolpkhkdxsnvvzybxlsgfrmpcpjagb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796244.1121333-1279-5495968190485/AnsiballZ_stat.py'
Nov 22 07:24:04 compute-0 sudo[91818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:04 compute-0 python3.9[91820]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:04 compute-0 sudo[91818]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:04 compute-0 sudo[91896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cignkclfmodpukqhhwhhpktcrbaypjlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796244.1121333-1279-5495968190485/AnsiballZ_file.py'
Nov 22 07:24:04 compute-0 sudo[91896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:05 compute-0 python3.9[91898]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:24:05 compute-0 sudo[91896]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:05 compute-0 sudo[92048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcufbxqfxmogylcjscykzqskraawkglx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796245.314726-1315-254777295535073/AnsiballZ_systemd.py'
Nov 22 07:24:05 compute-0 sudo[92048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:05 compute-0 python3.9[92050]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:24:06 compute-0 systemd[1]: Reloading.
Nov 22 07:24:06 compute-0 systemd-rc-local-generator[92079]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:24:06 compute-0 systemd-sysv-generator[92083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:24:06 compute-0 systemd[1]: Starting Create netns directory...
Nov 22 07:24:06 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 07:24:06 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 07:24:06 compute-0 systemd[1]: Finished Create netns directory.
Nov 22 07:24:06 compute-0 sudo[92048]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:07 compute-0 sudo[92242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzecmjoxilhmqpqacwvvnzsxnhtchvmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796246.6706634-1345-253660997933773/AnsiballZ_file.py'
Nov 22 07:24:07 compute-0 sudo[92242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:07 compute-0 python3.9[92244]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:07 compute-0 sudo[92242]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:07 compute-0 sudo[92394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiqqyzhnhtruqrybgsimsjvkdifyqtdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796247.5355566-1369-250282054341136/AnsiballZ_stat.py'
Nov 22 07:24:07 compute-0 sudo[92394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:07 compute-0 python3.9[92396]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:08 compute-0 sudo[92394]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:08 compute-0 sudo[92517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxtwaefrzhqktoqolqahmeefdbowqqhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796247.5355566-1369-250282054341136/AnsiballZ_copy.py'
Nov 22 07:24:08 compute-0 sudo[92517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:08 compute-0 python3.9[92519]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796247.5355566-1369-250282054341136/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:08 compute-0 sudo[92517]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:09 compute-0 sudo[92669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqxiqgdxtjzoansclytudqxptsifhvlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796249.474206-1420-264181541367642/AnsiballZ_file.py'
Nov 22 07:24:09 compute-0 sudo[92669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:10 compute-0 python3.9[92671]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:10 compute-0 sudo[92669]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:10 compute-0 sudo[92821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgypbbxqjumdksrmaxzaksnaatgjdrke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796250.3886192-1444-49941616596629/AnsiballZ_stat.py'
Nov 22 07:24:10 compute-0 sudo[92821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:10 compute-0 python3.9[92823]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:10 compute-0 sudo[92821]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:11 compute-0 sudo[92944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zluhttleihkuyjvxiyvcpvstjngaxrnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796250.3886192-1444-49941616596629/AnsiballZ_copy.py'
Nov 22 07:24:11 compute-0 sudo[92944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:11 compute-0 python3.9[92946]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796250.3886192-1444-49941616596629/.source.json _original_basename=.8dc517js follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:24:11 compute-0 sudo[92944]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:11 compute-0 sudo[93096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vylztnbtifuhxqvdeqtocimwznjzspuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796251.6908522-1489-205711286374127/AnsiballZ_file.py'
Nov 22 07:24:11 compute-0 sudo[93096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:12 compute-0 python3.9[93098]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:24:12 compute-0 sudo[93096]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:12 compute-0 sudo[93248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdojnlcythsdseunecltvvnrdolybcli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796252.6294677-1513-148847002507219/AnsiballZ_stat.py'
Nov 22 07:24:12 compute-0 sudo[93248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:13 compute-0 sudo[93248]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:13 compute-0 sudo[93371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfiwejosloxfztwuysfmystxxxzrkkeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796252.6294677-1513-148847002507219/AnsiballZ_copy.py'
Nov 22 07:24:13 compute-0 sudo[93371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:13 compute-0 sudo[93371]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:14 compute-0 sudo[93523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcgnkiaqtywtyepftaehopodbmaibbat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796254.212555-1564-113330173390070/AnsiballZ_container_config_data.py'
Nov 22 07:24:14 compute-0 sudo[93523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:14 compute-0 python3.9[93525]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 22 07:24:14 compute-0 sudo[93523]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:15 compute-0 sudo[93675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeeoaiddcnxtqnuokihafecxfyrjqnef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796255.1583767-1591-70954426435869/AnsiballZ_container_config_hash.py'
Nov 22 07:24:15 compute-0 sudo[93675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:15 compute-0 python3.9[93677]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:24:15 compute-0 sudo[93675]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:16 compute-0 sudo[93827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkhfkypymaaxuuyjucojqqwgggsquifg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796256.1432512-1618-156893592387124/AnsiballZ_podman_container_info.py'
Nov 22 07:24:16 compute-0 sudo[93827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:16 compute-0 python3.9[93829]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 07:24:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:24:16 compute-0 sudo[93827]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:18 compute-0 sudo[93990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypfeegqqhjjrdocqoqkjipptqzqnsdui ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796257.6734636-1657-160428168093366/AnsiballZ_edpm_container_manage.py'
Nov 22 07:24:18 compute-0 sudo[93990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:18 compute-0 python3[93992]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:24:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:24:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:24:18 compute-0 podman[94029]: 2025-11-22 07:24:18.70175819 +0000 UTC m=+0.061356185 container create ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:24:18 compute-0 podman[94029]: 2025-11-22 07:24:18.670655724 +0000 UTC m=+0.030253739 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 07:24:18 compute-0 python3[93992]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 22 07:24:18 compute-0 sudo[93990]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:19 compute-0 sudo[94216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucbzschtrxdqqcrabtdoimwbettgmxct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796259.0021608-1681-166514308358400/AnsiballZ_stat.py'
Nov 22 07:24:19 compute-0 sudo[94216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:19 compute-0 python3.9[94218]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:24:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 22 07:24:19 compute-0 sudo[94216]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:20 compute-0 sudo[94370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyweiawergbsgpnjitiwafqimrqmvazy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796259.8298798-1708-66339146319502/AnsiballZ_file.py'
Nov 22 07:24:20 compute-0 sudo[94370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:20 compute-0 python3.9[94372]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:24:20 compute-0 sudo[94370]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:20 compute-0 sudo[94446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzgihwklewmrxscvpwmzqoscpxafyfku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796259.8298798-1708-66339146319502/AnsiballZ_stat.py'
Nov 22 07:24:20 compute-0 sudo[94446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:20 compute-0 python3.9[94448]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:24:20 compute-0 sudo[94446]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:21 compute-0 sudo[94597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sobwtlocditckuiujabtdllocytwmeau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796260.8142908-1708-143516351408716/AnsiballZ_copy.py'
Nov 22 07:24:21 compute-0 sudo[94597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:21 compute-0 python3.9[94599]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796260.8142908-1708-143516351408716/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:24:21 compute-0 sudo[94597]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:21 compute-0 sudo[94673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xawoezokrhesocbrkkxcdkgkcdocwara ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796260.8142908-1708-143516351408716/AnsiballZ_systemd.py'
Nov 22 07:24:21 compute-0 sudo[94673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:22 compute-0 python3.9[94675]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:24:22 compute-0 systemd[1]: Reloading.
Nov 22 07:24:22 compute-0 systemd-sysv-generator[94705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:24:22 compute-0 systemd-rc-local-generator[94701]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:24:22 compute-0 sudo[94673]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:22 compute-0 sudo[94785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhmeoxqygadfktulejzmxqxtepwlbejm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796260.8142908-1708-143516351408716/AnsiballZ_systemd.py'
Nov 22 07:24:22 compute-0 sudo[94785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:23 compute-0 python3.9[94787]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:24:23 compute-0 systemd[1]: Reloading.
Nov 22 07:24:23 compute-0 systemd-sysv-generator[94818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:24:23 compute-0 systemd-rc-local-generator[94814]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:24:23 compute-0 systemd[1]: Starting ovn_controller container...
Nov 22 07:24:23 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 22 07:24:23 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:24:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ef240e67ebd23432f2014a9cd5b254d3566436c947e88db60322c3cf24cec9d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 07:24:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f.
Nov 22 07:24:23 compute-0 podman[94828]: 2025-11-22 07:24:23.602684364 +0000 UTC m=+0.122987435 container init ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 22 07:24:23 compute-0 ovn_controller[94843]: + sudo -E kolla_set_configs
Nov 22 07:24:23 compute-0 podman[94828]: 2025-11-22 07:24:23.627720681 +0000 UTC m=+0.148023732 container start ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 07:24:23 compute-0 edpm-start-podman-container[94828]: ovn_controller
Nov 22 07:24:23 compute-0 systemd[1]: Created slice User Slice of UID 0.
Nov 22 07:24:23 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 22 07:24:23 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 22 07:24:23 compute-0 systemd[1]: Starting User Manager for UID 0...
Nov 22 07:24:23 compute-0 systemd[94882]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 22 07:24:23 compute-0 edpm-start-podman-container[94827]: Creating additional drop-in dependency for "ovn_controller" (ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f)
Nov 22 07:24:23 compute-0 podman[94850]: 2025-11-22 07:24:23.721369349 +0000 UTC m=+0.082816279 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 07:24:23 compute-0 systemd[1]: ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f-6dd3390be6fd404c.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 07:24:23 compute-0 systemd[1]: ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f-6dd3390be6fd404c.service: Failed with result 'exit-code'.
Nov 22 07:24:23 compute-0 systemd[1]: Reloading.
Nov 22 07:24:23 compute-0 systemd-sysv-generator[94932]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:24:23 compute-0 systemd-rc-local-generator[94926]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:24:23 compute-0 systemd[94882]: Queued start job for default target Main User Target.
Nov 22 07:24:23 compute-0 systemd[94882]: Created slice User Application Slice.
Nov 22 07:24:23 compute-0 systemd[94882]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 22 07:24:23 compute-0 systemd[94882]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 07:24:23 compute-0 systemd[94882]: Reached target Paths.
Nov 22 07:24:23 compute-0 systemd[94882]: Reached target Timers.
Nov 22 07:24:23 compute-0 systemd[94882]: Starting D-Bus User Message Bus Socket...
Nov 22 07:24:23 compute-0 systemd[94882]: Starting Create User's Volatile Files and Directories...
Nov 22 07:24:23 compute-0 systemd[94882]: Finished Create User's Volatile Files and Directories.
Nov 22 07:24:23 compute-0 systemd[94882]: Listening on D-Bus User Message Bus Socket.
Nov 22 07:24:23 compute-0 systemd[94882]: Reached target Sockets.
Nov 22 07:24:23 compute-0 systemd[94882]: Reached target Basic System.
Nov 22 07:24:23 compute-0 systemd[94882]: Reached target Main User Target.
Nov 22 07:24:23 compute-0 systemd[94882]: Startup finished in 152ms.
Nov 22 07:24:24 compute-0 systemd[1]: Started User Manager for UID 0.
Nov 22 07:24:24 compute-0 systemd[1]: Started ovn_controller container.
Nov 22 07:24:24 compute-0 systemd[1]: Started Session c1 of User root.
Nov 22 07:24:24 compute-0 sudo[94785]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:24 compute-0 ovn_controller[94843]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 07:24:24 compute-0 ovn_controller[94843]: INFO:__main__:Validating config file
Nov 22 07:24:24 compute-0 ovn_controller[94843]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 07:24:24 compute-0 ovn_controller[94843]: INFO:__main__:Writing out command to execute
Nov 22 07:24:24 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 22 07:24:24 compute-0 ovn_controller[94843]: ++ cat /run_command
Nov 22 07:24:24 compute-0 ovn_controller[94843]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 22 07:24:24 compute-0 ovn_controller[94843]: + ARGS=
Nov 22 07:24:24 compute-0 ovn_controller[94843]: + sudo kolla_copy_cacerts
Nov 22 07:24:24 compute-0 systemd[1]: Started Session c2 of User root.
Nov 22 07:24:24 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 22 07:24:24 compute-0 ovn_controller[94843]: + [[ ! -n '' ]]
Nov 22 07:24:24 compute-0 ovn_controller[94843]: + . kolla_extend_start
Nov 22 07:24:24 compute-0 ovn_controller[94843]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 22 07:24:24 compute-0 ovn_controller[94843]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 22 07:24:24 compute-0 ovn_controller[94843]: + umask 0022
Nov 22 07:24:24 compute-0 ovn_controller[94843]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 22 07:24:24 compute-0 NetworkManager[55036]: <info>  [1763796264.1726] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 22 07:24:24 compute-0 NetworkManager[55036]: <info>  [1763796264.1734] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:24:24 compute-0 NetworkManager[55036]: <info>  [1763796264.1743] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 22 07:24:24 compute-0 NetworkManager[55036]: <info>  [1763796264.1748] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 22 07:24:24 compute-0 NetworkManager[55036]: <info>  [1763796264.1751] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 07:24:24 compute-0 kernel: br-int: entered promiscuous mode
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 22 07:24:24 compute-0 ovn_controller[94843]: 2025-11-22T07:24:24Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 22 07:24:24 compute-0 systemd-udevd[94977]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:24:24 compute-0 sudo[95105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbbkbjkdvuoqtfywtetzrdsplhaencng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796264.341126-1792-71523097149913/AnsiballZ_command.py'
Nov 22 07:24:24 compute-0 sudo[95105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:24 compute-0 python3.9[95107]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:24:24 compute-0 ovs-vsctl[95108]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 22 07:24:24 compute-0 sudo[95105]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:25 compute-0 sudo[95258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waiwxvfhmckvdhieuzaujdphxwjnxxra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796265.1047442-1816-252981556625956/AnsiballZ_command.py'
Nov 22 07:24:25 compute-0 sudo[95258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:25 compute-0 python3.9[95260]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:24:25 compute-0 ovs-vsctl[95262]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 22 07:24:25 compute-0 sudo[95258]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:25 compute-0 ovn_controller[94843]: 2025-11-22T07:24:25Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 07:24:25 compute-0 ovn_controller[94843]: 2025-11-22T07:24:25Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 22 07:24:25 compute-0 ovn_controller[94843]: 2025-11-22T07:24:25Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 07:24:25 compute-0 ovn_controller[94843]: 2025-11-22T07:24:25Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 22 07:24:25 compute-0 ovn_controller[94843]: 2025-11-22T07:24:25Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 07:24:25 compute-0 ovn_controller[94843]: 2025-11-22T07:24:25Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 22 07:24:25 compute-0 NetworkManager[55036]: <info>  [1763796265.9522] manager: (ovn-4984e1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 22 07:24:25 compute-0 NetworkManager[55036]: <info>  [1763796265.9531] manager: (ovn-e686e2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Nov 22 07:24:25 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Nov 22 07:24:25 compute-0 NetworkManager[55036]: <info>  [1763796265.9699] device (genev_sys_6081): carrier: link connected
Nov 22 07:24:25 compute-0 systemd-udevd[94979]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:24:25 compute-0 NetworkManager[55036]: <info>  [1763796265.9704] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Nov 22 07:24:26 compute-0 sudo[95417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhogxtcpaegdpjufmblpsxdcgwdzzcds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796266.3500135-1858-226717077192185/AnsiballZ_command.py'
Nov 22 07:24:26 compute-0 sudo[95417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:26 compute-0 python3.9[95419]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:24:26 compute-0 ovs-vsctl[95420]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 22 07:24:26 compute-0 sudo[95417]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:26 compute-0 NetworkManager[55036]: <info>  [1763796266.8818] manager: (ovn-73ab13-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 22 07:24:27 compute-0 sshd-session[84349]: Connection closed by 192.168.122.30 port 55776
Nov 22 07:24:27 compute-0 sshd-session[84346]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:24:27 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Nov 22 07:24:27 compute-0 systemd[1]: session-19.scope: Consumed 43.361s CPU time.
Nov 22 07:24:27 compute-0 systemd-logind[821]: Session 19 logged out. Waiting for processes to exit.
Nov 22 07:24:27 compute-0 systemd-logind[821]: Removed session 19.
Nov 22 07:24:34 compute-0 systemd[1]: Stopping User Manager for UID 0...
Nov 22 07:24:34 compute-0 systemd[94882]: Activating special unit Exit the Session...
Nov 22 07:24:34 compute-0 systemd[94882]: Stopped target Main User Target.
Nov 22 07:24:34 compute-0 systemd[94882]: Stopped target Basic System.
Nov 22 07:24:34 compute-0 systemd[94882]: Stopped target Paths.
Nov 22 07:24:34 compute-0 systemd[94882]: Stopped target Sockets.
Nov 22 07:24:34 compute-0 systemd[94882]: Stopped target Timers.
Nov 22 07:24:34 compute-0 systemd[94882]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 07:24:34 compute-0 systemd[94882]: Closed D-Bus User Message Bus Socket.
Nov 22 07:24:34 compute-0 systemd[94882]: Stopped Create User's Volatile Files and Directories.
Nov 22 07:24:34 compute-0 systemd[94882]: Removed slice User Application Slice.
Nov 22 07:24:34 compute-0 systemd[94882]: Reached target Shutdown.
Nov 22 07:24:34 compute-0 systemd[94882]: Finished Exit the Session.
Nov 22 07:24:34 compute-0 systemd[94882]: Reached target Exit the Session.
Nov 22 07:24:34 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Nov 22 07:24:34 compute-0 systemd[1]: Stopped User Manager for UID 0.
Nov 22 07:24:34 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 22 07:24:34 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 22 07:24:34 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 22 07:24:34 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 22 07:24:34 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Nov 22 07:24:35 compute-0 sshd-session[95448]: Accepted publickey for zuul from 192.168.122.30 port 38702 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:24:35 compute-0 systemd-logind[821]: New session 21 of user zuul.
Nov 22 07:24:35 compute-0 systemd[1]: Started Session 21 of User zuul.
Nov 22 07:24:35 compute-0 sshd-session[95448]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:24:37 compute-0 python3.9[95601]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:24:38 compute-0 sudo[95755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpitkbneqspddgwnkxnenzvmucegnncw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796278.1198838-67-163928051970960/AnsiballZ_file.py'
Nov 22 07:24:38 compute-0 sudo[95755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:38 compute-0 python3.9[95757]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:39 compute-0 sudo[95755]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:39 compute-0 sudo[95907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxupfznmrgzvtoxxzemiyamarhxmmfxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796279.1745331-67-124892952771736/AnsiballZ_file.py'
Nov 22 07:24:39 compute-0 sudo[95907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:39 compute-0 python3.9[95909]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:39 compute-0 sudo[95907]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:40 compute-0 sudo[96059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmabbejpniejrkpkxwahpyabzurdiwmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796279.898076-67-120776764200628/AnsiballZ_file.py'
Nov 22 07:24:40 compute-0 sudo[96059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:40 compute-0 python3.9[96061]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:40 compute-0 sudo[96059]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:40 compute-0 sudo[96211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxszrpmzvnnmohsqjawwsmgdoyprazov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796280.6153243-67-68632172703946/AnsiballZ_file.py'
Nov 22 07:24:40 compute-0 sudo[96211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:41 compute-0 python3.9[96213]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:41 compute-0 sudo[96211]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:41 compute-0 sudo[96363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzfivmbkegcajnnzeffzlfganovyxjoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796281.2817364-67-207526789458301/AnsiballZ_file.py'
Nov 22 07:24:41 compute-0 sudo[96363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:41 compute-0 python3.9[96365]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:41 compute-0 sudo[96363]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:42 compute-0 python3.9[96516]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:24:43 compute-0 sudo[96666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtqprlchjejhliktapcxaujcntehmmsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796282.7678652-199-166279687880974/AnsiballZ_seboolean.py'
Nov 22 07:24:43 compute-0 sudo[96666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:43 compute-0 python3.9[96668]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 22 07:24:44 compute-0 sudo[96666]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:45 compute-0 python3.9[96818]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:45 compute-0 python3.9[96939]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796284.3784568-223-131196872584185/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:46 compute-0 python3.9[97089]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:47 compute-0 python3.9[97210]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796286.1463988-268-2731889546744/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:48 compute-0 sudo[97360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqymcefqanbvyucsscqvqsyngpcwbizs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796287.9148748-319-157742238349669/AnsiballZ_setup.py'
Nov 22 07:24:48 compute-0 sudo[97360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:48 compute-0 python3.9[97362]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:24:48 compute-0 sudo[97360]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:49 compute-0 sudo[97444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgocnvogyijevangriqxpqwxstewannq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796287.9148748-319-157742238349669/AnsiballZ_dnf.py'
Nov 22 07:24:49 compute-0 sudo[97444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:49 compute-0 python3.9[97446]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:24:50 compute-0 sudo[97444]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:51 compute-0 sudo[97597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmvuvslucpfxjueszrfvbxrhrszgzggp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796291.0349765-355-64791367715703/AnsiballZ_systemd.py'
Nov 22 07:24:51 compute-0 sudo[97597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:51 compute-0 python3.9[97599]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 07:24:52 compute-0 sudo[97597]: pam_unix(sudo:session): session closed for user root
Nov 22 07:24:52 compute-0 python3.9[97752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:53 compute-0 python3.9[97873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796292.2465274-379-220540712443826/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:53 compute-0 ovn_controller[94843]: 2025-11-22T07:24:53Z|00025|memory|INFO|16384 kB peak resident set size after 29.8 seconds
Nov 22 07:24:53 compute-0 ovn_controller[94843]: 2025-11-22T07:24:53Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 22 07:24:53 compute-0 podman[97997]: 2025-11-22 07:24:53.999163004 +0000 UTC m=+0.113841823 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:24:54 compute-0 python3.9[98036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:54 compute-0 python3.9[98170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796293.6393235-379-229198933288066/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:56 compute-0 python3.9[98320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:57 compute-0 python3.9[98441]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796296.3486314-511-56990391208789/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:57 compute-0 python3.9[98591]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:24:58 compute-0 python3.9[98712]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796297.4244852-511-237181741988783/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:59 compute-0 python3.9[98862]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:24:59 compute-0 sudo[99014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krfosnybpbkkwjyeafrjmpekmuddzuie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796299.4119065-625-56180257684341/AnsiballZ_file.py'
Nov 22 07:24:59 compute-0 sudo[99014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:24:59 compute-0 python3.9[99016]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:24:59 compute-0 sudo[99014]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:00 compute-0 sudo[99166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxlwdcoknbhfipbxgeshfocyyxxhepya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796300.0903103-649-53090713533793/AnsiballZ_stat.py'
Nov 22 07:25:00 compute-0 sudo[99166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:00 compute-0 python3.9[99168]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:25:00 compute-0 sudo[99166]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:00 compute-0 sudo[99244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csgtunknqbjqhoeikiiufsgnfsetdskr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796300.0903103-649-53090713533793/AnsiballZ_file.py'
Nov 22 07:25:00 compute-0 sudo[99244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:01 compute-0 python3.9[99246]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:25:01 compute-0 sudo[99244]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:01 compute-0 sudo[99396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxmdlacwgvrgilgsjpidjtdmfwewjknv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796301.1760828-649-21425329855246/AnsiballZ_stat.py'
Nov 22 07:25:01 compute-0 sudo[99396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:01 compute-0 python3.9[99398]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:25:01 compute-0 sudo[99396]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:01 compute-0 sudo[99474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abpbnwqoppezkebauvaqksdxxhjrizkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796301.1760828-649-21425329855246/AnsiballZ_file.py'
Nov 22 07:25:01 compute-0 sudo[99474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:02 compute-0 python3.9[99476]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:25:02 compute-0 sudo[99474]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:02 compute-0 sudo[99626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blhjsfrhjqhyblujfzdxvpvilexuwrdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796302.3245227-718-201965942313267/AnsiballZ_file.py'
Nov 22 07:25:02 compute-0 sudo[99626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:02 compute-0 python3.9[99628]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:02 compute-0 sudo[99626]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:03 compute-0 sudo[99778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jysxyhkhljvpnmivjjwyvfoicpseoyda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796303.0427911-742-47478936197184/AnsiballZ_stat.py'
Nov 22 07:25:03 compute-0 sudo[99778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:03 compute-0 python3.9[99780]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:25:03 compute-0 sudo[99778]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:03 compute-0 sudo[99856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiqbvhfdgvyjphenerurbbfqkdyhykbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796303.0427911-742-47478936197184/AnsiballZ_file.py'
Nov 22 07:25:03 compute-0 sudo[99856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:03 compute-0 python3.9[99858]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:03 compute-0 sudo[99856]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:04 compute-0 sudo[100008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwlvokoegqvxaovyqdjjbyylhsbdwaca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796304.1957242-778-198840094562652/AnsiballZ_stat.py'
Nov 22 07:25:04 compute-0 sudo[100008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:04 compute-0 python3.9[100010]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:25:04 compute-0 sudo[100008]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:04 compute-0 sudo[100086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wexadplabcwcjloxahedqpniralkqres ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796304.1957242-778-198840094562652/AnsiballZ_file.py'
Nov 22 07:25:04 compute-0 sudo[100086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:05 compute-0 python3.9[100088]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:05 compute-0 sudo[100086]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:05 compute-0 sudo[100238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqpbjjnpsxsxcxfrpvfotwsspsxnisy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796305.3025134-814-42386225514263/AnsiballZ_systemd.py'
Nov 22 07:25:05 compute-0 sudo[100238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:05 compute-0 python3.9[100240]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:05 compute-0 systemd[1]: Reloading.
Nov 22 07:25:05 compute-0 systemd-rc-local-generator[100268]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:25:05 compute-0 systemd-sysv-generator[100271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:25:06 compute-0 sudo[100238]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:06 compute-0 sudo[100427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-defetzvshqkcgmteokbvxyjcqmplzjyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796306.339169-838-137777801979340/AnsiballZ_stat.py'
Nov 22 07:25:06 compute-0 sudo[100427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:06 compute-0 python3.9[100429]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:25:06 compute-0 sudo[100427]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:07 compute-0 sudo[100505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqzztqhuadbssbkntkqqphyirxlulbny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796306.339169-838-137777801979340/AnsiballZ_file.py'
Nov 22 07:25:07 compute-0 sudo[100505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:07 compute-0 python3.9[100507]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:07 compute-0 sudo[100505]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:07 compute-0 sudo[100657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeqiehowadndvejioguubrmfypgxizix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796307.4465442-874-162406149588049/AnsiballZ_stat.py'
Nov 22 07:25:07 compute-0 sudo[100657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:07 compute-0 python3.9[100659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:25:07 compute-0 sudo[100657]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:08 compute-0 sudo[100735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlxajccjppvktberfjdbwwlwoejywtbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796307.4465442-874-162406149588049/AnsiballZ_file.py'
Nov 22 07:25:08 compute-0 sudo[100735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:08 compute-0 python3.9[100737]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:08 compute-0 sudo[100735]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:08 compute-0 sudo[100887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgihjemryudhpjqtriiymywvzixlxuyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796308.5400286-910-245843286172589/AnsiballZ_systemd.py'
Nov 22 07:25:08 compute-0 sudo[100887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:09 compute-0 python3.9[100889]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:09 compute-0 systemd[1]: Reloading.
Nov 22 07:25:09 compute-0 systemd-sysv-generator[100918]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:25:09 compute-0 systemd-rc-local-generator[100915]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:25:09 compute-0 systemd[1]: Starting Create netns directory...
Nov 22 07:25:09 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 07:25:09 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 07:25:09 compute-0 systemd[1]: Finished Create netns directory.
Nov 22 07:25:09 compute-0 sudo[100887]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:10 compute-0 sudo[101082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivwnolviwnegmdixylbemogczoylpaoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796309.7560346-940-101061210839063/AnsiballZ_file.py'
Nov 22 07:25:10 compute-0 sudo[101082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:10 compute-0 python3.9[101084]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:25:10 compute-0 sudo[101082]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:10 compute-0 sudo[101234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njezqtwyecyvquevpgvorifaegsrogmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796310.4470623-964-178282588036903/AnsiballZ_stat.py'
Nov 22 07:25:10 compute-0 sudo[101234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:10 compute-0 python3.9[101236]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:25:10 compute-0 sudo[101234]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:11 compute-0 sudo[101357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkldxvcagtexxtezvvnzrfucgooxevfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796310.4470623-964-178282588036903/AnsiballZ_copy.py'
Nov 22 07:25:11 compute-0 sudo[101357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:11 compute-0 python3.9[101359]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796310.4470623-964-178282588036903/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:25:11 compute-0 sudo[101357]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:12 compute-0 sudo[101509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysuxxlifdunkrnkyylosufrrcodghdlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796312.0721714-1015-250564653113069/AnsiballZ_file.py'
Nov 22 07:25:12 compute-0 sudo[101509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:12 compute-0 python3.9[101511]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:25:12 compute-0 sudo[101509]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:13 compute-0 sudo[101661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znkmrgivcefxoyurhfqoombyysddkkhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796312.7909963-1039-6141264024712/AnsiballZ_stat.py'
Nov 22 07:25:13 compute-0 sudo[101661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:13 compute-0 python3.9[101663]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:25:13 compute-0 sudo[101661]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:13 compute-0 sudo[101784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtuksnaenwunxtiuccauufqudkoyniee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796312.7909963-1039-6141264024712/AnsiballZ_copy.py'
Nov 22 07:25:13 compute-0 sudo[101784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:13 compute-0 python3.9[101786]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796312.7909963-1039-6141264024712/.source.json _original_basename=.k9ymtjgn follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:13 compute-0 sudo[101784]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:14 compute-0 sudo[101936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajlnuwpkhejxsmtlblahpzpybnukazsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796314.1260073-1084-20085026178340/AnsiballZ_file.py'
Nov 22 07:25:14 compute-0 sudo[101936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:14 compute-0 python3.9[101938]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:14 compute-0 sudo[101936]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:15 compute-0 sudo[102088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhuwplyoymporuoldlqpdzmfwbvhkdqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796314.9938555-1108-114698517098160/AnsiballZ_stat.py'
Nov 22 07:25:15 compute-0 sudo[102088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:15 compute-0 sudo[102088]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:15 compute-0 sudo[102211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqkgqkhnwtfbuebpylbfuhwandhfksjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796314.9938555-1108-114698517098160/AnsiballZ_copy.py'
Nov 22 07:25:15 compute-0 sudo[102211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:16 compute-0 sudo[102211]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:17 compute-0 sudo[102363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bypqfexsjidgvmfheyyijepuvikxkrgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796316.527707-1159-8887174711/AnsiballZ_container_config_data.py'
Nov 22 07:25:17 compute-0 sudo[102363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:17 compute-0 python3.9[102365]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 22 07:25:17 compute-0 sudo[102363]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:18 compute-0 sudo[102515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beooiqhixcyxsgrerdryekfjrpfpsbah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796317.532958-1186-214738528384089/AnsiballZ_container_config_hash.py'
Nov 22 07:25:18 compute-0 sudo[102515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:18 compute-0 python3.9[102517]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:25:18 compute-0 sudo[102515]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:19 compute-0 sudo[102667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fabwjltkpwpwokhlbobykxesoekthwed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796318.632424-1213-125900665452770/AnsiballZ_podman_container_info.py'
Nov 22 07:25:19 compute-0 sudo[102667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:19 compute-0 python3.9[102669]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 07:25:19 compute-0 sudo[102667]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:20 compute-0 sudo[102845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yozgtauqellsbyhvuopvvshdwjvmedbt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796320.0380795-1252-259998148867889/AnsiballZ_edpm_container_manage.py'
Nov 22 07:25:20 compute-0 sudo[102845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:20 compute-0 python3[102847]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:25:26 compute-0 podman[102904]: 2025-11-22 07:25:26.232694561 +0000 UTC m=+1.874303674 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 07:25:27 compute-0 podman[102859]: 2025-11-22 07:25:27.263744652 +0000 UTC m=+6.423703270 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:25:27 compute-0 podman[102981]: 2025-11-22 07:25:27.41342257 +0000 UTC m=+0.053884445 container create c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:25:27 compute-0 podman[102981]: 2025-11-22 07:25:27.382905969 +0000 UTC m=+0.023367864 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:25:27 compute-0 python3[102847]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:25:27 compute-0 sudo[102845]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:28 compute-0 sudo[103168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhmwjmjpprdimdnyjpmlsutsmzumfzvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796328.2295578-1276-90119355870261/AnsiballZ_stat.py'
Nov 22 07:25:28 compute-0 sudo[103168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:28 compute-0 python3.9[103170]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:25:28 compute-0 sudo[103168]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:29 compute-0 sudo[103322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwwwalzhxhqmqpqcqrijtmaasbjenmvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796329.045699-1303-59130669305230/AnsiballZ_file.py'
Nov 22 07:25:29 compute-0 sudo[103322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:29 compute-0 python3.9[103324]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:29 compute-0 sudo[103322]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:29 compute-0 sudo[103398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxbemxtqdzsqjryrgiywpzvorkogbviu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796329.045699-1303-59130669305230/AnsiballZ_stat.py'
Nov 22 07:25:29 compute-0 sudo[103398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:30 compute-0 python3.9[103400]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:25:30 compute-0 sudo[103398]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:30 compute-0 sudo[103550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntoecotvugqwgoamfxnbzkmugplltauc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796330.172343-1303-267770961722407/AnsiballZ_copy.py'
Nov 22 07:25:30 compute-0 sudo[103550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:31 compute-0 python3.9[103552]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796330.172343-1303-267770961722407/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:31 compute-0 sudo[103550]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:31 compute-0 sudo[103626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzftlalecgvonlgpumrwiyafmidacwxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796330.172343-1303-267770961722407/AnsiballZ_systemd.py'
Nov 22 07:25:31 compute-0 sudo[103626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:32 compute-0 python3.9[103628]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:25:32 compute-0 systemd[1]: Reloading.
Nov 22 07:25:32 compute-0 systemd-sysv-generator[103656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:25:32 compute-0 systemd-rc-local-generator[103650]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:25:32 compute-0 sudo[103626]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:32 compute-0 sudo[103738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugeqleqclidhqyicuurhlqypbmhurgzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796330.172343-1303-267770961722407/AnsiballZ_systemd.py'
Nov 22 07:25:32 compute-0 sudo[103738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:33 compute-0 python3.9[103740]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:33 compute-0 systemd[1]: Reloading.
Nov 22 07:25:33 compute-0 systemd-rc-local-generator[103770]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:25:33 compute-0 systemd-sysv-generator[103773]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:25:33 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Nov 22 07:25:34 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:25:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6c4cf92f8bebe2a064e81c66de82ea59c39f82d7c8d3042b76a33de17858f53/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 22 07:25:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6c4cf92f8bebe2a064e81c66de82ea59c39f82d7c8d3042b76a33de17858f53/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:25:34 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37.
Nov 22 07:25:35 compute-0 podman[103782]: 2025-11-22 07:25:35.192975188 +0000 UTC m=+1.437239743 container init c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + sudo -E kolla_set_configs
Nov 22 07:25:35 compute-0 podman[103782]: 2025-11-22 07:25:35.222277981 +0000 UTC m=+1.466542556 container start c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 07:25:35 compute-0 edpm-start-podman-container[103782]: ovn_metadata_agent
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Validating config file
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Copying service configuration files
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Writing out command to execute
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: ++ cat /run_command
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + CMD=neutron-ovn-metadata-agent
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + ARGS=
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + sudo kolla_copy_cacerts
Nov 22 07:25:35 compute-0 edpm-start-podman-container[103781]: Creating additional drop-in dependency for "ovn_metadata_agent" (c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37)
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + [[ ! -n '' ]]
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + . kolla_extend_start
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: Running command: 'neutron-ovn-metadata-agent'
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + umask 0022
Nov 22 07:25:35 compute-0 ovn_metadata_agent[103800]: + exec neutron-ovn-metadata-agent
Nov 22 07:25:35 compute-0 systemd[1]: Reloading.
Nov 22 07:25:35 compute-0 podman[103807]: 2025-11-22 07:25:35.321327972 +0000 UTC m=+0.084430007 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 22 07:25:35 compute-0 systemd-sysv-generator[103882]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:25:35 compute-0 systemd-rc-local-generator[103878]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:25:35 compute-0 systemd[1]: Started ovn_metadata_agent container.
Nov 22 07:25:35 compute-0 sudo[103738]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:36 compute-0 sshd-session[95451]: Connection closed by 192.168.122.30 port 38702
Nov 22 07:25:36 compute-0 sshd-session[95448]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:25:36 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Nov 22 07:25:36 compute-0 systemd[1]: session-21.scope: Consumed 48.562s CPU time.
Nov 22 07:25:36 compute-0 systemd-logind[821]: Session 21 logged out. Waiting for processes to exit.
Nov 22 07:25:36 compute-0 systemd-logind[821]: Removed session 21.
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.240 103805 INFO neutron.common.config [-] Logging enabled!
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.241 103805 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.241 103805 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.242 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.242 103805 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.242 103805 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.242 103805 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.242 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.242 103805 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.242 103805 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.243 103805 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.243 103805 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.243 103805 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.243 103805 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.243 103805 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.243 103805 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.243 103805 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.244 103805 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.244 103805 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.244 103805 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.244 103805 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.244 103805 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.244 103805 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.244 103805 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.245 103805 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.246 103805 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.246 103805 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.246 103805 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.246 103805 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.246 103805 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.246 103805 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.246 103805 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.246 103805 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.247 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.247 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.247 103805 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.247 103805 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.247 103805 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.247 103805 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.247 103805 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.247 103805 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.248 103805 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.249 103805 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.250 103805 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.250 103805 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.250 103805 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.250 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.250 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.250 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.250 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.250 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.251 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.251 103805 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.251 103805 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.251 103805 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.251 103805 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.251 103805 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.251 103805 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.251 103805 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.252 103805 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.252 103805 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.252 103805 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.252 103805 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.252 103805 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.252 103805 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.252 103805 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.252 103805 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.253 103805 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.254 103805 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.254 103805 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.254 103805 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.254 103805 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.254 103805 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.254 103805 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.254 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.254 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.255 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.256 103805 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.256 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.256 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.256 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.256 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.256 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.256 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.256 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.257 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.257 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.257 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.257 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.257 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.257 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.257 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.258 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.258 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.258 103805 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.258 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.258 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.258 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.258 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.258 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.259 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.259 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.259 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.259 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.259 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.259 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.259 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.260 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.260 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.260 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.260 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.260 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.260 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.260 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.260 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.261 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.261 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.261 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.261 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.261 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.261 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.261 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.262 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.262 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.262 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.262 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.262 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.262 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.262 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.262 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.263 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.264 103805 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.264 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.264 103805 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.264 103805 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.264 103805 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.264 103805 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.264 103805 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.265 103805 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.265 103805 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.265 103805 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.265 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.265 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.265 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.265 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.265 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.266 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.266 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.266 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.266 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.266 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.266 103805 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.266 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.267 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.267 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.267 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.267 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.267 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.267 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.267 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.268 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.268 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.268 103805 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.268 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.268 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.268 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.268 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.269 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.269 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.269 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.269 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.269 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.269 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.269 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.270 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.270 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.270 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.270 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.270 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.270 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.270 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.271 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.271 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.271 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.271 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.271 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.271 103805 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.271 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.271 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.272 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.272 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.272 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.272 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.272 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.272 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.272 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.272 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.273 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.273 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.273 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.273 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.273 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.273 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.273 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.273 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.274 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.274 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.274 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.274 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.274 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.274 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.274 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.274 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.275 103805 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.275 103805 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.275 103805 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.275 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.275 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.275 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.275 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.276 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.276 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.276 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.276 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.276 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.276 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.276 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.276 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.277 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.277 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.277 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.277 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.277 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.277 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.277 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.277 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.278 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.278 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.278 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.278 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.278 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.278 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.278 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.279 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.279 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.279 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.279 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.279 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.279 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.279 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.279 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.280 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.280 103805 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.280 103805 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.290 103805 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.290 103805 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.290 103805 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.290 103805 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.291 103805 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.307 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name df09844c-c111-44b4-9c36-d4950a55a590 (UUID: df09844c-c111-44b4-9c36-d4950a55a590) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.338 103805 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.338 103805 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.338 103805 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.339 103805 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.343 103805 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.348 103805 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.354 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'df09844c-c111-44b4-9c36-d4950a55a590'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], external_ids={}, name=df09844c-c111-44b4-9c36-d4950a55a590, nb_cfg_timestamp=1763796272196, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.355 103805 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f5173b57bb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.356 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.356 103805 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.356 103805 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.356 103805 INFO oslo_service.service [-] Starting 1 workers
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.361 103805 DEBUG oslo_service.service [-] Started child 103913 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.364 103805 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp8oxaqx2l/privsep.sock']
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.364 103913 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-430739'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.390 103913 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.391 103913 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.391 103913 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.395 103913 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.401 103913 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 22 07:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.408 103913 INFO eventlet.wsgi.server [-] (103913) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 22 07:25:37 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:38.060 103805 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:38.061 103805 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp8oxaqx2l/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.929 103918 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.934 103918 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.936 103918 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:37.936 103918 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103918
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:38.063 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[69474609-aaa4-4fc3-b377-dcbc2e414c5e]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:38.585 103918 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:38.585 103918 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:25:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:38.585 103918 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.142 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[9409455b-6c0c-436c-96da-b7213bcad157]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.144 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, column=external_ids, values=({'neutron:ovn-metadata-id': 'a1f3a176-8e33-578c-b27e-1bf4bfad2f9f'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.369 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.892 103805 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.892 103805 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.892 103805 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.892 103805 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.892 103805 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.892 103805 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.893 103805 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.893 103805 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.893 103805 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.893 103805 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.893 103805 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.893 103805 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.893 103805 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.894 103805 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.894 103805 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.894 103805 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.894 103805 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.894 103805 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.894 103805 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.894 103805 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.895 103805 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.896 103805 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.896 103805 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.896 103805 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.896 103805 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.896 103805 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.896 103805 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.896 103805 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.896 103805 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.897 103805 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.898 103805 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.899 103805 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.900 103805 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.901 103805 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.902 103805 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.903 103805 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.904 103805 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.904 103805 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.904 103805 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.904 103805 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.904 103805 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.904 103805 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.904 103805 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.904 103805 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.905 103805 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.905 103805 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.905 103805 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.905 103805 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.905 103805 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.905 103805 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.905 103805 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.905 103805 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.906 103805 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.906 103805 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.906 103805 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.906 103805 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.906 103805 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.906 103805 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.906 103805 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.906 103805 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.907 103805 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.908 103805 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.909 103805 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.910 103805 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.911 103805 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.912 103805 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.912 103805 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.912 103805 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.912 103805 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.912 103805 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.912 103805 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.912 103805 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.912 103805 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.913 103805 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.913 103805 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.913 103805 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.913 103805 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.913 103805 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.913 103805 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.914 103805 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.914 103805 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.914 103805 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.914 103805 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.914 103805 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.914 103805 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.914 103805 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.915 103805 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.916 103805 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.917 103805 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.917 103805 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.917 103805 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.917 103805 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.917 103805 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.917 103805 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.918 103805 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.918 103805 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.918 103805 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.918 103805 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.918 103805 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.918 103805 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.918 103805 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.918 103805 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.919 103805 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.920 103805 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.921 103805 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.922 103805 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.923 103805 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.923 103805 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.923 103805 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.923 103805 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.923 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.923 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.923 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.923 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.924 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.925 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.925 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.925 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.925 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.925 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.925 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.925 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.925 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.926 103805 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.927 103805 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.927 103805 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.927 103805 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.927 103805 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:25:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:25:39.927 103805 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 07:25:41 compute-0 sshd-session[103923]: Accepted publickey for zuul from 192.168.122.30 port 49888 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:25:41 compute-0 systemd-logind[821]: New session 22 of user zuul.
Nov 22 07:25:41 compute-0 systemd[1]: Started Session 22 of User zuul.
Nov 22 07:25:41 compute-0 sshd-session[103923]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:25:42 compute-0 python3.9[104076]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:25:43 compute-0 sudo[104230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaityjtuoezoxqnrknguhdjopjaqzguh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796343.49799-67-24388331191934/AnsiballZ_command.py'
Nov 22 07:25:43 compute-0 sudo[104230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:44 compute-0 python3.9[104232]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:25:44 compute-0 sudo[104230]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:45 compute-0 sudo[104395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijjbfymzznwdgprirxutpukpotlppizb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796344.5869224-100-279092145886354/AnsiballZ_systemd_service.py'
Nov 22 07:25:45 compute-0 sudo[104395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:45 compute-0 python3.9[104397]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:25:45 compute-0 systemd[1]: Reloading.
Nov 22 07:25:45 compute-0 systemd-rc-local-generator[104425]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:25:45 compute-0 systemd-sysv-generator[104428]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:25:46 compute-0 sudo[104395]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:46 compute-0 python3.9[104582]: ansible-ansible.builtin.service_facts Invoked
Nov 22 07:25:46 compute-0 network[104599]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 07:25:46 compute-0 network[104600]: 'network-scripts' will be removed from distribution in near future.
Nov 22 07:25:46 compute-0 network[104601]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 07:25:50 compute-0 sudo[104860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiiuwigcaurgphjtikpziekhnqidktcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796350.0739446-157-122042152041118/AnsiballZ_systemd_service.py'
Nov 22 07:25:50 compute-0 sudo[104860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:50 compute-0 python3.9[104862]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:50 compute-0 sudo[104860]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:51 compute-0 sudo[105013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-balssnmjyesaaghdzlumzzkjyvbcmgbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796350.8322306-157-244151517248804/AnsiballZ_systemd_service.py'
Nov 22 07:25:51 compute-0 sudo[105013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:51 compute-0 python3.9[105015]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:51 compute-0 sudo[105013]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:51 compute-0 sudo[105166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixdkpjpqwfjopbzimhmbxdfnhqdperhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796351.6079624-157-254482239454691/AnsiballZ_systemd_service.py'
Nov 22 07:25:51 compute-0 sudo[105166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:52 compute-0 python3.9[105168]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:52 compute-0 sudo[105166]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:52 compute-0 sudo[105319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqqizmdtzibwbrfwfqiqqzyamtjooewc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796352.363852-157-79347310088289/AnsiballZ_systemd_service.py'
Nov 22 07:25:52 compute-0 sudo[105319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:52 compute-0 python3.9[105321]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:52 compute-0 sudo[105319]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:53 compute-0 sudo[105472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csmbmigrtkbyevsmwwxpfgixgneddtug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796353.0937548-157-161531793021969/AnsiballZ_systemd_service.py'
Nov 22 07:25:53 compute-0 sudo[105472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:53 compute-0 python3.9[105474]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:53 compute-0 sudo[105472]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:54 compute-0 sudo[105625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjibxjzadktatmryudgrwjqlkbganpmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796353.799893-157-143250972341763/AnsiballZ_systemd_service.py'
Nov 22 07:25:54 compute-0 sudo[105625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:54 compute-0 python3.9[105627]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:54 compute-0 sudo[105625]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:54 compute-0 sudo[105778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vomxdkngmbxdexzkhnmjszcpqgkjoovy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796354.6060371-157-243773134123354/AnsiballZ_systemd_service.py'
Nov 22 07:25:54 compute-0 sudo[105778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:55 compute-0 python3.9[105780]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:25:55 compute-0 sudo[105778]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:56 compute-0 sudo[105931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwwpvhxfyrjrwrfucgfqnmasxdlbkjra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796356.0592086-313-174110659244911/AnsiballZ_file.py'
Nov 22 07:25:56 compute-0 sudo[105931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:56 compute-0 python3.9[105933]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:56 compute-0 sudo[105931]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:57 compute-0 sudo[106089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwbiwxhvvnssyvopjthmhmxsmcvefqpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796357.0126722-313-196660464883400/AnsiballZ_file.py'
Nov 22 07:25:57 compute-0 sudo[106089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:57 compute-0 podman[106057]: 2025-11-22 07:25:57.423114548 +0000 UTC m=+0.129769779 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:25:57 compute-0 python3.9[106091]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:57 compute-0 sudo[106089]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:58 compute-0 sudo[106261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dudlfmccgxbezsdwnkvoseqjflthvofn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796358.1190958-313-140739996574262/AnsiballZ_file.py'
Nov 22 07:25:58 compute-0 sudo[106261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:25:58 compute-0 python3.9[106263]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:25:58 compute-0 sudo[106261]: pam_unix(sudo:session): session closed for user root
Nov 22 07:25:59 compute-0 sudo[106413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fycnzznhdszvhxooqllplbijgwptknee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796358.7836134-313-247886175719140/AnsiballZ_file.py'
Nov 22 07:25:59 compute-0 sudo[106413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:00 compute-0 python3.9[106415]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:00 compute-0 sudo[106413]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:01 compute-0 sudo[106565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jddnitmzpteydmzuwcjpxxbqxhbgvara ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796361.0838766-313-86033699899231/AnsiballZ_file.py'
Nov 22 07:26:01 compute-0 sudo[106565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:01 compute-0 python3.9[106567]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:01 compute-0 sudo[106565]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:02 compute-0 sudo[106717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyhltmwgabhvcnbthubmvmtyxmffkvtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796361.7394323-313-196493469245099/AnsiballZ_file.py'
Nov 22 07:26:02 compute-0 sudo[106717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:02 compute-0 python3.9[106719]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:02 compute-0 sudo[106717]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:02 compute-0 sudo[106869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrxqxmpymznygezlvsiqfijmpuiwgoae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796362.3839664-313-170468443311150/AnsiballZ_file.py'
Nov 22 07:26:02 compute-0 sudo[106869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:02 compute-0 python3.9[106871]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:02 compute-0 sudo[106869]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:04 compute-0 sudo[107021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pormohvkjzibmfanudolqbikyakewtwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796363.7934244-463-267541800226082/AnsiballZ_file.py'
Nov 22 07:26:04 compute-0 sudo[107021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:04 compute-0 python3.9[107023]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:04 compute-0 sudo[107021]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:04 compute-0 sudo[107173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcvokrwkvkiqioktwhnsxbickyauwkdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796364.4232876-463-208500148755312/AnsiballZ_file.py'
Nov 22 07:26:04 compute-0 sudo[107173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:04 compute-0 python3.9[107175]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:04 compute-0 sudo[107173]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:05 compute-0 sudo[107325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reugqjxtdshauwkiaxkpmrhjlepllpdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796364.9980934-463-80165100415361/AnsiballZ_file.py'
Nov 22 07:26:05 compute-0 sudo[107325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:05 compute-0 python3.9[107327]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:05 compute-0 sudo[107325]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:05 compute-0 sudo[107488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywtuthfxxmqocjavzddplpyldjngvzzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796365.5879254-463-110553979952138/AnsiballZ_file.py'
Nov 22 07:26:05 compute-0 sudo[107488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:05 compute-0 podman[107451]: 2025-11-22 07:26:05.857119515 +0000 UTC m=+0.053401713 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:26:06 compute-0 python3.9[107500]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:06 compute-0 sudo[107488]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:06 compute-0 sudo[107650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyigciwgmoiqrwhhglrafnhdeoigswho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796366.2037528-463-217804377712506/AnsiballZ_file.py'
Nov 22 07:26:06 compute-0 sudo[107650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:06 compute-0 python3.9[107652]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:06 compute-0 sudo[107650]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:07 compute-0 sudo[107802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drmwipvylnhdfshudxqhefjhlkfjhaec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796366.8187134-463-6987367355696/AnsiballZ_file.py'
Nov 22 07:26:07 compute-0 sudo[107802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:07 compute-0 python3.9[107804]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:07 compute-0 sudo[107802]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:07 compute-0 sudo[107954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thatlbmojahdxjrmuxlvyaxzyalulzjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796367.487604-463-123083077320219/AnsiballZ_file.py'
Nov 22 07:26:07 compute-0 sudo[107954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:07 compute-0 python3.9[107956]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:26:07 compute-0 sudo[107954]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:08 compute-0 sudo[108106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msxgrtvbluekbxpbqxoatzjmqxliuzvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796368.3681731-616-216414050981532/AnsiballZ_command.py'
Nov 22 07:26:08 compute-0 sudo[108106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:08 compute-0 python3.9[108108]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:26:08 compute-0 sudo[108106]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:09 compute-0 python3.9[108260]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 07:26:10 compute-0 sudo[108410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvkuzxzxlisnadrouojrnywkivymxyvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796369.9945042-670-46060312320224/AnsiballZ_systemd_service.py'
Nov 22 07:26:10 compute-0 sudo[108410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:10 compute-0 python3.9[108412]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:26:10 compute-0 systemd[1]: Reloading.
Nov 22 07:26:10 compute-0 systemd-sysv-generator[108443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:26:10 compute-0 systemd-rc-local-generator[108438]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:26:10 compute-0 sudo[108410]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:11 compute-0 sudo[108597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvmbbplqinjxylwodeiusuratnnwjdzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796371.0484738-694-24876026652862/AnsiballZ_command.py'
Nov 22 07:26:11 compute-0 sudo[108597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:11 compute-0 python3.9[108599]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:26:11 compute-0 sudo[108597]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:11 compute-0 sudo[108750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttvdcaqtwpytxxulgzolgfvemmyiiqxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796371.6883085-694-68930841176079/AnsiballZ_command.py'
Nov 22 07:26:11 compute-0 sudo[108750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:12 compute-0 python3.9[108752]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:26:12 compute-0 sudo[108750]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:12 compute-0 sudo[108903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bofvtybcnqsqbtvxtpvwraaadozdutps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796372.2540388-694-13552899111779/AnsiballZ_command.py'
Nov 22 07:26:12 compute-0 sudo[108903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:12 compute-0 python3.9[108905]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:26:12 compute-0 sudo[108903]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:13 compute-0 sudo[109056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgtjygwfamfdulikdiujfyrmrnwunkoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796372.84039-694-11893066375056/AnsiballZ_command.py'
Nov 22 07:26:13 compute-0 sudo[109056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:13 compute-0 python3.9[109058]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:26:13 compute-0 sudo[109056]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:13 compute-0 sudo[109209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrvyajklgzutxjxicuztworxhjpbyamy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796373.4221919-694-152396813331744/AnsiballZ_command.py'
Nov 22 07:26:13 compute-0 sudo[109209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:13 compute-0 python3.9[109211]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:26:13 compute-0 sudo[109209]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:14 compute-0 sudo[109362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqnydnupvvimalqcwrydppwpumgbnkyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796373.983185-694-252167167649326/AnsiballZ_command.py'
Nov 22 07:26:14 compute-0 sudo[109362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:14 compute-0 python3.9[109364]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:26:14 compute-0 sudo[109362]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:14 compute-0 sudo[109515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afycyfnuoxwvmjqsbfoawdhxnfdsdqal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796374.5553553-694-226600023050133/AnsiballZ_command.py'
Nov 22 07:26:14 compute-0 sudo[109515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:14 compute-0 python3.9[109517]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:26:15 compute-0 sudo[109515]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:16 compute-0 sudo[109668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pilxbjjqsbqnjmegdlooybuwfqgyuotd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796375.9881632-856-271353393890101/AnsiballZ_getent.py'
Nov 22 07:26:16 compute-0 sudo[109668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:16 compute-0 python3.9[109670]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 22 07:26:16 compute-0 sudo[109668]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:17 compute-0 sudo[109821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eahuoudowrochlhtasyjxdpzphsyiuqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796376.8037431-880-91516010114560/AnsiballZ_group.py'
Nov 22 07:26:17 compute-0 sudo[109821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:17 compute-0 python3.9[109823]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 07:26:17 compute-0 groupadd[109824]: group added to /etc/group: name=libvirt, GID=42473
Nov 22 07:26:17 compute-0 groupadd[109824]: group added to /etc/gshadow: name=libvirt
Nov 22 07:26:17 compute-0 groupadd[109824]: new group: name=libvirt, GID=42473
Nov 22 07:26:17 compute-0 sudo[109821]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:18 compute-0 sudo[109979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqvotbfnupzjwmxirqiegdvetmmkdbei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796377.8620927-904-106091434718355/AnsiballZ_user.py'
Nov 22 07:26:18 compute-0 sudo[109979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:18 compute-0 python3.9[109981]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 07:26:18 compute-0 useradd[109983]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 22 07:26:18 compute-0 sudo[109979]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:19 compute-0 sudo[110139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzeyxuilxvcqtcnsdyfvbygbtysnugsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796379.1300147-937-277292956983193/AnsiballZ_setup.py'
Nov 22 07:26:19 compute-0 sudo[110139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:19 compute-0 python3.9[110141]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:26:19 compute-0 sudo[110139]: pam_unix(sudo:session): session closed for user root
Nov 22 07:26:20 compute-0 sudo[110223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxewnmovkzkprwshacobubbtnfmbgcsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796379.1300147-937-277292956983193/AnsiballZ_dnf.py'
Nov 22 07:26:20 compute-0 sudo[110223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:26:20 compute-0 python3.9[110225]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:26:28 compute-0 podman[110285]: 2025-11-22 07:26:28.439408424 +0000 UTC m=+0.081895937 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 07:26:36 compute-0 podman[110443]: 2025-11-22 07:26:36.414085004 +0000 UTC m=+0.063289847 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 22 07:26:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:26:37.291 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:26:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:26:37.292 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:26:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:26:37.292 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:26:47 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 22 07:26:47 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 07:26:47 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 07:26:47 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 07:26:47 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 07:26:47 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 07:26:47 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 07:26:47 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 07:26:57 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 22 07:26:57 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 07:26:57 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 07:26:57 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 07:26:57 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 07:26:57 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 07:26:57 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 07:26:57 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 07:26:59 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 22 07:26:59 compute-0 podman[110478]: 2025-11-22 07:26:59.434652954 +0000 UTC m=+0.081116529 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 07:27:07 compute-0 podman[110505]: 2025-11-22 07:27:07.393780766 +0000 UTC m=+0.046926710 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:27:30 compute-0 podman[123247]: 2025-11-22 07:27:30.424102434 +0000 UTC m=+0.077220088 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:27:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:27:37.293 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:27:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:27:37.293 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:27:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:27:37.293 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:27:38 compute-0 podman[127327]: 2025-11-22 07:27:38.400181725 +0000 UTC m=+0.049729335 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:27:51 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Nov 22 07:27:51 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 22 07:27:51 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 22 07:27:51 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 22 07:27:51 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 22 07:27:51 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 22 07:27:51 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 22 07:27:51 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 22 07:27:53 compute-0 groupadd[127370]: group added to /etc/group: name=dnsmasq, GID=992
Nov 22 07:27:53 compute-0 groupadd[127370]: group added to /etc/gshadow: name=dnsmasq
Nov 22 07:27:53 compute-0 groupadd[127370]: new group: name=dnsmasq, GID=992
Nov 22 07:27:53 compute-0 useradd[127377]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 22 07:27:53 compute-0 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 07:27:53 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 22 07:27:53 compute-0 dbus-broker-launch[810]: Noticed file-system modification, trigger reload.
Nov 22 07:27:54 compute-0 groupadd[127390]: group added to /etc/group: name=clevis, GID=991
Nov 22 07:27:54 compute-0 groupadd[127390]: group added to /etc/gshadow: name=clevis
Nov 22 07:27:54 compute-0 groupadd[127390]: new group: name=clevis, GID=991
Nov 22 07:27:54 compute-0 useradd[127397]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 22 07:27:54 compute-0 usermod[127407]: add 'clevis' to group 'tss'
Nov 22 07:27:54 compute-0 usermod[127407]: add 'clevis' to shadow group 'tss'
Nov 22 07:27:56 compute-0 polkitd[43383]: Reloading rules
Nov 22 07:27:56 compute-0 polkitd[43383]: Collecting garbage unconditionally...
Nov 22 07:27:56 compute-0 polkitd[43383]: Loading rules from directory /etc/polkit-1/rules.d
Nov 22 07:27:56 compute-0 polkitd[43383]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 22 07:27:56 compute-0 polkitd[43383]: Finished loading, compiling and executing 3 rules
Nov 22 07:27:56 compute-0 polkitd[43383]: Reloading rules
Nov 22 07:27:56 compute-0 polkitd[43383]: Collecting garbage unconditionally...
Nov 22 07:27:56 compute-0 polkitd[43383]: Loading rules from directory /etc/polkit-1/rules.d
Nov 22 07:27:56 compute-0 polkitd[43383]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 22 07:27:56 compute-0 polkitd[43383]: Finished loading, compiling and executing 3 rules
Nov 22 07:27:57 compute-0 groupadd[127594]: group added to /etc/group: name=ceph, GID=167
Nov 22 07:27:57 compute-0 groupadd[127594]: group added to /etc/gshadow: name=ceph
Nov 22 07:27:57 compute-0 groupadd[127594]: new group: name=ceph, GID=167
Nov 22 07:27:57 compute-0 useradd[127600]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 22 07:28:00 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Nov 22 07:28:00 compute-0 sshd[1009]: Received signal 15; terminating.
Nov 22 07:28:00 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Nov 22 07:28:00 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Nov 22 07:28:00 compute-0 systemd[1]: sshd.service: Consumed 1.198s CPU time, read 32.0K from disk, written 0B to disk.
Nov 22 07:28:00 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Nov 22 07:28:00 compute-0 systemd[1]: Stopping sshd-keygen.target...
Nov 22 07:28:00 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 07:28:00 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 07:28:00 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 22 07:28:00 compute-0 systemd[1]: Reached target sshd-keygen.target.
Nov 22 07:28:00 compute-0 systemd[1]: Starting OpenSSH server daemon...
Nov 22 07:28:00 compute-0 sshd[128129]: Server listening on 0.0.0.0 port 22.
Nov 22 07:28:00 compute-0 sshd[128129]: Server listening on :: port 22.
Nov 22 07:28:00 compute-0 systemd[1]: Started OpenSSH server daemon.
Nov 22 07:28:00 compute-0 podman[128117]: 2025-11-22 07:28:00.738708501 +0000 UTC m=+0.112560366 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 07:28:02 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 07:28:02 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 07:28:02 compute-0 systemd[1]: Reloading.
Nov 22 07:28:02 compute-0 systemd-rc-local-generator[128402]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:02 compute-0 systemd-sysv-generator[128406]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:02 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 07:28:08 compute-0 sudo[110223]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:09 compute-0 podman[136104]: 2025-11-22 07:28:09.412414623 +0000 UTC m=+0.054085134 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:28:10 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 07:28:10 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 07:28:10 compute-0 systemd[1]: man-db-cache-update.service: Consumed 10.169s CPU time.
Nov 22 07:28:10 compute-0 systemd[1]: run-ra1f072c5036541fb98d73246640167c6.service: Deactivated successfully.
Nov 22 07:28:17 compute-0 sudo[136943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekglquadxvhtueitdqnxaywodxtwxspi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796497.2749097-973-14060962285915/AnsiballZ_systemd.py'
Nov 22 07:28:17 compute-0 sudo[136943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:18 compute-0 python3.9[136945]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 07:28:18 compute-0 systemd[1]: Reloading.
Nov 22 07:28:18 compute-0 systemd-rc-local-generator[136973]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:18 compute-0 systemd-sysv-generator[136976]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:18 compute-0 sudo[136943]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:18 compute-0 sudo[137132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toohpdpxbelcvezucgxmkjxddbtjmklv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796498.625679-973-134386642533724/AnsiballZ_systemd.py'
Nov 22 07:28:18 compute-0 sudo[137132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:19 compute-0 python3.9[137134]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 07:28:19 compute-0 systemd[1]: Reloading.
Nov 22 07:28:19 compute-0 systemd-rc-local-generator[137159]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:19 compute-0 systemd-sysv-generator[137165]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:19 compute-0 sudo[137132]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:19 compute-0 sudo[137322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nljhjrcrbupyjcjifuyhathprwxagbuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796499.6346076-973-124188760857992/AnsiballZ_systemd.py'
Nov 22 07:28:19 compute-0 sudo[137322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:20 compute-0 python3.9[137324]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 07:28:20 compute-0 systemd[1]: Reloading.
Nov 22 07:28:20 compute-0 systemd-rc-local-generator[137351]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:20 compute-0 systemd-sysv-generator[137354]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:20 compute-0 sudo[137322]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:20 compute-0 sudo[137511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wljemtnpzyoyovaqdynzhwtjcbalajrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796500.666461-973-256206338643490/AnsiballZ_systemd.py'
Nov 22 07:28:20 compute-0 sudo[137511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:21 compute-0 python3.9[137513]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 07:28:22 compute-0 systemd[1]: Reloading.
Nov 22 07:28:22 compute-0 systemd-rc-local-generator[137542]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:22 compute-0 systemd-sysv-generator[137546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:22 compute-0 sudo[137511]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:23 compute-0 sudo[137700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elelfieifucyiwlssegxgqtiqlvrfmni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796503.270877-1060-71192086265652/AnsiballZ_systemd.py'
Nov 22 07:28:23 compute-0 sudo[137700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:23 compute-0 python3.9[137702]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:23 compute-0 systemd[1]: Reloading.
Nov 22 07:28:24 compute-0 systemd-rc-local-generator[137733]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:24 compute-0 systemd-sysv-generator[137736]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:24 compute-0 sudo[137700]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:24 compute-0 sudo[137890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcifabdledmlowerrantbeczhnpcstut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796504.341091-1060-47594496623208/AnsiballZ_systemd.py'
Nov 22 07:28:24 compute-0 sudo[137890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:24 compute-0 python3.9[137892]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:25 compute-0 systemd[1]: Reloading.
Nov 22 07:28:25 compute-0 systemd-sysv-generator[137927]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:25 compute-0 systemd-rc-local-generator[137923]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:25 compute-0 sudo[137890]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:25 compute-0 sudo[138080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osyhlyuinbwvsspivyvmlmtmwjgussfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796505.411075-1060-17314711615762/AnsiballZ_systemd.py'
Nov 22 07:28:25 compute-0 sudo[138080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:26 compute-0 python3.9[138082]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:26 compute-0 systemd[1]: Reloading.
Nov 22 07:28:26 compute-0 systemd-rc-local-generator[138112]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:26 compute-0 systemd-sysv-generator[138115]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:26 compute-0 sudo[138080]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:26 compute-0 sudo[138270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgenyzawpoztlbonddgqhbaktupvikur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796506.5475142-1060-18035273175131/AnsiballZ_systemd.py'
Nov 22 07:28:26 compute-0 sudo[138270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:27 compute-0 python3.9[138272]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:27 compute-0 sudo[138270]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:27 compute-0 sudo[138425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umxlcgunwfjkvayxvfcmnpzblnjskbbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796507.3668032-1060-176191322899079/AnsiballZ_systemd.py'
Nov 22 07:28:27 compute-0 sudo[138425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:28 compute-0 python3.9[138427]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:28 compute-0 systemd[1]: Reloading.
Nov 22 07:28:28 compute-0 systemd-sysv-generator[138460]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:28 compute-0 systemd-rc-local-generator[138456]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:28 compute-0 sudo[138425]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:29 compute-0 sudo[138615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dktumkybusrdwipvldgyekgykcepnqhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796509.005996-1168-71617513969411/AnsiballZ_systemd.py'
Nov 22 07:28:29 compute-0 sudo[138615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:29 compute-0 python3.9[138617]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 22 07:28:29 compute-0 systemd[1]: Reloading.
Nov 22 07:28:29 compute-0 systemd-rc-local-generator[138648]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:28:29 compute-0 systemd-sysv-generator[138651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:28:29 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 22 07:28:29 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 22 07:28:29 compute-0 sudo[138615]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:30 compute-0 sudo[138808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywqhwortdvlymsxmhresmojlgtgjkcwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796510.3423529-1192-275604268048041/AnsiballZ_systemd.py'
Nov 22 07:28:30 compute-0 sudo[138808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:30 compute-0 python3.9[138810]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:31 compute-0 sudo[138808]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:31 compute-0 podman[138812]: 2025-11-22 07:28:31.08237401 +0000 UTC m=+0.082154491 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 07:28:31 compute-0 sudo[138987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fswvlumaawtbdumqsifpbklvxsueyacy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796511.2025895-1192-185308218265988/AnsiballZ_systemd.py'
Nov 22 07:28:31 compute-0 sudo[138987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:31 compute-0 python3.9[138989]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:31 compute-0 sudo[138987]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:32 compute-0 sudo[139142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brbfwzeqomemvpcrjwxfynceebjbwvcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796512.0382566-1192-199293257458691/AnsiballZ_systemd.py'
Nov 22 07:28:32 compute-0 sudo[139142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:32 compute-0 python3.9[139144]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:32 compute-0 sudo[139142]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:33 compute-0 sudo[139297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpnzqlrxdvbtlsfaysuhjnirozlzmoac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796512.8399134-1192-97208068930389/AnsiballZ_systemd.py'
Nov 22 07:28:33 compute-0 sudo[139297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:33 compute-0 python3.9[139299]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:33 compute-0 sudo[139297]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:34 compute-0 sudo[139452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrbhrkrczuwlsknegyqyujsowvjutttv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796513.8097916-1192-276921577729882/AnsiballZ_systemd.py'
Nov 22 07:28:34 compute-0 sudo[139452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:34 compute-0 python3.9[139454]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:34 compute-0 sudo[139452]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:34 compute-0 sudo[139607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smylvrnpfnnlxdgaybjjkdgvcuebtfwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796514.6156366-1192-247502538282832/AnsiballZ_systemd.py'
Nov 22 07:28:34 compute-0 sudo[139607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:35 compute-0 python3.9[139609]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:35 compute-0 sudo[139607]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:35 compute-0 sudo[139762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbxblhdqkcsqihgzvljolvqxkkrnmiwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796515.4193356-1192-100997841089654/AnsiballZ_systemd.py'
Nov 22 07:28:35 compute-0 sudo[139762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:35 compute-0 python3.9[139764]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:36 compute-0 sudo[139762]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:36 compute-0 sudo[139917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psobpeoutdbvbcsykqfyvlylcsdsegpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796516.2259345-1192-117020054048278/AnsiballZ_systemd.py'
Nov 22 07:28:36 compute-0 sudo[139917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:36 compute-0 python3.9[139919]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:36 compute-0 sudo[139917]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:28:37.293 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:28:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:28:37.294 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:28:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:28:37.294 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:28:37 compute-0 sudo[140072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvacsnucwhctmsdkmdmusklnbmpioigj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796517.119862-1192-70551220420979/AnsiballZ_systemd.py'
Nov 22 07:28:37 compute-0 sudo[140072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:37 compute-0 python3.9[140074]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:38 compute-0 sudo[140072]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:38 compute-0 sudo[140227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfbjywhympmalhneeujkecwdjkfpfbhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796518.1859725-1192-79830926790578/AnsiballZ_systemd.py'
Nov 22 07:28:38 compute-0 sudo[140227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:38 compute-0 python3.9[140229]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:38 compute-0 sudo[140227]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:39 compute-0 sudo[140382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxdxttzcwymiomobalafhrjgugewbzpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796519.0383918-1192-63928510785673/AnsiballZ_systemd.py'
Nov 22 07:28:39 compute-0 sudo[140382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:39 compute-0 python3.9[140384]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:39 compute-0 sudo[140382]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:39 compute-0 podman[140386]: 2025-11-22 07:28:39.779810264 +0000 UTC m=+0.067709147 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:28:40 compute-0 sudo[140556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmffauyoiybeognagswrkxtxxekvsvvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796519.887335-1192-137661459346461/AnsiballZ_systemd.py'
Nov 22 07:28:40 compute-0 sudo[140556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:40 compute-0 python3.9[140558]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:40 compute-0 sudo[140556]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:40 compute-0 sudo[140711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkqlmynqlthdlufcbibworezlndfplrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796520.666004-1192-24606847101403/AnsiballZ_systemd.py'
Nov 22 07:28:40 compute-0 sudo[140711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:41 compute-0 python3.9[140713]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:41 compute-0 sudo[140711]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:41 compute-0 sudo[140866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekcpbtwwnzcttrnyxviedjtpkbqlksps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796521.5298564-1192-246624584235553/AnsiballZ_systemd.py'
Nov 22 07:28:41 compute-0 sudo[140866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:42 compute-0 python3.9[140868]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 22 07:28:42 compute-0 sudo[140866]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:45 compute-0 sudo[141021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqejwmnhvkiugzyavzjijsihyqjlufev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796524.7346482-1498-238886317007761/AnsiballZ_file.py'
Nov 22 07:28:45 compute-0 sudo[141021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:45 compute-0 python3.9[141023]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:28:45 compute-0 sudo[141021]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:45 compute-0 sudo[141173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrcdeulwjhuzreitrxmwyyqwlgrmtlzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796525.523323-1498-159326073689047/AnsiballZ_file.py'
Nov 22 07:28:45 compute-0 sudo[141173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:46 compute-0 python3.9[141175]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:28:46 compute-0 sudo[141173]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:47 compute-0 sudo[141325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnfmvkrqbvokqrqjgqsypxnygzwnidkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796526.690412-1498-198687829470355/AnsiballZ_file.py'
Nov 22 07:28:47 compute-0 sudo[141325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:47 compute-0 python3.9[141327]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:28:47 compute-0 sudo[141325]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:47 compute-0 sudo[141477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnbsdrwknzqixeicttzpcbmzrhpiziaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796527.5602245-1498-240463176733566/AnsiballZ_file.py'
Nov 22 07:28:47 compute-0 sudo[141477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:48 compute-0 python3.9[141479]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:28:48 compute-0 sudo[141477]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:48 compute-0 sudo[141629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myhklfztqoktuosxrbvmurssyeavpyzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796528.2943857-1498-129524406639669/AnsiballZ_file.py'
Nov 22 07:28:48 compute-0 sudo[141629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:48 compute-0 python3.9[141631]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:28:48 compute-0 sudo[141629]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:49 compute-0 sudo[141781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mprjpqgzixcgqijfgvxqokdwzoaupebv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796529.0630994-1498-52770084860375/AnsiballZ_file.py'
Nov 22 07:28:49 compute-0 sudo[141781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:49 compute-0 python3.9[141783]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:28:49 compute-0 sudo[141781]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:50 compute-0 sudo[141933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhvwgouirdzjfbhpspevyydcgijuwhzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796530.2138617-1627-43863330362921/AnsiballZ_stat.py'
Nov 22 07:28:50 compute-0 sudo[141933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:51 compute-0 python3.9[141935]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:28:51 compute-0 sudo[141933]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:52 compute-0 sudo[142058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkbglnlgoihdspvcwatdloljwicedngz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796530.2138617-1627-43863330362921/AnsiballZ_copy.py'
Nov 22 07:28:52 compute-0 sudo[142058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:52 compute-0 python3.9[142060]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796530.2138617-1627-43863330362921/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:28:52 compute-0 sudo[142058]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:52 compute-0 sudo[142210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbsmylbpocukyegkwzlcvauolghgbexa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796532.413404-1627-136400599627194/AnsiballZ_stat.py'
Nov 22 07:28:52 compute-0 sudo[142210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:52 compute-0 python3.9[142212]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:28:52 compute-0 sudo[142210]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:53 compute-0 sudo[142335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oabxtmuguhhtbbnzsrgeyfwittksgdqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796532.413404-1627-136400599627194/AnsiballZ_copy.py'
Nov 22 07:28:53 compute-0 sudo[142335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:53 compute-0 python3.9[142337]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796532.413404-1627-136400599627194/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:28:53 compute-0 sudo[142335]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:53 compute-0 sudo[142487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhenejmipskzhwrqvtqwvttzlkyobmmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796533.6397905-1627-93719545095164/AnsiballZ_stat.py'
Nov 22 07:28:53 compute-0 sudo[142487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:54 compute-0 python3.9[142489]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:28:54 compute-0 sudo[142487]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:54 compute-0 sudo[142612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhphxvvairrlwselebvkgmbpakhltooc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796533.6397905-1627-93719545095164/AnsiballZ_copy.py'
Nov 22 07:28:54 compute-0 sudo[142612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:54 compute-0 python3.9[142614]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796533.6397905-1627-93719545095164/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:28:54 compute-0 sudo[142612]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:55 compute-0 sudo[142764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqockjxysuaapgrqeepzfxpzjdwmdsyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796534.9972503-1627-126120999894891/AnsiballZ_stat.py'
Nov 22 07:28:55 compute-0 sudo[142764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:55 compute-0 python3.9[142766]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:28:55 compute-0 sudo[142764]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:56 compute-0 sudo[142889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohykdzibguzzofwjxwocisofboglhqbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796534.9972503-1627-126120999894891/AnsiballZ_copy.py'
Nov 22 07:28:56 compute-0 sudo[142889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:56 compute-0 python3.9[142891]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796534.9972503-1627-126120999894891/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:28:56 compute-0 sudo[142889]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:57 compute-0 sudo[143041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhwshyntxcbguibiccmbwsrjkfgyhjhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796536.7226505-1627-125019970731496/AnsiballZ_stat.py'
Nov 22 07:28:57 compute-0 sudo[143041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:57 compute-0 python3.9[143043]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:28:57 compute-0 sudo[143041]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:57 compute-0 sudo[143166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjvsmstwfuwsmfobxapketeielslkeeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796536.7226505-1627-125019970731496/AnsiballZ_copy.py'
Nov 22 07:28:57 compute-0 sudo[143166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:57 compute-0 python3.9[143168]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796536.7226505-1627-125019970731496/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:28:57 compute-0 sudo[143166]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:58 compute-0 sudo[143318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwvobnkqyxoynztwwhhmyepepvmcgwcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796538.0725024-1627-19674168742817/AnsiballZ_stat.py'
Nov 22 07:28:58 compute-0 sudo[143318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:58 compute-0 python3.9[143320]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:28:58 compute-0 sudo[143318]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:58 compute-0 sudo[143443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-illqpjvwwpeyernwklskreiksejxsuuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796538.0725024-1627-19674168742817/AnsiballZ_copy.py'
Nov 22 07:28:58 compute-0 sudo[143443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:59 compute-0 python3.9[143445]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796538.0725024-1627-19674168742817/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:28:59 compute-0 sudo[143443]: pam_unix(sudo:session): session closed for user root
Nov 22 07:28:59 compute-0 sudo[143595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdtvbqitxnmbcyybskxuiscvhklabnvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796539.349599-1627-260753539539731/AnsiballZ_stat.py'
Nov 22 07:28:59 compute-0 sudo[143595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:28:59 compute-0 python3.9[143597]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:28:59 compute-0 sudo[143595]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:00 compute-0 sudo[143718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiwmemoksbreowrwihwuelrgbdeulpkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796539.349599-1627-260753539539731/AnsiballZ_copy.py'
Nov 22 07:29:00 compute-0 sudo[143718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:00 compute-0 python3.9[143720]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796539.349599-1627-260753539539731/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:00 compute-0 sudo[143718]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:00 compute-0 sudo[143870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsmraikthcpyrlealhszprqelssbvuxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796540.6360426-1627-14083634293049/AnsiballZ_stat.py'
Nov 22 07:29:00 compute-0 sudo[143870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:01 compute-0 python3.9[143872]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:01 compute-0 sudo[143870]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:01 compute-0 podman[143898]: 2025-11-22 07:29:01.449301982 +0000 UTC m=+0.098515997 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 07:29:01 compute-0 sudo[144023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hszmsvgbzrzuukimvtzpunvdwxrlcgga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796540.6360426-1627-14083634293049/AnsiballZ_copy.py'
Nov 22 07:29:01 compute-0 sudo[144023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:01 compute-0 python3.9[144025]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763796540.6360426-1627-14083634293049/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:01 compute-0 sudo[144023]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:03 compute-0 sudo[144175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toikgcowppcnzsqniichkmwiiolstcmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796543.2163527-1966-237824577812148/AnsiballZ_command.py'
Nov 22 07:29:03 compute-0 sudo[144175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:03 compute-0 python3.9[144177]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 22 07:29:03 compute-0 sudo[144175]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:04 compute-0 sudo[144328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivgqtjhhmybpxtldwnvsdelvxbhrbeuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796544.1277761-1993-68955248044266/AnsiballZ_file.py'
Nov 22 07:29:04 compute-0 sudo[144328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:04 compute-0 python3.9[144330]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:04 compute-0 sudo[144328]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:05 compute-0 sudo[144480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjvrzcfampaebecqxcbyzjhgjuqrazqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796544.7781382-1993-79259673754196/AnsiballZ_file.py'
Nov 22 07:29:05 compute-0 sudo[144480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:05 compute-0 python3.9[144482]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:05 compute-0 sudo[144480]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:05 compute-0 sudo[144632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuryiooebxmzupgrtwiyfygiuhrrnpne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796545.4067328-1993-278605401296028/AnsiballZ_file.py'
Nov 22 07:29:05 compute-0 sudo[144632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:05 compute-0 python3.9[144634]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:05 compute-0 sudo[144632]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:06 compute-0 sudo[144784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fprjrqyfmmmmtbzhkywvyuohphyhjorh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796546.454675-1993-125892162758661/AnsiballZ_file.py'
Nov 22 07:29:06 compute-0 sudo[144784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:07 compute-0 python3.9[144786]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:07 compute-0 sudo[144784]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:07 compute-0 sudo[144936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivmsdywdacyrgqzkqqxrdjucblvfnjbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796547.162494-1993-98356267780216/AnsiballZ_file.py'
Nov 22 07:29:07 compute-0 sudo[144936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:07 compute-0 python3.9[144938]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:07 compute-0 sudo[144936]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:08 compute-0 sudo[145088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adevdomzyahbhpdmrhpmhafsemctpjkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796547.851378-1993-103552429253239/AnsiballZ_file.py'
Nov 22 07:29:08 compute-0 sudo[145088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:08 compute-0 python3.9[145090]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:08 compute-0 sudo[145088]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:08 compute-0 sudo[145240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ectavklvupccyyuxsatsltycwusykmmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796548.4982045-1993-48082984532487/AnsiballZ_file.py'
Nov 22 07:29:08 compute-0 sudo[145240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:08 compute-0 python3.9[145242]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:08 compute-0 sudo[145240]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:09 compute-0 sudo[145392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glmahyqtvwxnnctkjkimunsgtcultopr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796549.1064043-1993-239588151096983/AnsiballZ_file.py'
Nov 22 07:29:09 compute-0 sudo[145392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:09 compute-0 python3.9[145394]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:09 compute-0 sudo[145392]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:10 compute-0 sudo[145560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhieqmfgueuylobswkuzhusjxywjnrfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796549.831221-1993-11517624385783/AnsiballZ_file.py'
Nov 22 07:29:10 compute-0 sudo[145560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:10 compute-0 podman[145518]: 2025-11-22 07:29:10.159063154 +0000 UTC m=+0.054385671 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 07:29:10 compute-0 python3.9[145565]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:10 compute-0 sudo[145560]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:10 compute-0 sudo[145716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xullphxjqtdfdhxaqnaxqkbhyunijufc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796550.584863-1993-99779098423481/AnsiballZ_file.py'
Nov 22 07:29:10 compute-0 sudo[145716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:11 compute-0 python3.9[145718]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:11 compute-0 sudo[145716]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:11 compute-0 sudo[145868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnnpurzbnxcidmtcomznulmlfstxdzsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796551.2817929-1993-72413858663300/AnsiballZ_file.py'
Nov 22 07:29:11 compute-0 sudo[145868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:11 compute-0 python3.9[145870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:11 compute-0 sudo[145868]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:12 compute-0 sudo[146020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoevnzuyaigyxalftzoyxpojiquyliso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796552.0826323-1993-274227085967613/AnsiballZ_file.py'
Nov 22 07:29:12 compute-0 sudo[146020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:12 compute-0 python3.9[146022]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:12 compute-0 sudo[146020]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:13 compute-0 sudo[146172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tniehmhxfrkwhzvgcvachliwehfrrujn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796552.8485868-1993-115970761316135/AnsiballZ_file.py'
Nov 22 07:29:13 compute-0 sudo[146172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:13 compute-0 python3.9[146174]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:13 compute-0 sudo[146172]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:13 compute-0 sudo[146324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feyaylqhzafwhrxfveyzwntglkzmwaqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796553.506894-1993-108146626479216/AnsiballZ_file.py'
Nov 22 07:29:13 compute-0 sudo[146324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:13 compute-0 python3.9[146326]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:13 compute-0 sudo[146324]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:14 compute-0 sudo[146476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zumjneaqpvhgqgdrkfoiyymsglxuwoml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796554.561998-2290-178501171725892/AnsiballZ_stat.py'
Nov 22 07:29:14 compute-0 sudo[146476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:15 compute-0 python3.9[146478]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:15 compute-0 sudo[146476]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:15 compute-0 sudo[146599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njcwtttucabtajxzpewrclhptsmucvlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796554.561998-2290-178501171725892/AnsiballZ_copy.py'
Nov 22 07:29:15 compute-0 sudo[146599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:15 compute-0 python3.9[146601]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796554.561998-2290-178501171725892/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:15 compute-0 sudo[146599]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:16 compute-0 sudo[146751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxjtwajtfhzuwvkedwnarqjcizwcisjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796555.8739765-2290-58289401387805/AnsiballZ_stat.py'
Nov 22 07:29:16 compute-0 sudo[146751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:16 compute-0 python3.9[146753]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:16 compute-0 sudo[146751]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:16 compute-0 sudo[146874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nedpluirvzpdknjztjdgxbpghugeiudt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796555.8739765-2290-58289401387805/AnsiballZ_copy.py'
Nov 22 07:29:16 compute-0 sudo[146874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:17 compute-0 python3.9[146876]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796555.8739765-2290-58289401387805/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:17 compute-0 sudo[146874]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:17 compute-0 sudo[147026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysjuofcfwleccqvxxkqheravzloiwxpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796557.2742708-2290-240495329976058/AnsiballZ_stat.py'
Nov 22 07:29:17 compute-0 sudo[147026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:17 compute-0 python3.9[147028]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:17 compute-0 sudo[147026]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:18 compute-0 sudo[147149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrlyxtgxoxqqgeplibegsfnurwdiuloi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796557.2742708-2290-240495329976058/AnsiballZ_copy.py'
Nov 22 07:29:18 compute-0 sudo[147149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:18 compute-0 python3.9[147151]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796557.2742708-2290-240495329976058/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:18 compute-0 sudo[147149]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:18 compute-0 sudo[147301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrohkexzyuecabjjgqwftkfginfclpef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796558.520726-2290-170275118275304/AnsiballZ_stat.py'
Nov 22 07:29:18 compute-0 sudo[147301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:19 compute-0 python3.9[147303]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:19 compute-0 sudo[147301]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:19 compute-0 sudo[147424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnhgqihscqgfticiktvxbzeanmnrewna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796558.520726-2290-170275118275304/AnsiballZ_copy.py'
Nov 22 07:29:19 compute-0 sudo[147424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:19 compute-0 python3.9[147426]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796558.520726-2290-170275118275304/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:19 compute-0 sudo[147424]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:20 compute-0 sudo[147576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clmnftfqknhcxxoraaviwfklbyudoesg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796559.7529755-2290-275961105388035/AnsiballZ_stat.py'
Nov 22 07:29:20 compute-0 sudo[147576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:20 compute-0 python3.9[147578]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:20 compute-0 sudo[147576]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:20 compute-0 sudo[147699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yabdemkzyyzuuxoadpobozrxqbszygtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796559.7529755-2290-275961105388035/AnsiballZ_copy.py'
Nov 22 07:29:20 compute-0 sudo[147699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:20 compute-0 python3.9[147701]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796559.7529755-2290-275961105388035/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:20 compute-0 sudo[147699]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:21 compute-0 sudo[147851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liulvgftukmihmrrostssoxoeuldixvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796560.9386547-2290-55426005410816/AnsiballZ_stat.py'
Nov 22 07:29:21 compute-0 sudo[147851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:21 compute-0 python3.9[147853]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:21 compute-0 sudo[147851]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:22 compute-0 sudo[147974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaixaafvjzxduyargoxlzzpzzondbwtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796560.9386547-2290-55426005410816/AnsiballZ_copy.py'
Nov 22 07:29:22 compute-0 sudo[147974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:22 compute-0 python3.9[147976]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796560.9386547-2290-55426005410816/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:22 compute-0 sudo[147974]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:22 compute-0 sudo[148126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kycznrseojsxfbwrhkbpdhjlrgnibsww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796562.4956303-2290-190403964555613/AnsiballZ_stat.py'
Nov 22 07:29:22 compute-0 sudo[148126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:22 compute-0 python3.9[148128]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:22 compute-0 sudo[148126]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:23 compute-0 sudo[148249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmlkiasqlrilruvmyhgorbidjbepdlun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796562.4956303-2290-190403964555613/AnsiballZ_copy.py'
Nov 22 07:29:23 compute-0 sudo[148249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:23 compute-0 python3.9[148251]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796562.4956303-2290-190403964555613/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:23 compute-0 sudo[148249]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:24 compute-0 sudo[148401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbdizapqqusjawxwvjriosjepcgdjtij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796563.7234075-2290-119672121292085/AnsiballZ_stat.py'
Nov 22 07:29:24 compute-0 sudo[148401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:24 compute-0 python3.9[148403]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:24 compute-0 sudo[148401]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:24 compute-0 sudo[148524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcinaxxwjkrukmjyxjrbyehchjxnjakd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796563.7234075-2290-119672121292085/AnsiballZ_copy.py'
Nov 22 07:29:24 compute-0 sudo[148524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:24 compute-0 python3.9[148526]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796563.7234075-2290-119672121292085/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:24 compute-0 sudo[148524]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:25 compute-0 sudo[148676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmqebjdczvlgxuerdxjwtimqbjxeuysz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796564.9988453-2290-19534492245985/AnsiballZ_stat.py'
Nov 22 07:29:25 compute-0 sudo[148676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:25 compute-0 python3.9[148678]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:25 compute-0 sudo[148676]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:25 compute-0 sudo[148799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlhqeccbaexxkbualvytulgsngkrctyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796564.9988453-2290-19534492245985/AnsiballZ_copy.py'
Nov 22 07:29:25 compute-0 sudo[148799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:26 compute-0 python3.9[148801]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796564.9988453-2290-19534492245985/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:26 compute-0 sudo[148799]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:26 compute-0 sudo[148951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpwdxwiyjblyffqrcvgykrgccylvmpmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796566.2133327-2290-113248178651247/AnsiballZ_stat.py'
Nov 22 07:29:26 compute-0 sudo[148951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:26 compute-0 python3.9[148953]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:26 compute-0 sudo[148951]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:27 compute-0 sudo[149074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruwrumptmfrvaorzjdqpjzqfufflrpsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796566.2133327-2290-113248178651247/AnsiballZ_copy.py'
Nov 22 07:29:27 compute-0 sudo[149074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:27 compute-0 python3.9[149076]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796566.2133327-2290-113248178651247/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:27 compute-0 sudo[149074]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:27 compute-0 sudo[149226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqiywbwaauuhrcilagqkoooohfdbwdby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796567.4823947-2290-46155402430275/AnsiballZ_stat.py'
Nov 22 07:29:27 compute-0 sudo[149226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:27 compute-0 python3.9[149228]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:28 compute-0 sudo[149226]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:28 compute-0 sudo[149349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqmqyiqxitajhzorcnkufgizttbwziud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796567.4823947-2290-46155402430275/AnsiballZ_copy.py'
Nov 22 07:29:28 compute-0 sudo[149349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:28 compute-0 python3.9[149351]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796567.4823947-2290-46155402430275/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:28 compute-0 sudo[149349]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:29 compute-0 sudo[149501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbzqlewbttrjvwgoryrtgonjtevwjamp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796568.7506807-2290-193874810979754/AnsiballZ_stat.py'
Nov 22 07:29:29 compute-0 sudo[149501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:29 compute-0 python3.9[149503]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:29 compute-0 sudo[149501]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:29 compute-0 sudo[149624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rztmogmxorgyaswyrogdblceuoejjksb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796568.7506807-2290-193874810979754/AnsiballZ_copy.py'
Nov 22 07:29:29 compute-0 sudo[149624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:29 compute-0 python3.9[149626]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796568.7506807-2290-193874810979754/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:29 compute-0 sudo[149624]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:30 compute-0 sudo[149776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thbjjaxaohxyfgqtvjjrhcqcfsffzdcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796569.9790714-2290-273298733165297/AnsiballZ_stat.py'
Nov 22 07:29:30 compute-0 sudo[149776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:30 compute-0 python3.9[149778]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:30 compute-0 sudo[149776]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:30 compute-0 sudo[149899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsrsbptktgkiacaqkkaiepdgxivgtgws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796569.9790714-2290-273298733165297/AnsiballZ_copy.py'
Nov 22 07:29:30 compute-0 sudo[149899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:31 compute-0 python3.9[149901]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796569.9790714-2290-273298733165297/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:31 compute-0 sudo[149899]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:31 compute-0 sudo[150063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psqpaeamzajuizhqrfxjmclxmthungpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796571.252102-2290-241536017254696/AnsiballZ_stat.py'
Nov 22 07:29:31 compute-0 sudo[150063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:31 compute-0 podman[150025]: 2025-11-22 07:29:31.647160197 +0000 UTC m=+0.140110462 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:29:31 compute-0 python3.9[150072]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:31 compute-0 sudo[150063]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:32 compute-0 sudo[150199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqvqfgqidldedovhqxbklbpxksypqbbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796571.252102-2290-241536017254696/AnsiballZ_copy.py'
Nov 22 07:29:32 compute-0 sudo[150199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:32 compute-0 python3.9[150201]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796571.252102-2290-241536017254696/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:32 compute-0 sudo[150199]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:34 compute-0 python3.9[150351]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:29:36 compute-0 sudo[150504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgkuurntaxrcvotlvnvyuyzprssvenqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796575.8704867-2908-192458664628256/AnsiballZ_seboolean.py'
Nov 22 07:29:36 compute-0 sudo[150504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:36 compute-0 python3.9[150506]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 22 07:29:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:29:37.293 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:29:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:29:37.295 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:29:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:29:37.295 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:29:37 compute-0 sudo[150504]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:38 compute-0 sudo[150660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxpugegkmaajgwbbijkqahlcxdzrhikl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796578.2391715-2932-270663121952558/AnsiballZ_copy.py'
Nov 22 07:29:38 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 22 07:29:38 compute-0 sudo[150660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:38 compute-0 python3.9[150662]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:38 compute-0 sudo[150660]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:39 compute-0 sudo[150812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkuxmrnmrinlizjtesvrvzcmkoheifzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796578.9177725-2932-97339120186846/AnsiballZ_copy.py'
Nov 22 07:29:39 compute-0 sudo[150812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:39 compute-0 python3.9[150814]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:39 compute-0 sudo[150812]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:39 compute-0 sudo[150964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfvqwtsewcnhpllboxzetmiwvapjjwgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796579.63125-2932-71533409305629/AnsiballZ_copy.py'
Nov 22 07:29:39 compute-0 sudo[150964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:40 compute-0 python3.9[150966]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:40 compute-0 sudo[150964]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:40 compute-0 podman[151066]: 2025-11-22 07:29:40.463973413 +0000 UTC m=+0.091184226 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:29:40 compute-0 sudo[151135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjcjckmlwrrgdnndyjwaqlfyegibbvlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796580.2150764-2932-171401764504839/AnsiballZ_copy.py'
Nov 22 07:29:40 compute-0 sudo[151135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:40 compute-0 python3.9[151138]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:40 compute-0 sudo[151135]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:41 compute-0 sudo[151288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvlddjdfzkogvicwezgptepndqheajye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796580.866463-2932-426860919352/AnsiballZ_copy.py'
Nov 22 07:29:41 compute-0 sudo[151288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:41 compute-0 python3.9[151290]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:41 compute-0 sudo[151288]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:42 compute-0 sudo[151440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noreyccabcbiwswldswjhsogobqhizcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796581.9347925-3040-24439482314134/AnsiballZ_copy.py'
Nov 22 07:29:42 compute-0 sudo[151440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:42 compute-0 python3.9[151442]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:42 compute-0 sudo[151440]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:42 compute-0 sudo[151592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vizzlykmacvxchhezmtbdsqsvqdspoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796582.617467-3040-92360924874139/AnsiballZ_copy.py'
Nov 22 07:29:42 compute-0 sudo[151592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:43 compute-0 python3.9[151594]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:43 compute-0 sudo[151592]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:43 compute-0 sudo[151744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etuufgszazfuusdzwbylxhcqjxiqvejj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796583.2285216-3040-263577757345481/AnsiballZ_copy.py'
Nov 22 07:29:43 compute-0 sudo[151744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:43 compute-0 python3.9[151746]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:43 compute-0 sudo[151744]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:44 compute-0 sudo[151896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsusciiytzkyzeaxeikvookbjzpofsmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796583.8916314-3040-182239007050211/AnsiballZ_copy.py'
Nov 22 07:29:44 compute-0 sudo[151896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:44 compute-0 python3.9[151898]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:44 compute-0 sudo[151896]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:44 compute-0 sudo[152048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmnkcotjbmjnxuogtexgndmizeyhspxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796584.474607-3040-52331806833912/AnsiballZ_copy.py'
Nov 22 07:29:44 compute-0 sudo[152048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:45 compute-0 python3.9[152050]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:45 compute-0 sudo[152048]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:45 compute-0 sudo[152200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwgqnvsttsfunznpxdvcypygrpgrvbph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796585.3229702-3148-236565790928255/AnsiballZ_systemd.py'
Nov 22 07:29:45 compute-0 sudo[152200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:45 compute-0 python3.9[152202]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:29:45 compute-0 systemd[1]: Reloading.
Nov 22 07:29:46 compute-0 systemd-sysv-generator[152233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:29:46 compute-0 systemd-rc-local-generator[152230]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:29:46 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Nov 22 07:29:46 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Nov 22 07:29:46 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 22 07:29:46 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 22 07:29:46 compute-0 systemd[1]: Starting libvirt logging daemon...
Nov 22 07:29:46 compute-0 systemd[1]: Started libvirt logging daemon.
Nov 22 07:29:46 compute-0 sudo[152200]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:47 compute-0 sudo[152394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxmnlshlstapejqxtojahjfifixyuhqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796586.5462127-3148-61289888550188/AnsiballZ_systemd.py'
Nov 22 07:29:47 compute-0 sudo[152394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:47 compute-0 python3.9[152396]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:29:47 compute-0 systemd[1]: Reloading.
Nov 22 07:29:47 compute-0 systemd-sysv-generator[152425]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:29:47 compute-0 systemd-rc-local-generator[152421]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:29:48 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 22 07:29:48 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 22 07:29:48 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 22 07:29:48 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 22 07:29:48 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 22 07:29:48 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 22 07:29:48 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 22 07:29:48 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 22 07:29:48 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 22 07:29:48 compute-0 sudo[152394]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:48 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 22 07:29:48 compute-0 sudo[152610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieugovyttksuzdrqsvzwfutctchqqfiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796588.4165525-3148-64624599335450/AnsiballZ_systemd.py'
Nov 22 07:29:48 compute-0 sudo[152610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:48 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 22 07:29:48 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 22 07:29:49 compute-0 python3.9[152612]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:29:49 compute-0 systemd[1]: Reloading.
Nov 22 07:29:49 compute-0 systemd-rc-local-generator[152648]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:29:49 compute-0 systemd-sysv-generator[152651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:29:49 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 22 07:29:49 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 22 07:29:49 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 22 07:29:49 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 22 07:29:49 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 22 07:29:49 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 22 07:29:49 compute-0 sudo[152610]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:49 compute-0 setroubleshoot[152432]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 143aaab8-c3b7-4f8e-b526-17d3b3903ddb
Nov 22 07:29:49 compute-0 setroubleshoot[152432]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 22 07:29:50 compute-0 sudo[152830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqfymziiabyxfmycdczapvgcijdbkxwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796589.8092303-3148-279730714417505/AnsiballZ_systemd.py'
Nov 22 07:29:50 compute-0 sudo[152830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:50 compute-0 python3.9[152832]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:29:50 compute-0 systemd[1]: Reloading.
Nov 22 07:29:50 compute-0 systemd-rc-local-generator[152856]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:29:50 compute-0 systemd-sysv-generator[152863]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:29:50 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Nov 22 07:29:50 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 22 07:29:50 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 22 07:29:50 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 22 07:29:50 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 22 07:29:50 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 22 07:29:50 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 22 07:29:50 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 22 07:29:50 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 22 07:29:50 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 22 07:29:50 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 22 07:29:50 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 22 07:29:50 compute-0 sudo[152830]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:51 compute-0 sudo[153045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diaateonzbfoveosmgjftcqxeavvuxtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796590.9208045-3148-1327137276956/AnsiballZ_systemd.py'
Nov 22 07:29:51 compute-0 sudo[153045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:51 compute-0 python3.9[153047]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:29:51 compute-0 systemd[1]: Reloading.
Nov 22 07:29:51 compute-0 systemd-sysv-generator[153078]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:29:51 compute-0 systemd-rc-local-generator[153075]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:29:51 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Nov 22 07:29:51 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Nov 22 07:29:51 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 22 07:29:51 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 22 07:29:51 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 22 07:29:51 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 22 07:29:51 compute-0 systemd[1]: Starting libvirt secret daemon...
Nov 22 07:29:52 compute-0 systemd[1]: Started libvirt secret daemon.
Nov 22 07:29:52 compute-0 sudo[153045]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:53 compute-0 sudo[153257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zosywxqfxkwizdmukkkyfdcahipcwrnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796592.9778838-3259-230274162430442/AnsiballZ_file.py'
Nov 22 07:29:53 compute-0 sudo[153257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:53 compute-0 python3.9[153259]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:53 compute-0 sudo[153257]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:53 compute-0 sudo[153409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzxpphnbmbjvoliplbmwptwceftopybz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796593.651028-3283-250004874369896/AnsiballZ_find.py'
Nov 22 07:29:53 compute-0 sudo[153409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:54 compute-0 python3.9[153411]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 07:29:54 compute-0 sudo[153409]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:55 compute-0 sudo[153561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwebsjzdtrxewuhstrbjnibkxdkwobdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796594.7089567-3325-196326217540150/AnsiballZ_stat.py'
Nov 22 07:29:55 compute-0 sudo[153561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:55 compute-0 python3.9[153563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:55 compute-0 sudo[153561]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:56 compute-0 sudo[153684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcwkaxqfmuuppcagdnccrjubzihnmuyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796594.7089567-3325-196326217540150/AnsiballZ_copy.py'
Nov 22 07:29:56 compute-0 sudo[153684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:56 compute-0 python3.9[153686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796594.7089567-3325-196326217540150/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:56 compute-0 sudo[153684]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:57 compute-0 sudo[153836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xguynbvtrwuqcktcqcepaqkypnlwmpbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796597.034692-3373-166485643739241/AnsiballZ_file.py'
Nov 22 07:29:57 compute-0 sudo[153836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:57 compute-0 python3.9[153838]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:57 compute-0 sudo[153836]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:58 compute-0 sudo[153988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmfyjhptpfmfyibmgdcpfmxgakiugyts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796597.7204738-3397-133133152003527/AnsiballZ_stat.py'
Nov 22 07:29:58 compute-0 sudo[153988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:58 compute-0 python3.9[153990]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:58 compute-0 sudo[153988]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:58 compute-0 sudo[154066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrmnkbrfhgptbhnnumrvgypsgzcepobi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796597.7204738-3397-133133152003527/AnsiballZ_file.py'
Nov 22 07:29:58 compute-0 sudo[154066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:58 compute-0 python3.9[154068]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:58 compute-0 sudo[154066]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:59 compute-0 sudo[154218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atbjhpfvzzpewqmzkgudwjhyrhrtonak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796599.0000014-3433-195361803263447/AnsiballZ_stat.py'
Nov 22 07:29:59 compute-0 sudo[154218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:59 compute-0 python3.9[154220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:29:59 compute-0 sudo[154218]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:59 compute-0 sudo[154296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvfeidkbyvhpxvioampifuficxdgjxbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796599.0000014-3433-195361803263447/AnsiballZ_file.py'
Nov 22 07:29:59 compute-0 sudo[154296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:29:59 compute-0 python3.9[154298]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.axh2txe8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:29:59 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 22 07:29:59 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.032s CPU time.
Nov 22 07:29:59 compute-0 sudo[154296]: pam_unix(sudo:session): session closed for user root
Nov 22 07:29:59 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 22 07:30:00 compute-0 sudo[154448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rodfpahtizzxlylnjobjmetnypggjnhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796600.1772738-3469-181932701598188/AnsiballZ_stat.py'
Nov 22 07:30:00 compute-0 sudo[154448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:00 compute-0 python3.9[154450]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:00 compute-0 sudo[154448]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:01 compute-0 sudo[154526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upvwgtuxkzwotxsgujcyskyyhewvlqal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796600.1772738-3469-181932701598188/AnsiballZ_file.py'
Nov 22 07:30:01 compute-0 sudo[154526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:01 compute-0 python3.9[154528]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:01 compute-0 sudo[154526]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:01 compute-0 sudo[154689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eamvkmqbjrmtsdkitvkbctnxxroshggm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796601.6099136-3508-199655879292935/AnsiballZ_command.py'
Nov 22 07:30:01 compute-0 sudo[154689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:01 compute-0 podman[154652]: 2025-11-22 07:30:01.970174552 +0000 UTC m=+0.090616191 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 07:30:02 compute-0 python3.9[154696]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:30:02 compute-0 sudo[154689]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:02 compute-0 sudo[154856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sszqdfvhinoetgjvtyebvlojhcjrowqt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796602.3236709-3532-140315816081523/AnsiballZ_edpm_nftables_from_files.py'
Nov 22 07:30:02 compute-0 sudo[154856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:03 compute-0 python3[154858]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 07:30:03 compute-0 sudo[154856]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:03 compute-0 sudo[155008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggeemmwujfnhhwsfanxraszrvsefwxif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796603.2656543-3556-197109355482205/AnsiballZ_stat.py'
Nov 22 07:30:03 compute-0 sudo[155008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:03 compute-0 python3.9[155010]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:03 compute-0 sudo[155008]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:04 compute-0 sudo[155086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxghpggoekkopbvbsxmvndkletwsdgqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796603.2656543-3556-197109355482205/AnsiballZ_file.py'
Nov 22 07:30:04 compute-0 sudo[155086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:04 compute-0 python3.9[155088]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:04 compute-0 sudo[155086]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:05 compute-0 sudo[155238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuspyzowejxykazfztesiamhpxckbpxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796605.4561653-3592-75832630283915/AnsiballZ_stat.py'
Nov 22 07:30:05 compute-0 sudo[155238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:06 compute-0 python3.9[155240]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:06 compute-0 sudo[155238]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:06 compute-0 sudo[155316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alnnjjvvodzavapiudqgvgxuiiiwjebj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796605.4561653-3592-75832630283915/AnsiballZ_file.py'
Nov 22 07:30:06 compute-0 sudo[155316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:06 compute-0 python3.9[155318]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:06 compute-0 sudo[155316]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:07 compute-0 sudo[155469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibjkvvdbbacchqrxoiyzhrgefweeenyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796607.0717444-3628-218068424831342/AnsiballZ_stat.py'
Nov 22 07:30:07 compute-0 sudo[155469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:07 compute-0 python3.9[155471]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:07 compute-0 sudo[155469]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:07 compute-0 sudo[155547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piytjteuwvyjpfniqzzxqnsuchcnkxxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796607.0717444-3628-218068424831342/AnsiballZ_file.py'
Nov 22 07:30:07 compute-0 sudo[155547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:08 compute-0 python3.9[155549]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:08 compute-0 sudo[155547]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:08 compute-0 sudo[155699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfhvywybwopncwfxpfpdshtfxizosdnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796608.2871823-3664-272484886100541/AnsiballZ_stat.py'
Nov 22 07:30:08 compute-0 sudo[155699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:08 compute-0 python3.9[155701]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:08 compute-0 sudo[155699]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:08 compute-0 sudo[155777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsmivxsdyfhfokziprwngqzfbcwurdjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796608.2871823-3664-272484886100541/AnsiballZ_file.py'
Nov 22 07:30:09 compute-0 sudo[155777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:09 compute-0 python3.9[155779]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:09 compute-0 sudo[155777]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:09 compute-0 sudo[155929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxxdwdndwvbljqvylwjlboaeewlhqyim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796609.5127196-3700-80594868520296/AnsiballZ_stat.py'
Nov 22 07:30:09 compute-0 sudo[155929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:10 compute-0 python3.9[155931]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:10 compute-0 sudo[155929]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:10 compute-0 sudo[156054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lotdubwguzjgixjgrfdrbfbmrmxphvjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796609.5127196-3700-80594868520296/AnsiballZ_copy.py'
Nov 22 07:30:10 compute-0 sudo[156054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:10 compute-0 podman[156056]: 2025-11-22 07:30:10.575181285 +0000 UTC m=+0.049639826 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:30:10 compute-0 python3.9[156057]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763796609.5127196-3700-80594868520296/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:10 compute-0 sudo[156054]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:11 compute-0 sudo[156226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sezclkeayabimmseuwnlxadovfqcwnat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796610.91643-3745-51073229069927/AnsiballZ_file.py'
Nov 22 07:30:11 compute-0 sudo[156226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:11 compute-0 python3.9[156228]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:11 compute-0 sudo[156226]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:12 compute-0 sudo[156378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukpkfrdjlmlnhrnkuupxhiopvwqpzjuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796611.805732-3769-212946573596633/AnsiballZ_command.py'
Nov 22 07:30:12 compute-0 sudo[156378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:12 compute-0 python3.9[156380]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:30:12 compute-0 sudo[156378]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:13 compute-0 sudo[156533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epyeuqejghxakgsqedscmeootnisctsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796612.6106627-3793-163391988165018/AnsiballZ_blockinfile.py'
Nov 22 07:30:13 compute-0 sudo[156533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:13 compute-0 python3.9[156535]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:13 compute-0 sudo[156533]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:14 compute-0 sudo[156685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywwkjqekoanzvtkhwfjehfaderixpncm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796613.759746-3820-40247091248121/AnsiballZ_command.py'
Nov 22 07:30:14 compute-0 sudo[156685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:14 compute-0 python3.9[156687]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:30:14 compute-0 sudo[156685]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:14 compute-0 sudo[156838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqpwhttsskykwmdbxbiqeuvyhdpoczti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796614.4729366-3844-42616676748178/AnsiballZ_stat.py'
Nov 22 07:30:14 compute-0 sudo[156838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:14 compute-0 python3.9[156840]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:30:14 compute-0 sudo[156838]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:15 compute-0 sudo[156992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tldicgaxapxtztgtghxdqcahebpyuenv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796615.3093164-3868-95047437882619/AnsiballZ_command.py'
Nov 22 07:30:15 compute-0 sudo[156992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:15 compute-0 python3.9[156994]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:30:15 compute-0 sudo[156992]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:16 compute-0 sudo[157147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-initcszckpkjxtydnbauvzmscfsyskic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796616.0497868-3892-160252168703456/AnsiballZ_file.py'
Nov 22 07:30:16 compute-0 sudo[157147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:16 compute-0 python3.9[157149]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:16 compute-0 sudo[157147]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:17 compute-0 sudo[157299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfnehoavzscnklcktcxwtzqyydlbynvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796616.7779558-3916-129742015282386/AnsiballZ_stat.py'
Nov 22 07:30:17 compute-0 sudo[157299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:17 compute-0 python3.9[157301]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:17 compute-0 sudo[157299]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:17 compute-0 sudo[157422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvkgdbballgrgcgeqisbvgvblhegohdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796616.7779558-3916-129742015282386/AnsiballZ_copy.py'
Nov 22 07:30:17 compute-0 sudo[157422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:17 compute-0 python3.9[157424]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796616.7779558-3916-129742015282386/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:17 compute-0 sudo[157422]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:18 compute-0 sudo[157574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpphtzrzsyejokggnafgvzcnrvfstsji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796618.3138566-3961-238133984800626/AnsiballZ_stat.py'
Nov 22 07:30:18 compute-0 sudo[157574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:18 compute-0 python3.9[157576]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:18 compute-0 sudo[157574]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:19 compute-0 sudo[157697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbipygzjftzpwnmwzuvfkkwvqopzccwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796618.3138566-3961-238133984800626/AnsiballZ_copy.py'
Nov 22 07:30:19 compute-0 sudo[157697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:19 compute-0 python3.9[157699]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796618.3138566-3961-238133984800626/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:19 compute-0 sudo[157697]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:19 compute-0 sudo[157849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etoygglfwpddxstqskefbcxorspuzbmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796619.6325006-4006-115328281317235/AnsiballZ_stat.py'
Nov 22 07:30:19 compute-0 sudo[157849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:20 compute-0 python3.9[157851]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:20 compute-0 sudo[157849]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:20 compute-0 sudo[157972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idrdvzjzdwxcrnrbaldzfbmylftkqqsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796619.6325006-4006-115328281317235/AnsiballZ_copy.py'
Nov 22 07:30:20 compute-0 sudo[157972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:20 compute-0 python3.9[157974]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796619.6325006-4006-115328281317235/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:20 compute-0 sudo[157972]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:21 compute-0 sudo[158124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lccdnvcoametvyhxfaurvnvnhsxtphne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796620.923408-4051-172312855117208/AnsiballZ_systemd.py'
Nov 22 07:30:21 compute-0 sudo[158124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:21 compute-0 python3.9[158126]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:30:21 compute-0 systemd[1]: Reloading.
Nov 22 07:30:21 compute-0 systemd-rc-local-generator[158154]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:30:21 compute-0 systemd-sysv-generator[158158]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:30:21 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Nov 22 07:30:21 compute-0 sudo[158124]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:22 compute-0 sudo[158315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thovmuxspwozabfrycfkpjcnzhekimhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796622.1393547-4075-211796955607468/AnsiballZ_systemd.py'
Nov 22 07:30:22 compute-0 sudo[158315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:22 compute-0 python3.9[158317]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 22 07:30:22 compute-0 systemd[1]: Reloading.
Nov 22 07:30:22 compute-0 systemd-rc-local-generator[158344]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:30:22 compute-0 systemd-sysv-generator[158347]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:30:23 compute-0 systemd[1]: Reloading.
Nov 22 07:30:23 compute-0 systemd-rc-local-generator[158381]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:30:23 compute-0 systemd-sysv-generator[158386]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:30:23 compute-0 sudo[158315]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:23 compute-0 sshd-session[103926]: Connection closed by 192.168.122.30 port 49888
Nov 22 07:30:23 compute-0 sshd-session[103923]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:30:24 compute-0 systemd-logind[821]: Session 22 logged out. Waiting for processes to exit.
Nov 22 07:30:24 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Nov 22 07:30:24 compute-0 systemd[1]: session-22.scope: Consumed 3min 14.083s CPU time.
Nov 22 07:30:24 compute-0 systemd-logind[821]: Removed session 22.
Nov 22 07:30:29 compute-0 sshd-session[158414]: Accepted publickey for zuul from 192.168.122.30 port 53640 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:30:29 compute-0 systemd-logind[821]: New session 23 of user zuul.
Nov 22 07:30:29 compute-0 systemd[1]: Started Session 23 of User zuul.
Nov 22 07:30:29 compute-0 sshd-session[158414]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:30:30 compute-0 python3.9[158567]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:30:31 compute-0 python3.9[158721]: ansible-ansible.builtin.service_facts Invoked
Nov 22 07:30:31 compute-0 network[158738]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 07:30:31 compute-0 network[158739]: 'network-scripts' will be removed from distribution in near future.
Nov 22 07:30:31 compute-0 network[158740]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 07:30:32 compute-0 podman[158746]: 2025-11-22 07:30:32.866336123 +0000 UTC m=+0.092338876 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:30:36 compute-0 sudo[159034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esapndgryrsjqdovgazbavmrbdtqjsej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796636.2833784-106-205969346400375/AnsiballZ_setup.py'
Nov 22 07:30:36 compute-0 sudo[159034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:36 compute-0 python3.9[159036]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 22 07:30:37 compute-0 sudo[159034]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:30:37.295 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:30:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:30:37.296 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:30:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:30:37.296 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:30:37 compute-0 sudo[159118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikwzztdoybbcibcakfnlxkfykbmytmpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796636.2833784-106-205969346400375/AnsiballZ_dnf.py'
Nov 22 07:30:37 compute-0 sudo[159118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:37 compute-0 python3.9[159120]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:30:41 compute-0 podman[159122]: 2025-11-22 07:30:41.40914195 +0000 UTC m=+0.059014792 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 22 07:30:42 compute-0 sudo[159118]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:43 compute-0 sudo[159290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrkeeyuzifrkfzfuegwwgnuxbbwqizlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796643.4087822-142-280634691400055/AnsiballZ_stat.py'
Nov 22 07:30:43 compute-0 sudo[159290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:44 compute-0 python3.9[159292]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:30:44 compute-0 sudo[159290]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:44 compute-0 sudo[159442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uggvtphkdxuyvhiguggzuhortjsmjzcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796644.3794959-172-140202108354725/AnsiballZ_command.py'
Nov 22 07:30:44 compute-0 sudo[159442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:45 compute-0 python3.9[159444]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:30:45 compute-0 sudo[159442]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:45 compute-0 sudo[159595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpxavcfcgupkywinwjqmftycnyeyquvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796645.447406-202-35425031276075/AnsiballZ_stat.py'
Nov 22 07:30:45 compute-0 sudo[159595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:45 compute-0 python3.9[159597]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:30:45 compute-0 sudo[159595]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:46 compute-0 sudo[159747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlbspozzavfvkfsikqanpoxalnlwkure ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796646.120489-226-277061172983015/AnsiballZ_command.py'
Nov 22 07:30:46 compute-0 sudo[159747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:46 compute-0 python3.9[159749]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:30:46 compute-0 sudo[159747]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:47 compute-0 sudo[159900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzxflhprbgaoblrauofvncyadizxafxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796646.9853315-250-128249019838353/AnsiballZ_stat.py'
Nov 22 07:30:47 compute-0 sudo[159900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:47 compute-0 python3.9[159902]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:47 compute-0 sudo[159900]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:48 compute-0 sudo[160023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhaaqywmznuofsoznynactnahrfhicpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796646.9853315-250-128249019838353/AnsiballZ_copy.py'
Nov 22 07:30:48 compute-0 sudo[160023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:48 compute-0 python3.9[160025]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796646.9853315-250-128249019838353/.source.iscsi _original_basename=.3g6ekjb4 follow=False checksum=f06bbc00cf53f0d9496154a104a1a30ca1a9b61d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:48 compute-0 sudo[160023]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:48 compute-0 sudo[160175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luyotlimroaopbametzcbhxdyqrstfdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796648.4187012-295-115693800312771/AnsiballZ_file.py'
Nov 22 07:30:48 compute-0 sudo[160175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:49 compute-0 python3.9[160177]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:49 compute-0 sudo[160175]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:49 compute-0 sudo[160327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byskepltuqvdygttjvfcvdjldxmdroyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796649.2891295-319-191287189694299/AnsiballZ_lineinfile.py'
Nov 22 07:30:49 compute-0 sudo[160327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:49 compute-0 python3.9[160329]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:30:49 compute-0 sudo[160327]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:49 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:30:49 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:30:49 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:30:50 compute-0 sudo[160480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siybemdccafqoabwcdhtjvzncyfpufzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796650.2527766-346-277321642219976/AnsiballZ_systemd_service.py'
Nov 22 07:30:50 compute-0 sudo[160480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:51 compute-0 python3.9[160482]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:30:51 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 22 07:30:51 compute-0 sudo[160480]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:51 compute-0 sudo[160636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxeueofwwhpswdxfnkhzkuytgpmpjaoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796651.5023787-370-263523693270533/AnsiballZ_systemd_service.py'
Nov 22 07:30:51 compute-0 sudo[160636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:52 compute-0 python3.9[160638]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:30:52 compute-0 systemd[1]: Reloading.
Nov 22 07:30:52 compute-0 systemd-rc-local-generator[160672]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:30:52 compute-0 systemd-sysv-generator[160676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:30:52 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 22 07:30:52 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 22 07:30:52 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Nov 22 07:30:52 compute-0 systemd[1]: Started Open-iSCSI.
Nov 22 07:30:52 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 22 07:30:52 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 22 07:30:52 compute-0 sudo[160636]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:53 compute-0 sudo[160838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aczhkprdpfjcffkljrllsklxwzmhjzdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796653.170492-403-57892483215888/AnsiballZ_service_facts.py'
Nov 22 07:30:53 compute-0 sudo[160838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:53 compute-0 python3.9[160840]: ansible-ansible.builtin.service_facts Invoked
Nov 22 07:30:53 compute-0 network[160857]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 07:30:53 compute-0 network[160858]: 'network-scripts' will be removed from distribution in near future.
Nov 22 07:30:53 compute-0 network[160859]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 07:30:57 compute-0 sudo[160838]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:58 compute-0 sudo[161128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjqwyubgfwtjnuafbwqujnoefrblrhjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796657.9314737-433-151665516885312/AnsiballZ_file.py'
Nov 22 07:30:58 compute-0 sudo[161128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:58 compute-0 python3.9[161130]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 07:30:58 compute-0 sudo[161128]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:59 compute-0 sudo[161280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxvumjdjfkrzesetkajncrhrafpbruas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796658.617575-457-223500092284140/AnsiballZ_modprobe.py'
Nov 22 07:30:59 compute-0 sudo[161280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:59 compute-0 python3.9[161282]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 22 07:30:59 compute-0 sudo[161280]: pam_unix(sudo:session): session closed for user root
Nov 22 07:30:59 compute-0 sudo[161436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkqzjwqdyrzaxbmmfuomytsmojlenlfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796659.4719102-481-96211900639098/AnsiballZ_stat.py'
Nov 22 07:30:59 compute-0 sudo[161436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:30:59 compute-0 python3.9[161438]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:30:59 compute-0 sudo[161436]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:00 compute-0 sudo[161559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrrkamegdcnbvrvxhtxoawzlwzatqjsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796659.4719102-481-96211900639098/AnsiballZ_copy.py'
Nov 22 07:31:00 compute-0 sudo[161559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:00 compute-0 python3.9[161561]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796659.4719102-481-96211900639098/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:00 compute-0 sudo[161559]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:01 compute-0 sudo[161711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqmxbqsudfxsmilvddpioqemnmugvywk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796660.7939732-529-198987956930923/AnsiballZ_lineinfile.py'
Nov 22 07:31:01 compute-0 sudo[161711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:01 compute-0 python3.9[161713]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:01 compute-0 sudo[161711]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:02 compute-0 sudo[161863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzbdzqeblpoghlstzcggkdktioyhjzcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796661.5105174-553-23219208699880/AnsiballZ_systemd.py'
Nov 22 07:31:02 compute-0 sudo[161863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:02 compute-0 python3.9[161865]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:31:02 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 22 07:31:02 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 22 07:31:02 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 22 07:31:02 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 22 07:31:02 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 22 07:31:02 compute-0 sudo[161863]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:02 compute-0 sudo[162019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mppcqxdqpqpmzezgzolqbbdxeiavykie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796662.675592-577-239272640528722/AnsiballZ_file.py'
Nov 22 07:31:02 compute-0 sudo[162019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:03 compute-0 podman[162021]: 2025-11-22 07:31:03.064574695 +0000 UTC m=+0.119823164 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 07:31:03 compute-0 python3.9[162022]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:31:03 compute-0 sudo[162019]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:03 compute-0 sudo[162198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atktznaheaarkgqqtktclrpsviojrugu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796663.5041108-604-133006465519928/AnsiballZ_stat.py'
Nov 22 07:31:03 compute-0 sudo[162198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:03 compute-0 python3.9[162200]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:31:03 compute-0 sudo[162198]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:04 compute-0 sudo[162350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwhcepdqggmavmsjsvxhgjutqnwwbtff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796664.2690437-631-106444247580498/AnsiballZ_stat.py'
Nov 22 07:31:04 compute-0 sudo[162350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:04 compute-0 python3.9[162352]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:31:04 compute-0 sudo[162350]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:05 compute-0 sudo[162502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvjnwakdfhdczayyvhtnqeyvgksulons ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796664.9467382-655-142007535257708/AnsiballZ_stat.py'
Nov 22 07:31:05 compute-0 sudo[162502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:05 compute-0 python3.9[162504]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:05 compute-0 sudo[162502]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:05 compute-0 sudo[162625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhnnqlnwgecblnxqcbpeayutfixozebo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796664.9467382-655-142007535257708/AnsiballZ_copy.py'
Nov 22 07:31:05 compute-0 sudo[162625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:05 compute-0 python3.9[162627]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796664.9467382-655-142007535257708/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:05 compute-0 sudo[162625]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:06 compute-0 sudo[162777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjhonxoetxtzwjqrynaozfporhjdpiow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796666.1578653-700-223617532088126/AnsiballZ_command.py'
Nov 22 07:31:06 compute-0 sudo[162777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:06 compute-0 python3.9[162779]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:31:06 compute-0 sudo[162777]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:07 compute-0 sudo[162930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebiwzldhdknpejliycsvzpdwomfewutm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796666.8342674-724-202407429990322/AnsiballZ_lineinfile.py'
Nov 22 07:31:07 compute-0 sudo[162930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:07 compute-0 python3.9[162932]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:07 compute-0 sudo[162930]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:07 compute-0 sudo[163082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dufbcqrwzyrvcuqanaerqcwpdspgwyku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796667.547466-748-219761870579858/AnsiballZ_replace.py'
Nov 22 07:31:07 compute-0 sudo[163082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:08 compute-0 python3.9[163084]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:08 compute-0 sudo[163082]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:08 compute-0 sudo[163234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyoxboeerxjpdjsomrtojdmsnxztfnvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796668.370556-772-102028907086747/AnsiballZ_replace.py'
Nov 22 07:31:08 compute-0 sudo[163234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:08 compute-0 python3.9[163236]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:08 compute-0 sudo[163234]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:09 compute-0 sudo[163386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iujghdvikifjgcgngaokkdaldkmogcgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796669.1019592-799-76067178934766/AnsiballZ_lineinfile.py'
Nov 22 07:31:09 compute-0 sudo[163386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:09 compute-0 python3.9[163388]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:09 compute-0 sudo[163386]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:09 compute-0 sudo[163538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbllnuleqaakhsvdnstzfdxusgeyxwsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796669.713395-799-61179013429096/AnsiballZ_lineinfile.py'
Nov 22 07:31:09 compute-0 sudo[163538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:10 compute-0 python3.9[163540]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:10 compute-0 sudo[163538]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:10 compute-0 sudo[163690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxjcirdshfhpqcnwmoowvklvcksxzzjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796670.3018696-799-40099651977624/AnsiballZ_lineinfile.py'
Nov 22 07:31:10 compute-0 sudo[163690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:10 compute-0 python3.9[163692]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:10 compute-0 sudo[163690]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:11 compute-0 sudo[163842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrlhppmkqxtlcsrxvzxfagexyauxbwmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796670.9262664-799-105017646654231/AnsiballZ_lineinfile.py'
Nov 22 07:31:11 compute-0 sudo[163842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:11 compute-0 python3.9[163844]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:11 compute-0 sudo[163842]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:11 compute-0 sudo[164005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baajhvpninbjvnscxqhffxfqxovrosid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796671.6750743-886-147986623081685/AnsiballZ_stat.py'
Nov 22 07:31:11 compute-0 sudo[164005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:11 compute-0 podman[163968]: 2025-11-22 07:31:11.966115073 +0000 UTC m=+0.052789547 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 07:31:12 compute-0 python3.9[164011]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:31:12 compute-0 sudo[164005]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:12 compute-0 sudo[164166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhvcchuvfeezlcmbqtzqxcrgcwuhguzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796672.4719079-910-169633453990653/AnsiballZ_file.py'
Nov 22 07:31:12 compute-0 sudo[164166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:12 compute-0 python3.9[164168]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:12 compute-0 sudo[164166]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:13 compute-0 sudo[164318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocbqtzbywlszavrhkbvlruouimxaqfrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796673.3086007-937-7226549028135/AnsiballZ_file.py'
Nov 22 07:31:13 compute-0 sudo[164318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:13 compute-0 python3.9[164320]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:31:13 compute-0 sudo[164318]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:14 compute-0 sudo[164470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-libakeoujfzauqamxlwzeztsonjcqoaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796673.9782646-961-97044766146859/AnsiballZ_stat.py'
Nov 22 07:31:14 compute-0 sudo[164470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:14 compute-0 python3.9[164472]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:14 compute-0 sudo[164470]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:14 compute-0 sudo[164548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nelzpqhnueauqqdpliupxgabigliodga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796673.9782646-961-97044766146859/AnsiballZ_file.py'
Nov 22 07:31:14 compute-0 sudo[164548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:14 compute-0 python3.9[164550]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:31:14 compute-0 sudo[164548]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:15 compute-0 sudo[164700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djvwcayykrcpcjthuccakwdrxyuboonb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796675.0516663-961-178649132323311/AnsiballZ_stat.py'
Nov 22 07:31:15 compute-0 sudo[164700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:15 compute-0 python3.9[164702]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:15 compute-0 sudo[164700]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:15 compute-0 sudo[164778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkiyfzlaxuqovqmkjhhagcqscrurckyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796675.0516663-961-178649132323311/AnsiballZ_file.py'
Nov 22 07:31:15 compute-0 sudo[164778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:16 compute-0 python3.9[164780]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:31:16 compute-0 sudo[164778]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:16 compute-0 sudo[164930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qalotizudyrxdtcchjeecdtbeyasqjjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796676.3541744-1030-174109093716411/AnsiballZ_file.py'
Nov 22 07:31:16 compute-0 sudo[164930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:16 compute-0 python3.9[164932]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:16 compute-0 sudo[164930]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:17 compute-0 sudo[165082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsoopxjunqootnjthcpbtuourriiohdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796677.075907-1054-1615013324919/AnsiballZ_stat.py'
Nov 22 07:31:17 compute-0 sudo[165082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:17 compute-0 python3.9[165084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:17 compute-0 sudo[165082]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:17 compute-0 sudo[165160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryhbofgpwdokwgqsuhnhnvostleugwuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796677.075907-1054-1615013324919/AnsiballZ_file.py'
Nov 22 07:31:17 compute-0 sudo[165160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:18 compute-0 python3.9[165162]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:18 compute-0 sudo[165160]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:18 compute-0 sudo[165312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwlvrvlrkhluzbajbhftmqdypvyjubcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796678.221785-1090-119682152393479/AnsiballZ_stat.py'
Nov 22 07:31:18 compute-0 sudo[165312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:18 compute-0 python3.9[165314]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:18 compute-0 sudo[165312]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:18 compute-0 sudo[165390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygzxjhklgdfxtqlivlkiqscfavdkgfrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796678.221785-1090-119682152393479/AnsiballZ_file.py'
Nov 22 07:31:18 compute-0 sudo[165390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:19 compute-0 python3.9[165392]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:19 compute-0 sudo[165390]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:19 compute-0 sudo[165542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efisbkqskdswxvdqlifszumheqrzxaza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796679.51192-1126-251628331396716/AnsiballZ_systemd.py'
Nov 22 07:31:19 compute-0 sudo[165542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:20 compute-0 python3.9[165544]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:31:20 compute-0 systemd[1]: Reloading.
Nov 22 07:31:20 compute-0 systemd-rc-local-generator[165572]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:31:20 compute-0 systemd-sysv-generator[165576]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:31:20 compute-0 sudo[165542]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:20 compute-0 sudo[165732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lesetspeinrlvtzlvptakibbyvlfmitu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796680.6919582-1150-21612525628491/AnsiballZ_stat.py'
Nov 22 07:31:20 compute-0 sudo[165732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:21 compute-0 python3.9[165734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:21 compute-0 sudo[165732]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:21 compute-0 sudo[165810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxwdmkxzqtqqdbkscqqzmaekakkwzkli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796680.6919582-1150-21612525628491/AnsiballZ_file.py'
Nov 22 07:31:21 compute-0 sudo[165810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:21 compute-0 python3.9[165812]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:21 compute-0 sudo[165810]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:22 compute-0 sudo[165962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mngcknwnjgnduyzgxtcbyayawroainlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796681.913736-1186-20628932287250/AnsiballZ_stat.py'
Nov 22 07:31:22 compute-0 sudo[165962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:22 compute-0 python3.9[165964]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:22 compute-0 sudo[165962]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:22 compute-0 sudo[166040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fofmxzepzkcqeimhwgpuyzltflukznyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796681.913736-1186-20628932287250/AnsiballZ_file.py'
Nov 22 07:31:22 compute-0 sudo[166040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:22 compute-0 python3.9[166042]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:22 compute-0 sudo[166040]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:23 compute-0 sudo[166192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evddspfvfjwfxweheagsrzygnqjkabwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796683.0046513-1222-100150439015096/AnsiballZ_systemd.py'
Nov 22 07:31:23 compute-0 sudo[166192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:23 compute-0 python3.9[166194]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:31:23 compute-0 systemd[1]: Reloading.
Nov 22 07:31:23 compute-0 systemd-rc-local-generator[166218]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:31:23 compute-0 systemd-sysv-generator[166222]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:31:24 compute-0 systemd[1]: Starting Create netns directory...
Nov 22 07:31:24 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 22 07:31:24 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 22 07:31:24 compute-0 systemd[1]: Finished Create netns directory.
Nov 22 07:31:24 compute-0 sudo[166192]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:24 compute-0 sudo[166386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvhfenuarzcchnwuvsxjnhruknxdjult ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796684.5977433-1252-196731793743721/AnsiballZ_file.py'
Nov 22 07:31:24 compute-0 sudo[166386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:25 compute-0 python3.9[166388]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:31:25 compute-0 sudo[166386]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:25 compute-0 sudo[166538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyswdlirjurcbllauirgyholxvepnzwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796685.2921717-1276-80786859667292/AnsiballZ_stat.py'
Nov 22 07:31:25 compute-0 sudo[166538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:25 compute-0 python3.9[166540]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:25 compute-0 sudo[166538]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:26 compute-0 sudo[166661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvvwoyqeptutlfmqkmtuygznpyxttiud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796685.2921717-1276-80786859667292/AnsiballZ_copy.py'
Nov 22 07:31:26 compute-0 sudo[166661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:26 compute-0 python3.9[166663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796685.2921717-1276-80786859667292/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:31:26 compute-0 sudo[166661]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:27 compute-0 sudo[166813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uypntulemixrcmfsaujmkmkrxpbyncdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796686.768321-1327-85039254992068/AnsiballZ_file.py'
Nov 22 07:31:27 compute-0 sudo[166813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:27 compute-0 python3.9[166815]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:31:27 compute-0 sudo[166813]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:27 compute-0 sudo[166965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytqdjpdwdaduxpthydsrtuxrcpzaomcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796687.5874674-1351-89811255341930/AnsiballZ_stat.py'
Nov 22 07:31:27 compute-0 sudo[166965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:28 compute-0 python3.9[166967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:28 compute-0 sudo[166965]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:28 compute-0 sudo[167088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxaoqmokoegjmquvaqwqhvcafftobiom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796687.5874674-1351-89811255341930/AnsiballZ_copy.py'
Nov 22 07:31:28 compute-0 sudo[167088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:28 compute-0 python3.9[167090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796687.5874674-1351-89811255341930/.source.json _original_basename=.__ly4e67 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:28 compute-0 sudo[167088]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:29 compute-0 sudo[167240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eshxbtytgbrrlumntnbvtbkaxmcnmlmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796688.8206773-1396-129809114537686/AnsiballZ_file.py'
Nov 22 07:31:29 compute-0 sudo[167240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:29 compute-0 python3.9[167242]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:29 compute-0 sudo[167240]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:29 compute-0 sudo[167392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klsnyjxdnnsxsasyethyrryvfznthsxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796689.5565023-1420-214883698927519/AnsiballZ_stat.py'
Nov 22 07:31:29 compute-0 sudo[167392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:30 compute-0 sudo[167392]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:30 compute-0 sudo[167515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjzoklymrtcvaxjziyrtrwwxgghohixw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796689.5565023-1420-214883698927519/AnsiballZ_copy.py'
Nov 22 07:31:30 compute-0 sudo[167515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:30 compute-0 sudo[167515]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:31 compute-0 sudo[167667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-copaviymkuxtybuktbvqxcdabkgipfby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796691.0560367-1471-194674344249634/AnsiballZ_container_config_data.py'
Nov 22 07:31:31 compute-0 sudo[167667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:31 compute-0 python3.9[167669]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 22 07:31:31 compute-0 sudo[167667]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:33 compute-0 sudo[167819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzhsnwvvefzjjyngnyprgjndnmnbfvqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796692.2202072-1498-151941276337713/AnsiballZ_container_config_hash.py'
Nov 22 07:31:33 compute-0 sudo[167819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:33 compute-0 podman[167821]: 2025-11-22 07:31:33.260693005 +0000 UTC m=+0.100239936 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 07:31:33 compute-0 python3.9[167822]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:31:33 compute-0 sudo[167819]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:34 compute-0 sudo[167998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzplitkdnlrwyxrzrofpjkwyoekxksdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796693.6478996-1525-145152074749555/AnsiballZ_podman_container_info.py'
Nov 22 07:31:34 compute-0 sudo[167998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:34 compute-0 python3.9[168000]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 22 07:31:34 compute-0 sudo[167998]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:35 compute-0 sudo[168175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltbzfsoocbsfvpcebfekzfsefycroctp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796695.3618453-1564-91944823616901/AnsiballZ_edpm_container_manage.py'
Nov 22 07:31:35 compute-0 sudo[168175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:36 compute-0 python3[168177]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:31:36 compute-0 podman[168213]: 2025-11-22 07:31:36.374373323 +0000 UTC m=+0.056382641 container create 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:31:36 compute-0 podman[168213]: 2025-11-22 07:31:36.341904363 +0000 UTC m=+0.023913701 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 07:31:36 compute-0 python3[168177]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 22 07:31:36 compute-0 sudo[168175]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:36 compute-0 sudo[168401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eigmufsvfsqktikycqvxwopibjevdyaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796696.6996639-1588-237948813488558/AnsiballZ_stat.py'
Nov 22 07:31:36 compute-0 sudo[168401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:37 compute-0 python3.9[168403]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:31:37 compute-0 sudo[168401]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:31:37.297 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:31:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:31:37.298 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:31:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:31:37.298 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:31:37 compute-0 sudo[168555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiwvojtmspefigeztmobotbonprkqfcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796697.556347-1615-119091632897353/AnsiballZ_file.py'
Nov 22 07:31:37 compute-0 sudo[168555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:38 compute-0 python3.9[168557]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:38 compute-0 sudo[168555]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:38 compute-0 sudo[168631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjkttegmqqizhkyxjtwepaslbgcfgda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796697.556347-1615-119091632897353/AnsiballZ_stat.py'
Nov 22 07:31:38 compute-0 sudo[168631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:38 compute-0 python3.9[168633]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:31:38 compute-0 sudo[168631]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:39 compute-0 sudo[168782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsrsgygrntixievywibjyskrvgjbentu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796698.6541893-1615-68238807438721/AnsiballZ_copy.py'
Nov 22 07:31:39 compute-0 sudo[168782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:39 compute-0 python3.9[168784]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796698.6541893-1615-68238807438721/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:39 compute-0 sudo[168782]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:39 compute-0 sudo[168858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfnpvjuhorowheldggabiabaniaefdsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796698.6541893-1615-68238807438721/AnsiballZ_systemd.py'
Nov 22 07:31:39 compute-0 sudo[168858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:39 compute-0 python3.9[168860]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:31:39 compute-0 systemd[1]: Reloading.
Nov 22 07:31:40 compute-0 systemd-rc-local-generator[168887]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:31:40 compute-0 systemd-sysv-generator[168892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:31:40 compute-0 sudo[168858]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:40 compute-0 sudo[168970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdsbltzksvjeidwtgqsilqjizjjlpfga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796698.6541893-1615-68238807438721/AnsiballZ_systemd.py'
Nov 22 07:31:40 compute-0 sudo[168970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:40 compute-0 python3.9[168972]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:31:40 compute-0 systemd[1]: Reloading.
Nov 22 07:31:40 compute-0 systemd-sysv-generator[169004]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:31:40 compute-0 systemd-rc-local-generator[169001]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:31:41 compute-0 systemd[1]: Starting multipathd container...
Nov 22 07:31:41 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:31:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d28218f56977452ee39e44bea2c2a0fcfe825dd0e97cf28b2c19c63ebe3fa2b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 07:31:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d28218f56977452ee39e44bea2c2a0fcfe825dd0e97cf28b2c19c63ebe3fa2b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 07:31:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f.
Nov 22 07:31:41 compute-0 podman[169013]: 2025-11-22 07:31:41.354130309 +0000 UTC m=+0.140116319 container init 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 07:31:41 compute-0 multipathd[169029]: + sudo -E kolla_set_configs
Nov 22 07:31:41 compute-0 podman[169013]: 2025-11-22 07:31:41.383817897 +0000 UTC m=+0.169803907 container start 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:31:41 compute-0 sudo[169035]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 22 07:31:41 compute-0 sudo[169035]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 07:31:41 compute-0 sudo[169035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 22 07:31:41 compute-0 podman[169013]: multipathd
Nov 22 07:31:41 compute-0 systemd[1]: Started multipathd container.
Nov 22 07:31:41 compute-0 sudo[168970]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:41 compute-0 multipathd[169029]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 07:31:41 compute-0 multipathd[169029]: INFO:__main__:Validating config file
Nov 22 07:31:41 compute-0 multipathd[169029]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 07:31:41 compute-0 multipathd[169029]: INFO:__main__:Writing out command to execute
Nov 22 07:31:41 compute-0 sudo[169035]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:41 compute-0 multipathd[169029]: ++ cat /run_command
Nov 22 07:31:41 compute-0 multipathd[169029]: + CMD='/usr/sbin/multipathd -d'
Nov 22 07:31:41 compute-0 multipathd[169029]: + ARGS=
Nov 22 07:31:41 compute-0 multipathd[169029]: + sudo kolla_copy_cacerts
Nov 22 07:31:41 compute-0 sudo[169057]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 22 07:31:41 compute-0 sudo[169057]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 07:31:41 compute-0 sudo[169057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 22 07:31:41 compute-0 podman[169036]: 2025-11-22 07:31:41.456476523 +0000 UTC m=+0.058891176 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:31:41 compute-0 sudo[169057]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:41 compute-0 systemd[1]: 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f-7c3a09044c80a296.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 07:31:41 compute-0 multipathd[169029]: + [[ ! -n '' ]]
Nov 22 07:31:41 compute-0 multipathd[169029]: + . kolla_extend_start
Nov 22 07:31:41 compute-0 systemd[1]: 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f-7c3a09044c80a296.service: Failed with result 'exit-code'.
Nov 22 07:31:41 compute-0 multipathd[169029]: Running command: '/usr/sbin/multipathd -d'
Nov 22 07:31:41 compute-0 multipathd[169029]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 22 07:31:41 compute-0 multipathd[169029]: + umask 0022
Nov 22 07:31:41 compute-0 multipathd[169029]: + exec /usr/sbin/multipathd -d
Nov 22 07:31:41 compute-0 multipathd[169029]: 3442.167712 | --------start up--------
Nov 22 07:31:41 compute-0 multipathd[169029]: 3442.167731 | read /etc/multipath.conf
Nov 22 07:31:41 compute-0 multipathd[169029]: 3442.173287 | path checkers start up
Nov 22 07:31:42 compute-0 podman[169191]: 2025-11-22 07:31:42.221587974 +0000 UTC m=+0.062748016 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 07:31:42 compute-0 python3.9[169230]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:31:42 compute-0 sudo[169388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-halleytygdtkzjnzsbjjbkjqiavctvto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796702.674897-1723-37743086416960/AnsiballZ_command.py'
Nov 22 07:31:42 compute-0 sudo[169388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:43 compute-0 python3.9[169390]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:31:43 compute-0 sudo[169388]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:43 compute-0 sudo[169553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnybxusomwgzkynurqvqzvnbweqytomv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796703.520231-1747-111264573610429/AnsiballZ_systemd.py'
Nov 22 07:31:43 compute-0 sudo[169553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:44 compute-0 python3.9[169555]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:31:44 compute-0 systemd[1]: Stopping multipathd container...
Nov 22 07:31:44 compute-0 multipathd[169029]: 3445.186328 | exit (signal)
Nov 22 07:31:44 compute-0 multipathd[169029]: 3445.187222 | --------shut down-------
Nov 22 07:31:44 compute-0 systemd[1]: libpod-8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f.scope: Deactivated successfully.
Nov 22 07:31:44 compute-0 podman[169559]: 2025-11-22 07:31:44.526376628 +0000 UTC m=+0.335231292 container died 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 07:31:44 compute-0 systemd[1]: 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f-7c3a09044c80a296.timer: Deactivated successfully.
Nov 22 07:31:44 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f.
Nov 22 07:31:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f-userdata-shm.mount: Deactivated successfully.
Nov 22 07:31:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-3d28218f56977452ee39e44bea2c2a0fcfe825dd0e97cf28b2c19c63ebe3fa2b-merged.mount: Deactivated successfully.
Nov 22 07:31:44 compute-0 podman[169559]: 2025-11-22 07:31:44.847929919 +0000 UTC m=+0.656784553 container cleanup 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:31:44 compute-0 podman[169559]: multipathd
Nov 22 07:31:44 compute-0 podman[169588]: multipathd
Nov 22 07:31:44 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 22 07:31:44 compute-0 systemd[1]: Stopped multipathd container.
Nov 22 07:31:44 compute-0 systemd[1]: Starting multipathd container...
Nov 22 07:31:45 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:31:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d28218f56977452ee39e44bea2c2a0fcfe825dd0e97cf28b2c19c63ebe3fa2b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 07:31:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d28218f56977452ee39e44bea2c2a0fcfe825dd0e97cf28b2c19c63ebe3fa2b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 07:31:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f.
Nov 22 07:31:45 compute-0 podman[169601]: 2025-11-22 07:31:45.055672181 +0000 UTC m=+0.109204896 container init 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 07:31:45 compute-0 multipathd[169616]: + sudo -E kolla_set_configs
Nov 22 07:31:45 compute-0 sudo[169622]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 22 07:31:45 compute-0 sudo[169622]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 07:31:45 compute-0 sudo[169622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 22 07:31:45 compute-0 podman[169601]: 2025-11-22 07:31:45.090539535 +0000 UTC m=+0.144072230 container start 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:31:45 compute-0 podman[169601]: multipathd
Nov 22 07:31:45 compute-0 systemd[1]: Started multipathd container.
Nov 22 07:31:45 compute-0 multipathd[169616]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 07:31:45 compute-0 multipathd[169616]: INFO:__main__:Validating config file
Nov 22 07:31:45 compute-0 multipathd[169616]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 07:31:45 compute-0 multipathd[169616]: INFO:__main__:Writing out command to execute
Nov 22 07:31:45 compute-0 sudo[169622]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:45 compute-0 multipathd[169616]: ++ cat /run_command
Nov 22 07:31:45 compute-0 multipathd[169616]: + CMD='/usr/sbin/multipathd -d'
Nov 22 07:31:45 compute-0 multipathd[169616]: + ARGS=
Nov 22 07:31:45 compute-0 multipathd[169616]: + sudo kolla_copy_cacerts
Nov 22 07:31:45 compute-0 sudo[169553]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:45 compute-0 sudo[169641]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 22 07:31:45 compute-0 sudo[169641]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 07:31:45 compute-0 sudo[169641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 22 07:31:45 compute-0 sudo[169641]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:45 compute-0 multipathd[169616]: + [[ ! -n '' ]]
Nov 22 07:31:45 compute-0 multipathd[169616]: + . kolla_extend_start
Nov 22 07:31:45 compute-0 multipathd[169616]: Running command: '/usr/sbin/multipathd -d'
Nov 22 07:31:45 compute-0 multipathd[169616]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 22 07:31:45 compute-0 multipathd[169616]: + umask 0022
Nov 22 07:31:45 compute-0 multipathd[169616]: + exec /usr/sbin/multipathd -d
Nov 22 07:31:45 compute-0 multipathd[169616]: 3445.874170 | --------start up--------
Nov 22 07:31:45 compute-0 multipathd[169616]: 3445.874200 | read /etc/multipath.conf
Nov 22 07:31:45 compute-0 multipathd[169616]: 3445.881265 | path checkers start up
Nov 22 07:31:45 compute-0 podman[169623]: 2025-11-22 07:31:45.192053102 +0000 UTC m=+0.092183857 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:31:45 compute-0 systemd[1]: 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f-127842aaf15b8ed5.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 07:31:45 compute-0 systemd[1]: 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f-127842aaf15b8ed5.service: Failed with result 'exit-code'.
Nov 22 07:31:45 compute-0 sudo[169805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adhanlcpifnbuiqwrommaqhlmqwyomqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796705.3316612-1771-188400518361997/AnsiballZ_file.py'
Nov 22 07:31:45 compute-0 sudo[169805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:45 compute-0 python3.9[169807]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:45 compute-0 sudo[169805]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:46 compute-0 sudo[169957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egezatympbzkpcrxcvzqrrvkmoakwfhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796706.4355273-1807-212015546383313/AnsiballZ_file.py'
Nov 22 07:31:46 compute-0 sudo[169957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:46 compute-0 python3.9[169959]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 22 07:31:47 compute-0 sudo[169957]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:47 compute-0 sudo[170109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxgagtyirubtsazgelcmykthyeymdjun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796707.2901185-1831-116729179223761/AnsiballZ_modprobe.py'
Nov 22 07:31:47 compute-0 sudo[170109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:47 compute-0 python3.9[170111]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 22 07:31:47 compute-0 kernel: Key type psk registered
Nov 22 07:31:47 compute-0 sudo[170109]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:48 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 22 07:31:48 compute-0 sudo[170273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csirmehqrdrfxhhevbeovqgcccupuyir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796708.145696-1855-267383644671158/AnsiballZ_stat.py'
Nov 22 07:31:48 compute-0 sudo[170273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:48 compute-0 python3.9[170275]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:31:48 compute-0 sudo[170273]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:49 compute-0 sudo[170396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-celifptdhbragqvcnozztstvdbntoezp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796708.145696-1855-267383644671158/AnsiballZ_copy.py'
Nov 22 07:31:49 compute-0 sudo[170396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:49 compute-0 python3.9[170398]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796708.145696-1855-267383644671158/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:49 compute-0 sudo[170396]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:49 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 07:31:50 compute-0 sudo[170549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsmnbqohxdeodtdudvlignvtktstykei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796709.8343616-1903-109815332018964/AnsiballZ_lineinfile.py'
Nov 22 07:31:50 compute-0 sudo[170549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:50 compute-0 python3.9[170551]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:31:50 compute-0 sudo[170549]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:50 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 22 07:31:50 compute-0 sudo[170702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvshkyydmwjvfubzxxjhixcpqwdaarqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796710.6100364-1927-267768858918583/AnsiballZ_systemd.py'
Nov 22 07:31:50 compute-0 sudo[170702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:51 compute-0 python3.9[170704]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:31:51 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 22 07:31:51 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 22 07:31:51 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 22 07:31:51 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 22 07:31:51 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 22 07:31:51 compute-0 sudo[170702]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:52 compute-0 sudo[170858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmbzsajbsybrtzbsxbfrhhcpoziyxshg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796711.7439616-1951-98807339944873/AnsiballZ_dnf.py'
Nov 22 07:31:52 compute-0 sudo[170858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:31:52 compute-0 python3.9[170860]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 22 07:31:52 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 22 07:31:54 compute-0 systemd[1]: Reloading.
Nov 22 07:31:55 compute-0 systemd-sysv-generator[170897]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:31:55 compute-0 systemd-rc-local-generator[170894]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:31:55 compute-0 systemd[1]: Reloading.
Nov 22 07:31:55 compute-0 systemd-rc-local-generator[170926]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:31:55 compute-0 systemd-sysv-generator[170930]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:31:55 compute-0 systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 22 07:31:55 compute-0 systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 22 07:31:55 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 22 07:31:55 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 22 07:31:56 compute-0 systemd[1]: Reloading.
Nov 22 07:31:56 compute-0 systemd-rc-local-generator[171024]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:31:56 compute-0 systemd-sysv-generator[171027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:31:56 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 22 07:31:57 compute-0 sudo[170858]: pam_unix(sudo:session): session closed for user root
Nov 22 07:31:59 compute-0 sudo[172310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nehoivgkxzjmlmxssjzxshcjubiaopqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796719.4123125-1975-50353506656680/AnsiballZ_systemd_service.py'
Nov 22 07:31:59 compute-0 sudo[172310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:00 compute-0 python3.9[172312]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:32:00 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 22 07:32:00 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 22 07:32:00 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.636s CPU time.
Nov 22 07:32:00 compute-0 systemd[1]: run-ra81d5db899fd42ed9726f178b96ebe06.service: Deactivated successfully.
Nov 22 07:32:00 compute-0 systemd[1]: Stopping Open-iSCSI...
Nov 22 07:32:00 compute-0 iscsid[160680]: iscsid shutting down.
Nov 22 07:32:00 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Nov 22 07:32:00 compute-0 systemd[1]: Stopped Open-iSCSI.
Nov 22 07:32:00 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 22 07:32:00 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 22 07:32:00 compute-0 systemd[1]: Started Open-iSCSI.
Nov 22 07:32:00 compute-0 sudo[172310]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:01 compute-0 python3.9[172468]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:32:02 compute-0 sudo[172622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xelsgsoiuxyfsnrzuucgzysiskifumyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796722.0137992-2027-43571822235036/AnsiballZ_file.py'
Nov 22 07:32:02 compute-0 sudo[172622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:02 compute-0 python3.9[172624]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:02 compute-0 sudo[172622]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:03 compute-0 sudo[172786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlsjcezjitinnvbaoiyzoubcazcpovlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796723.0834777-2060-129244085914222/AnsiballZ_systemd_service.py'
Nov 22 07:32:03 compute-0 sudo[172786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:03 compute-0 podman[172748]: 2025-11-22 07:32:03.43345945 +0000 UTC m=+0.088705007 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:32:03 compute-0 python3.9[172792]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:32:03 compute-0 systemd[1]: Reloading.
Nov 22 07:32:03 compute-0 systemd-rc-local-generator[172829]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:32:03 compute-0 systemd-sysv-generator[172832]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:32:04 compute-0 sudo[172786]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:04 compute-0 python3.9[172987]: ansible-ansible.builtin.service_facts Invoked
Nov 22 07:32:04 compute-0 network[173004]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 07:32:04 compute-0 network[173005]: 'network-scripts' will be removed from distribution in near future.
Nov 22 07:32:04 compute-0 network[173006]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 07:32:08 compute-0 sudo[173278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlipheyvaibonpegeenhlxgrphmbavqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796728.4127107-2117-83941508776993/AnsiballZ_systemd_service.py'
Nov 22 07:32:08 compute-0 sudo[173278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:09 compute-0 python3.9[173280]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:32:09 compute-0 sudo[173278]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:09 compute-0 sudo[173431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sednuvdmogkhbamigqheinvgblzdmyuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796729.2143059-2117-66332504021631/AnsiballZ_systemd_service.py'
Nov 22 07:32:09 compute-0 sudo[173431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:09 compute-0 python3.9[173433]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:32:09 compute-0 sudo[173431]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:10 compute-0 sudo[173584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmucudroefeowdrjtdrckjfgsextpwkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796729.9854867-2117-225771550875454/AnsiballZ_systemd_service.py'
Nov 22 07:32:10 compute-0 sudo[173584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:10 compute-0 python3.9[173586]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:32:10 compute-0 sudo[173584]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:11 compute-0 sudo[173737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhnjddnfxhhxmaqvajavdfbatbuxmshd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796730.7755454-2117-1385158404666/AnsiballZ_systemd_service.py'
Nov 22 07:32:11 compute-0 sudo[173737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:11 compute-0 python3.9[173739]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:32:11 compute-0 sudo[173737]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:11 compute-0 sudo[173890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwgwigmhpefvutmwpzhknvxnjsfqudtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796731.5039988-2117-13408860221686/AnsiballZ_systemd_service.py'
Nov 22 07:32:11 compute-0 sudo[173890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:12 compute-0 python3.9[173892]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:32:12 compute-0 podman[173893]: 2025-11-22 07:32:12.406237052 +0000 UTC m=+0.056672920 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 07:32:12 compute-0 sudo[173890]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:12 compute-0 sudo[174062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spjbpzcdlmtzshicefajvuyngwweedqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796732.53188-2117-109138282244338/AnsiballZ_systemd_service.py'
Nov 22 07:32:12 compute-0 sudo[174062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:13 compute-0 python3.9[174064]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:32:13 compute-0 sudo[174062]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:13 compute-0 sudo[174215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qznnvoniovkdautnwsfummxnykbhbefe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796733.3183541-2117-179823174349497/AnsiballZ_systemd_service.py'
Nov 22 07:32:13 compute-0 sudo[174215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:13 compute-0 python3.9[174217]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:32:13 compute-0 sudo[174215]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:14 compute-0 sudo[174368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epbzdpdzmyzqdmkiwtgsfaxssupobumh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796734.062451-2117-34706814780817/AnsiballZ_systemd_service.py'
Nov 22 07:32:14 compute-0 sudo[174368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:14 compute-0 python3.9[174370]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:32:14 compute-0 sudo[174368]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:15 compute-0 podman[174396]: 2025-11-22 07:32:15.442236883 +0000 UTC m=+0.082783885 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd)
Nov 22 07:32:15 compute-0 sudo[174539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irnfnrcdmfamcdpsfznuuyrczfvgsrll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796735.412339-2294-192860672991108/AnsiballZ_file.py'
Nov 22 07:32:15 compute-0 sudo[174539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:15 compute-0 python3.9[174541]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:15 compute-0 sudo[174539]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:16 compute-0 sudo[174691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcrhstxahijspfemietxeucnzhbgtlqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796736.0478969-2294-42951404928806/AnsiballZ_file.py'
Nov 22 07:32:16 compute-0 sudo[174691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:16 compute-0 python3.9[174693]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:16 compute-0 sudo[174691]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:16 compute-0 sudo[174843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxomkhgrucyzlzqqhusxrqyppkccncfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796736.6208243-2294-56367939367688/AnsiballZ_file.py'
Nov 22 07:32:16 compute-0 sudo[174843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:17 compute-0 python3.9[174845]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:17 compute-0 sudo[174843]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:17 compute-0 sshd-session[172860]: error: kex_exchange_identification: read: Connection reset by peer
Nov 22 07:32:17 compute-0 sshd-session[172860]: Connection reset by 90.117.163.17 port 40042
Nov 22 07:32:17 compute-0 sudo[174995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siqufovkwhqyldkreivpgeuzbbkgnwnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796737.31858-2294-274535246016774/AnsiballZ_file.py'
Nov 22 07:32:17 compute-0 sudo[174995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:17 compute-0 python3.9[174997]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:17 compute-0 sudo[174995]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:18 compute-0 sudo[175147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpdzouivjskfmvxqcdoluhenocyegavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796737.9537075-2294-173159415649513/AnsiballZ_file.py'
Nov 22 07:32:18 compute-0 sudo[175147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:18 compute-0 python3.9[175149]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:18 compute-0 sudo[175147]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:18 compute-0 sudo[175299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyjhhcyygaudgofncgeuwnsosotrzmch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796738.6032624-2294-113492249154459/AnsiballZ_file.py'
Nov 22 07:32:18 compute-0 sudo[175299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:19 compute-0 python3.9[175301]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:19 compute-0 sudo[175299]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:19 compute-0 sudo[175451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kndltztbaqkxxcflspqrchrtbipijkge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796739.2609968-2294-150384492277622/AnsiballZ_file.py'
Nov 22 07:32:19 compute-0 sudo[175451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:19 compute-0 python3.9[175453]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:19 compute-0 sudo[175451]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:20 compute-0 sudo[175603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thugagpdhwfmaiglcswnryyxxcabqdty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796739.926266-2294-46405493323734/AnsiballZ_file.py'
Nov 22 07:32:20 compute-0 sudo[175603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:20 compute-0 python3.9[175605]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:20 compute-0 sudo[175603]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:21 compute-0 sudo[175755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgtovsznexpniuapizhandhwbwpsxgnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796740.73193-2465-232282106049297/AnsiballZ_file.py'
Nov 22 07:32:21 compute-0 sudo[175755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:21 compute-0 python3.9[175757]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:21 compute-0 sudo[175755]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:21 compute-0 sudo[175907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkjrezrdzhsjitskorypaljyhuzxxrem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796741.3901048-2465-72384162576198/AnsiballZ_file.py'
Nov 22 07:32:21 compute-0 sudo[175907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:21 compute-0 python3.9[175909]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:21 compute-0 sudo[175907]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:22 compute-0 sudo[176059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnukddelpmlyojwutjsczwacdbncxegy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796742.0436578-2465-121560062931731/AnsiballZ_file.py'
Nov 22 07:32:22 compute-0 sudo[176059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:22 compute-0 python3.9[176061]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:22 compute-0 sudo[176059]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:22 compute-0 sudo[176211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhxjpozarejfxylnfutudaztcwwdfkpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796742.6933105-2465-131740894083187/AnsiballZ_file.py'
Nov 22 07:32:22 compute-0 sudo[176211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:23 compute-0 python3.9[176213]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:23 compute-0 sudo[176211]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:23 compute-0 sudo[176363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcbumvbmorlssbzofmcmksecgccrnvev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796743.504824-2465-189582352589142/AnsiballZ_file.py'
Nov 22 07:32:23 compute-0 sudo[176363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:24 compute-0 python3.9[176365]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:24 compute-0 sudo[176363]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:24 compute-0 sudo[176515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duvcbnxizncpsvigvnwjztmrixrnnlcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796744.174422-2465-33450186636290/AnsiballZ_file.py'
Nov 22 07:32:24 compute-0 sudo[176515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:24 compute-0 python3.9[176517]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:24 compute-0 sudo[176515]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:25 compute-0 sudo[176667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmhkrxwdfshprvpzlhunqfptgrmykscg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796744.9085162-2465-249941700303895/AnsiballZ_file.py'
Nov 22 07:32:25 compute-0 sudo[176667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:25 compute-0 python3.9[176669]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:25 compute-0 sudo[176667]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:25 compute-0 sudo[176819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpffnkdaydvpqhldzoeqgbruivmxjisc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796745.5349462-2465-226782429765477/AnsiballZ_file.py'
Nov 22 07:32:25 compute-0 sudo[176819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:26 compute-0 python3.9[176821]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:32:26 compute-0 sudo[176819]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:26 compute-0 sudo[176971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snaybwdlkizwymqvgmxgtizrqcuotpvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796746.4006357-2639-45885784174329/AnsiballZ_command.py'
Nov 22 07:32:26 compute-0 sudo[176971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:26 compute-0 python3.9[176973]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:26 compute-0 sudo[176971]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:27 compute-0 python3.9[177125]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 07:32:28 compute-0 sudo[177275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtxtvpocxwdqshcsgqtgfpmwfcwrhdmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796748.0832767-2693-180479315086025/AnsiballZ_systemd_service.py'
Nov 22 07:32:28 compute-0 sudo[177275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:28 compute-0 python3.9[177277]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:32:28 compute-0 systemd[1]: Reloading.
Nov 22 07:32:28 compute-0 systemd-sysv-generator[177310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:32:28 compute-0 systemd-rc-local-generator[177306]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:32:28 compute-0 sudo[177275]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:29 compute-0 sudo[177462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cktkrzdptlargatvyjqewlycwdacmhzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796749.320687-2717-85503735044342/AnsiballZ_command.py'
Nov 22 07:32:29 compute-0 sudo[177462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:29 compute-0 python3.9[177464]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:29 compute-0 sudo[177462]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:30 compute-0 sudo[177615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrxmrkfaaqhqxaxpntnqkbjmloiruhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796749.8880024-2717-226327810982223/AnsiballZ_command.py'
Nov 22 07:32:30 compute-0 sudo[177615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:30 compute-0 python3.9[177617]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:30 compute-0 sudo[177615]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:30 compute-0 sudo[177768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leztcruvdzthoxybmhduzvsldvjxyihm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796750.5876522-2717-162001859416423/AnsiballZ_command.py'
Nov 22 07:32:30 compute-0 sudo[177768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:31 compute-0 python3.9[177770]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:31 compute-0 sudo[177768]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:31 compute-0 sudo[177921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlfceojzsibtkbojzgplpdvmjccdijes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796751.327821-2717-266062851930396/AnsiballZ_command.py'
Nov 22 07:32:31 compute-0 sudo[177921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:31 compute-0 python3.9[177923]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:31 compute-0 sudo[177921]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:32 compute-0 sudo[178074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tghyjwuzyaznjeepkbznigzhqxvfppjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796751.9693396-2717-152609579742071/AnsiballZ_command.py'
Nov 22 07:32:32 compute-0 sudo[178074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:32 compute-0 python3.9[178076]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:32 compute-0 sudo[178074]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:32 compute-0 sudo[178227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sthzlliqheebyllbskpjdfurvyalvozm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796752.6762998-2717-176976969477405/AnsiballZ_command.py'
Nov 22 07:32:32 compute-0 sudo[178227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:33 compute-0 python3.9[178229]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:33 compute-0 sudo[178227]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:33 compute-0 sudo[178395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tltlqtxpdzqvedbfavlrrzfzxdlihltl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796753.304477-2717-20498190189544/AnsiballZ_command.py'
Nov 22 07:32:33 compute-0 sudo[178395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:33 compute-0 podman[178354]: 2025-11-22 07:32:33.671607221 +0000 UTC m=+0.112782622 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:32:33 compute-0 python3.9[178402]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:33 compute-0 sudo[178395]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:34 compute-0 sudo[178559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liohnghkydjgejngkmymkrpwmvdzdoum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796753.9686751-2717-206844296990678/AnsiballZ_command.py'
Nov 22 07:32:34 compute-0 sudo[178559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:34 compute-0 python3.9[178561]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:32:34 compute-0 sudo[178559]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:36 compute-0 sudo[178712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvnabqonmdfsjjfedpebameraudgwuwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796755.9164324-2924-9175953643501/AnsiballZ_file.py'
Nov 22 07:32:36 compute-0 sudo[178712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:36 compute-0 python3.9[178714]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:36 compute-0 sudo[178712]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:36 compute-0 sudo[178864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quvgndzmgwqhidwpicizxrwiexjnymlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796756.5732374-2924-244221765690213/AnsiballZ_file.py'
Nov 22 07:32:36 compute-0 sudo[178864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:37 compute-0 python3.9[178866]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:37 compute-0 sudo[178864]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:32:37.298 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:32:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:32:37.299 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:32:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:32:37.299 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:32:37 compute-0 sudo[179016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljdvantoqlcybelfgfuclaubsgjilejy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796757.2118037-2924-206229117086415/AnsiballZ_file.py'
Nov 22 07:32:37 compute-0 sudo[179016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:37 compute-0 python3.9[179018]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:37 compute-0 sudo[179016]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:38 compute-0 sudo[179168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmkftatymocuevgbebumzkxkmwquqvty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796758.0710514-2990-59803079402576/AnsiballZ_file.py'
Nov 22 07:32:38 compute-0 sudo[179168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:38 compute-0 python3.9[179170]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:38 compute-0 sudo[179168]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:39 compute-0 sudo[179320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfzzvkrqbpeocqxiohuhisxndkklhgxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796758.732258-2990-20441810254233/AnsiballZ_file.py'
Nov 22 07:32:39 compute-0 sudo[179320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:39 compute-0 python3.9[179322]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:39 compute-0 sudo[179320]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:39 compute-0 sudo[179472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnxwxtrmbiphwzfmybiahabanllyaiah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796759.463609-2990-101902533631826/AnsiballZ_file.py'
Nov 22 07:32:39 compute-0 sudo[179472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:39 compute-0 python3.9[179474]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:40 compute-0 sudo[179472]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:40 compute-0 sudo[179624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdwndpqdwxlxctnopevkqkttwligbbiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796760.1548924-2990-98123703891533/AnsiballZ_file.py'
Nov 22 07:32:40 compute-0 sudo[179624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:40 compute-0 python3.9[179626]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:40 compute-0 sudo[179624]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:41 compute-0 sudo[179776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqoiutiewzinkeimiqlmdqzcgtrzfbcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796760.7730873-2990-103278038702809/AnsiballZ_file.py'
Nov 22 07:32:41 compute-0 sudo[179776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:41 compute-0 python3.9[179778]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:41 compute-0 sudo[179776]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:41 compute-0 sudo[179928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gepgeaykogvzwavujdzcdwljmcjqdamx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796761.464506-2990-145664529049816/AnsiballZ_file.py'
Nov 22 07:32:41 compute-0 sudo[179928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:41 compute-0 python3.9[179930]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:41 compute-0 sudo[179928]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:42 compute-0 sudo[180080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqgqinubrvvdniuvcujdblnhpjgoluoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796762.1284628-2990-200855681097821/AnsiballZ_file.py'
Nov 22 07:32:42 compute-0 sudo[180080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:42 compute-0 python3.9[180082]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:42 compute-0 sudo[180080]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:42 compute-0 podman[180083]: 2025-11-22 07:32:42.664169698 +0000 UTC m=+0.056918942 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 07:32:46 compute-0 podman[180126]: 2025-11-22 07:32:46.3972313 +0000 UTC m=+0.048995200 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:32:48 compute-0 sudo[180270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maycqhhgcycbngmkujddxvvbjmysecvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796767.7717578-3295-117314814200433/AnsiballZ_getent.py'
Nov 22 07:32:48 compute-0 sudo[180270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:48 compute-0 python3.9[180272]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 22 07:32:48 compute-0 sudo[180270]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:49 compute-0 sudo[180423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qskswfqkyvsplognejlktvekloiyabgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796768.6410825-3319-71145181626856/AnsiballZ_group.py'
Nov 22 07:32:49 compute-0 sudo[180423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:49 compute-0 python3.9[180425]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 07:32:49 compute-0 groupadd[180426]: group added to /etc/group: name=nova, GID=42436
Nov 22 07:32:49 compute-0 groupadd[180426]: group added to /etc/gshadow: name=nova
Nov 22 07:32:49 compute-0 groupadd[180426]: new group: name=nova, GID=42436
Nov 22 07:32:49 compute-0 sudo[180423]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:50 compute-0 sudo[180581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzffjynwqsmjuxqvsjvhwejwrpdaqlwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796770.111557-3343-253827957412530/AnsiballZ_user.py'
Nov 22 07:32:50 compute-0 sudo[180581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:32:50 compute-0 python3.9[180583]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 07:32:50 compute-0 useradd[180585]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 22 07:32:50 compute-0 useradd[180585]: add 'nova' to group 'libvirt'
Nov 22 07:32:50 compute-0 useradd[180585]: add 'nova' to shadow group 'libvirt'
Nov 22 07:32:51 compute-0 sudo[180581]: pam_unix(sudo:session): session closed for user root
Nov 22 07:32:52 compute-0 sshd-session[180616]: Accepted publickey for zuul from 192.168.122.30 port 34384 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:32:52 compute-0 systemd-logind[821]: New session 24 of user zuul.
Nov 22 07:32:52 compute-0 systemd[1]: Started Session 24 of User zuul.
Nov 22 07:32:52 compute-0 sshd-session[180616]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:32:52 compute-0 sshd-session[180619]: Received disconnect from 192.168.122.30 port 34384:11: disconnected by user
Nov 22 07:32:52 compute-0 sshd-session[180619]: Disconnected from user zuul 192.168.122.30 port 34384
Nov 22 07:32:52 compute-0 sshd-session[180616]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:32:52 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Nov 22 07:32:52 compute-0 systemd-logind[821]: Session 24 logged out. Waiting for processes to exit.
Nov 22 07:32:52 compute-0 systemd-logind[821]: Removed session 24.
Nov 22 07:32:53 compute-0 python3.9[180769]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:32:53 compute-0 python3.9[180890]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796772.660821-3418-273257680602267/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:54 compute-0 python3.9[181040]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:32:54 compute-0 python3.9[181116]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:55 compute-0 python3.9[181266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:32:55 compute-0 python3.9[181387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796774.8574436-3418-233758950555205/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:56 compute-0 python3.9[181537]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:32:56 compute-0 python3.9[181658]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796775.986889-3418-169171944617905/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:57 compute-0 python3.9[181808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:32:58 compute-0 python3.9[181929]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796777.0886312-3418-49026493438755/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:32:58 compute-0 python3.9[182079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:32:59 compute-0 python3.9[182200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796778.313785-3418-130166827350651/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:33:00 compute-0 sudo[182350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uziiaamyhunydnrddzeyjwlvparrfcxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796779.7420228-3667-124331809110201/AnsiballZ_file.py'
Nov 22 07:33:00 compute-0 sudo[182350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:00 compute-0 python3.9[182352]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:00 compute-0 sudo[182350]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:00 compute-0 sudo[182502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqgzqywsojmwtkkiunceyveqoucteojg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796780.45513-3691-269956285739604/AnsiballZ_copy.py'
Nov 22 07:33:00 compute-0 sudo[182502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:00 compute-0 python3.9[182504]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:00 compute-0 sudo[182502]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:01 compute-0 sudo[182654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chidbpdwajisnzxxeenbibjnldhqigcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796781.2316792-3715-36040852896156/AnsiballZ_stat.py'
Nov 22 07:33:01 compute-0 sudo[182654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:01 compute-0 python3.9[182656]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:01 compute-0 sudo[182654]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:02 compute-0 sudo[182806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqheoaxtgblshxcjsvucatdqeeaaohik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796781.9369836-3739-247701209348738/AnsiballZ_stat.py'
Nov 22 07:33:02 compute-0 sudo[182806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:02 compute-0 python3.9[182808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:33:02 compute-0 sudo[182806]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:02 compute-0 sudo[182929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ernmxxbqzjrmeahetvdextsyhpqcvfdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796781.9369836-3739-247701209348738/AnsiballZ_copy.py'
Nov 22 07:33:02 compute-0 sudo[182929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:02 compute-0 python3.9[182931]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1763796781.9369836-3739-247701209348738/.source _original_basename=.vl4u0f_e follow=False checksum=e952f8714363b3693141d403417b3a3885823823 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 22 07:33:02 compute-0 sudo[182929]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:03 compute-0 python3.9[183083]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:04 compute-0 podman[183209]: 2025-11-22 07:33:04.448104725 +0000 UTC m=+0.090343644 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_id=ovn_controller)
Nov 22 07:33:04 compute-0 python3.9[183252]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:33:05 compute-0 python3.9[183383]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796784.0874343-3817-71433798361257/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:33:05 compute-0 python3.9[183533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:33:06 compute-0 python3.9[183654]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796785.4407892-3862-184824889105405/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:33:07 compute-0 sudo[183804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhmwodeqpdogfjxkrmhskmavocnrizbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796786.9579444-3913-241749047023190/AnsiballZ_container_config_data.py'
Nov 22 07:33:07 compute-0 sudo[183804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:07 compute-0 python3.9[183806]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 22 07:33:07 compute-0 sudo[183804]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:07 compute-0 sudo[183956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uavvgkvzdpohuyiclrlvplvhycbktflg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796787.7182348-3940-217815740480925/AnsiballZ_container_config_hash.py'
Nov 22 07:33:07 compute-0 sudo[183956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:08 compute-0 python3.9[183958]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:33:08 compute-0 sudo[183956]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:09 compute-0 sudo[184108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aalndbgmybhawgvqqkxkpfdunqfnyiro ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796788.8381865-3970-76105478957552/AnsiballZ_edpm_container_manage.py'
Nov 22 07:33:09 compute-0 sudo[184108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:09 compute-0 python3[184110]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:33:09 compute-0 podman[184145]: 2025-11-22 07:33:09.550960152 +0000 UTC m=+0.051548513 container create 370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:33:09 compute-0 podman[184145]: 2025-11-22 07:33:09.520599344 +0000 UTC m=+0.021187735 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 07:33:09 compute-0 python3[184110]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 22 07:33:09 compute-0 sudo[184108]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:10 compute-0 sudo[184333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slnponfzjcfbfpanlavtulbirrkbqrin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796789.9927535-3994-125708564369126/AnsiballZ_stat.py'
Nov 22 07:33:10 compute-0 sudo[184333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:10 compute-0 python3.9[184335]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:10 compute-0 sudo[184333]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:11 compute-0 sudo[184487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbvxodhzrdlccofyvhdlqpmrlzafmpsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796791.1826992-4030-23577101430526/AnsiballZ_container_config_data.py'
Nov 22 07:33:11 compute-0 sudo[184487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:11 compute-0 python3.9[184489]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 22 07:33:11 compute-0 sudo[184487]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:12 compute-0 sudo[184639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efpjbinejbhqtseeylukmdgsbawldqcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796792.001865-4057-46679438146277/AnsiballZ_container_config_hash.py'
Nov 22 07:33:12 compute-0 sudo[184639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:12 compute-0 python3.9[184641]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:33:12 compute-0 sudo[184639]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:13 compute-0 sudo[184804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auhzwapgpsjjgnskytdccifmamagrzum ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796792.9966414-4087-247644225460296/AnsiballZ_edpm_container_manage.py'
Nov 22 07:33:13 compute-0 sudo[184804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:13 compute-0 podman[184765]: 2025-11-22 07:33:13.393139331 +0000 UTC m=+0.059063044 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:33:13 compute-0 python3[184812]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:33:13 compute-0 podman[184847]: 2025-11-22 07:33:13.831362037 +0000 UTC m=+0.055117389 container create 211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=nova_compute)
Nov 22 07:33:13 compute-0 podman[184847]: 2025-11-22 07:33:13.803672844 +0000 UTC m=+0.027428246 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 22 07:33:13 compute-0 python3[184812]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 22 07:33:13 compute-0 sudo[184804]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:14 compute-0 sudo[185033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suhrphurwyyfpqltorgwydwrjgotshyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796794.2126336-4111-266425157299341/AnsiballZ_stat.py'
Nov 22 07:33:14 compute-0 sudo[185033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:14 compute-0 python3.9[185035]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:14 compute-0 sudo[185033]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:15 compute-0 sudo[185187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnhbrzouwjlcsgpiwnkbmfoxsquioeit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796795.0351605-4138-127256194664357/AnsiballZ_file.py'
Nov 22 07:33:15 compute-0 sudo[185187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:15 compute-0 python3.9[185189]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:15 compute-0 sudo[185187]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:15 compute-0 sudo[185338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqrpqcekudqtunprdbzhekhkpfchvhfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796795.5551357-4138-209923570905454/AnsiballZ_copy.py'
Nov 22 07:33:15 compute-0 sudo[185338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:16 compute-0 python3.9[185340]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796795.5551357-4138-209923570905454/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:16 compute-0 sudo[185338]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:16 compute-0 sudo[185414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kldealtebogfuarysmhcieqdyeahoqpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796795.5551357-4138-209923570905454/AnsiballZ_systemd.py'
Nov 22 07:33:16 compute-0 sudo[185414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:16 compute-0 python3.9[185416]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:33:16 compute-0 systemd[1]: Reloading.
Nov 22 07:33:16 compute-0 podman[185418]: 2025-11-22 07:33:16.704951648 +0000 UTC m=+0.057794283 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 07:33:16 compute-0 systemd-rc-local-generator[185463]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:33:16 compute-0 systemd-sysv-generator[185467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:33:16 compute-0 sudo[185414]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:17 compute-0 sudo[185545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqogxoenwqhxxagspjzxfxzpbylomrmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796795.5551357-4138-209923570905454/AnsiballZ_systemd.py'
Nov 22 07:33:17 compute-0 sudo[185545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:17 compute-0 python3.9[185547]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:33:17 compute-0 systemd[1]: Reloading.
Nov 22 07:33:17 compute-0 systemd-rc-local-generator[185575]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:33:17 compute-0 systemd-sysv-generator[185579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:33:17 compute-0 systemd[1]: Starting nova_compute container...
Nov 22 07:33:17 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:33:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:17 compute-0 podman[185587]: 2025-11-22 07:33:17.925696146 +0000 UTC m=+0.093548702 container init 211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 07:33:17 compute-0 podman[185587]: 2025-11-22 07:33:17.933361082 +0000 UTC m=+0.101213608 container start 211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 07:33:17 compute-0 podman[185587]: nova_compute
Nov 22 07:33:17 compute-0 nova_compute[185603]: + sudo -E kolla_set_configs
Nov 22 07:33:17 compute-0 systemd[1]: Started nova_compute container.
Nov 22 07:33:17 compute-0 sudo[185545]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Validating config file
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying service configuration files
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Deleting /etc/ceph
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Creating directory /etc/ceph
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /etc/ceph
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Writing out command to execute
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 07:33:18 compute-0 nova_compute[185603]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 07:33:18 compute-0 nova_compute[185603]: ++ cat /run_command
Nov 22 07:33:18 compute-0 nova_compute[185603]: + CMD=nova-compute
Nov 22 07:33:18 compute-0 nova_compute[185603]: + ARGS=
Nov 22 07:33:18 compute-0 nova_compute[185603]: + sudo kolla_copy_cacerts
Nov 22 07:33:18 compute-0 nova_compute[185603]: Running command: 'nova-compute'
Nov 22 07:33:18 compute-0 nova_compute[185603]: + [[ ! -n '' ]]
Nov 22 07:33:18 compute-0 nova_compute[185603]: + . kolla_extend_start
Nov 22 07:33:18 compute-0 nova_compute[185603]: + echo 'Running command: '\''nova-compute'\'''
Nov 22 07:33:18 compute-0 nova_compute[185603]: + umask 0022
Nov 22 07:33:18 compute-0 nova_compute[185603]: + exec nova-compute
Nov 22 07:33:18 compute-0 sshd-session[185639]: Connection closed by 172.105.102.42 port 21352
Nov 22 07:33:19 compute-0 python3.9[185766]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.082 185607 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.083 185607 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.083 185607 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.083 185607 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.222 185607 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.246 185607 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.247 185607 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 22 07:33:20 compute-0 python3.9[185920]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.853 185607 INFO nova.virt.driver [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.966 185607 INFO nova.compute.provider_config [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.976 185607 DEBUG oslo_concurrency.lockutils [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.976 185607 DEBUG oslo_concurrency.lockutils [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.977 185607 DEBUG oslo_concurrency.lockutils [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.977 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.977 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.977 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.977 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.978 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.978 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.978 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.978 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.978 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.978 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.979 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.979 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.979 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.979 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.979 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.979 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.980 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.980 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.980 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.980 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.980 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.980 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.980 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.981 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.981 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.981 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.981 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.981 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.981 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.982 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.982 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.982 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.982 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.982 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.982 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.982 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.983 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.983 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.983 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.983 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.983 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.983 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.984 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.984 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.984 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.984 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.984 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.984 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.984 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.985 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.985 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.985 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.985 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.985 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.985 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.986 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.986 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.986 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.986 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.986 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.986 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.987 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.987 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.987 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.987 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.987 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.987 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.988 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.988 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.988 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.988 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.988 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.988 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.988 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.989 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.989 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.989 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.989 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.989 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.989 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.989 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.990 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.990 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.990 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.990 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.990 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.990 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.991 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.991 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.991 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.991 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.991 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.991 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.992 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.992 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.992 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.992 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.992 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.992 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.993 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.993 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.993 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.993 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.993 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.993 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.994 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.994 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.994 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.994 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.994 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.995 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.995 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.995 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.995 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.995 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.995 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.996 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.996 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.996 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.996 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.996 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.996 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.996 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.997 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.997 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.997 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.997 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.997 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.997 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.997 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.998 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.998 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.998 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.998 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.998 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.998 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.999 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.999 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.999 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:20 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.999 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:20.999 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.000 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.000 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.000 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.000 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.000 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.000 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.001 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.001 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.001 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.001 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.001 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.001 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.002 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.002 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.002 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.002 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.002 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.002 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.003 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.003 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.003 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.003 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.003 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.003 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.004 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.004 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.004 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.004 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.004 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.004 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.005 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.005 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.005 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.005 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.005 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.005 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.006 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.006 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.006 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.006 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.006 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.006 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.007 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.007 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.007 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.007 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.007 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.007 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.008 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.008 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.008 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.008 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.008 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.008 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.008 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.009 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.009 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.009 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.009 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.009 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.010 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.010 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.010 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.010 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.010 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.010 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.010 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.010 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.011 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.011 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.011 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.011 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.011 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.011 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.012 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.012 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.012 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.012 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.012 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.012 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.013 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.013 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.013 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.013 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.013 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.013 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.013 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.014 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.014 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.014 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.014 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.014 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.014 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.014 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.015 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.015 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.015 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.015 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.015 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.015 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.016 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.016 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.016 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.016 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.016 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.016 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.017 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.017 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.017 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.017 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.017 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.017 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.018 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.018 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.018 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.018 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.018 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.018 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.018 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.019 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.019 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.019 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.019 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.019 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.019 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.020 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.020 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.020 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.020 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.020 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.020 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.020 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.021 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.021 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.021 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.021 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.021 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.021 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.021 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.022 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.022 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.022 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.022 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.022 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.022 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.023 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.023 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.023 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.023 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.023 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.023 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.023 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.024 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.024 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.024 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.024 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.024 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.024 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.024 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.025 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.025 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.025 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.025 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.025 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.025 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.025 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.026 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.026 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.026 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.026 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.026 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.026 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.026 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.027 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.027 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.027 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.027 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.027 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.027 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.028 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.028 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.028 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.028 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.028 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.028 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.028 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.029 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.029 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.029 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.029 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.029 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.029 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.030 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.030 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.030 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.030 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.030 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.030 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.031 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.031 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.031 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.031 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.031 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.032 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.032 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.032 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.032 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.032 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.032 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.033 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.033 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.033 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.033 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.033 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.034 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.034 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.034 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.034 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.034 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.035 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.035 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.035 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.035 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.035 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.035 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.036 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.036 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.036 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.036 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.036 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.036 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.036 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.037 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.037 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.037 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.037 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.037 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.037 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.038 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.038 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.038 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.038 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.038 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.038 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.039 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.039 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.039 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.039 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.039 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.039 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.040 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.040 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.040 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.040 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.040 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.041 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.041 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.041 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.041 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.041 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.041 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.042 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.042 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.042 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.042 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.042 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.042 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.043 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.043 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.043 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.043 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.043 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.043 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.043 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.044 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.044 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.044 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.044 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.044 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.044 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.044 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.045 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.045 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.045 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.045 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.045 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.045 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.046 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.046 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.046 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.046 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.046 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.046 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.046 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.047 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.047 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.047 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.047 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.047 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.047 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.048 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.048 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.048 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.048 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.048 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.048 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.048 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.049 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.049 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.049 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.049 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.049 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.050 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.050 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.050 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.050 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.050 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.050 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.050 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.051 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.051 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.051 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.051 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.051 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.051 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.051 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.052 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.052 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.052 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.052 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.052 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.052 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.053 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.053 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.053 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.053 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.053 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.053 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.053 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.054 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.054 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.054 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.054 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.054 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.054 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.054 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.055 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.055 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.055 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.055 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.055 185607 WARNING oslo_config.cfg [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 22 07:33:21 compute-0 nova_compute[185603]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 22 07:33:21 compute-0 nova_compute[185603]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 22 07:33:21 compute-0 nova_compute[185603]: and ``live_migration_inbound_addr`` respectively.
Nov 22 07:33:21 compute-0 nova_compute[185603]: ).  Its value may be silently ignored in the future.
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.055 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.056 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.056 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.056 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.056 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.056 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.057 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.057 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.057 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.057 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.057 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.057 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.057 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.057 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.058 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.058 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.058 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.058 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.058 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.058 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.059 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.059 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.059 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.059 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.059 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.059 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.059 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.060 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.060 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.060 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.060 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.060 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.060 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.061 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.061 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.061 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.061 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.061 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.061 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.062 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.062 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.062 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.062 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.062 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.062 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.062 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.063 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.063 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.063 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.063 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.063 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.063 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.064 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.064 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.064 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.064 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.064 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.064 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.065 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.065 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.065 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.065 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.065 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.065 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.066 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.066 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.066 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.066 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.066 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.066 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.067 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.067 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.067 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.067 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.067 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.067 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.067 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.067 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.068 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.068 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.068 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.068 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.068 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.068 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.069 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.069 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.069 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.069 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.069 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.069 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.069 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.070 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.070 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.070 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.070 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.070 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.070 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.070 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.071 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.071 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.071 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.071 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.071 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.071 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.071 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.072 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.072 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.072 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.072 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.072 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.072 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.072 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.073 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.073 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.073 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.073 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.073 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.073 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.073 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.074 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.074 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.074 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.074 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.074 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.075 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.075 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.075 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.075 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.075 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.075 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.076 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.076 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.076 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.076 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.076 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.077 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.077 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.077 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.077 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.077 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.077 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.078 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.078 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.078 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.078 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.078 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.078 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.079 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.079 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.079 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.079 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.079 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.079 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.080 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.080 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.080 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.080 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.080 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.080 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.081 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.081 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.081 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.081 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.081 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.081 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.082 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.082 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.082 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.082 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.082 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.082 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.082 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.083 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.083 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.083 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.083 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.083 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.083 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.083 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.084 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.084 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.084 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.084 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.084 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.084 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.085 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.085 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.085 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.085 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.085 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.085 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.086 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.086 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.086 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.086 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.086 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.086 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.087 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.087 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.087 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.087 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.087 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.087 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.088 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.088 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.088 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.088 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.088 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.088 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.088 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.089 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.089 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.089 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.089 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.089 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.089 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.089 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.090 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.090 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.090 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.090 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.090 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.090 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.090 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.091 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.091 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.091 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.091 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.091 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.091 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.092 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.092 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.092 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.092 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.092 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.092 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.092 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.093 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.093 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.093 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.093 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.093 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.093 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.093 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.094 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.094 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.094 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.094 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.094 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.094 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.095 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.095 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.095 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.095 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.095 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.095 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.096 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.096 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.096 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.096 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.096 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.096 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.096 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.097 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.097 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.097 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.097 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.097 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.097 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.097 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.098 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.098 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.098 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.098 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.098 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.098 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.098 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.099 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.099 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.099 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.099 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.099 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.099 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.099 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.100 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.100 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.100 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.100 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.100 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.100 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.100 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.101 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.101 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.101 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.101 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.101 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.102 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.102 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.102 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.102 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.102 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.102 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.102 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.103 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.103 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.103 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.103 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.103 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.103 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.104 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.104 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.104 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.104 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.104 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.104 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.104 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.105 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.105 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.105 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.105 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.105 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.105 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.106 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.106 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.106 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.106 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.106 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.106 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.107 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.107 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.107 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.107 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.107 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.107 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.108 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.108 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.108 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.108 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.108 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.109 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.109 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.109 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.109 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.109 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.109 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.109 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.110 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.110 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.110 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.110 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.110 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.110 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.111 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.111 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.111 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.111 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.111 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.111 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.111 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.112 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.112 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.112 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.112 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.112 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.112 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.112 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.113 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.113 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.113 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.113 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.113 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.113 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.113 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.114 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.114 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.114 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.114 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.114 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.114 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.114 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.115 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.115 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.115 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.115 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.115 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.115 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.115 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.116 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.116 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.116 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.116 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.116 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.116 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.116 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.117 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.117 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.117 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.117 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.117 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.117 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.117 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.118 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.118 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.118 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.118 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.118 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.118 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.119 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.119 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.119 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.119 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.119 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.119 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.119 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.120 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.120 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.120 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.120 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.120 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.120 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.120 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.121 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.121 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.121 185607 DEBUG oslo_service.service [None req-86ab6f50-7028-46ac-a279-a638ab667066 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.122 185607 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.147 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.148 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.148 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.148 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 22 07:33:21 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 22 07:33:21 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.223 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f938af395b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.225 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f938af395b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.226 185607 INFO nova.virt.libvirt.driver [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Connection event '1' reason 'None'
Nov 22 07:33:21 compute-0 python3.9[186070]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.354 185607 WARNING nova.virt.libvirt.driver [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 22 07:33:21 compute-0 nova_compute[185603]: 2025-11-22 07:33:21.355 185607 DEBUG nova.virt.libvirt.volume.mount [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.036 185607 INFO nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Libvirt host capabilities <capabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]: 
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <host>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <uuid>44d48670-015e-4b65-9dee-fcaebe9200b4</uuid>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <arch>x86_64</arch>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model>EPYC-Rome-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <vendor>AMD</vendor>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <microcode version='16777317'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <signature family='23' model='49' stepping='0'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='x2apic'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='tsc-deadline'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='osxsave'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='hypervisor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='tsc_adjust'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='spec-ctrl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='stibp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='arch-capabilities'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='cmp_legacy'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='topoext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='virt-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='lbrv'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='tsc-scale'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='vmcb-clean'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='pause-filter'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='pfthreshold'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='svme-addr-chk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='rdctl-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='skip-l1dfl-vmentry'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='mds-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature name='pschange-mc-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <pages unit='KiB' size='4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <pages unit='KiB' size='2048'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <pages unit='KiB' size='1048576'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <power_management>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <suspend_mem/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <suspend_disk/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <suspend_hybrid/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </power_management>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <iommu support='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <migration_features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <live/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <uri_transports>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <uri_transport>tcp</uri_transport>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <uri_transport>rdma</uri_transport>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </uri_transports>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </migration_features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <topology>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <cells num='1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <cell id='0'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:           <memory unit='KiB'>7864320</memory>
Nov 22 07:33:22 compute-0 nova_compute[185603]:           <pages unit='KiB' size='4'>1966080</pages>
Nov 22 07:33:22 compute-0 nova_compute[185603]:           <pages unit='KiB' size='2048'>0</pages>
Nov 22 07:33:22 compute-0 nova_compute[185603]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 22 07:33:22 compute-0 nova_compute[185603]:           <distances>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <sibling id='0' value='10'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:           </distances>
Nov 22 07:33:22 compute-0 nova_compute[185603]:           <cpus num='8'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:           </cpus>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         </cell>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </cells>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </topology>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <cache>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </cache>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <secmodel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model>selinux</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <doi>0</doi>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </secmodel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <secmodel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model>dac</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <doi>0</doi>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </secmodel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </host>
Nov 22 07:33:22 compute-0 nova_compute[185603]: 
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <guest>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <os_type>hvm</os_type>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <arch name='i686'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <wordsize>32</wordsize>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <domain type='qemu'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <domain type='kvm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </arch>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <pae/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <nonpae/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <acpi default='on' toggle='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <apic default='on' toggle='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <cpuselection/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <deviceboot/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <disksnapshot default='on' toggle='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <externalSnapshot/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </guest>
Nov 22 07:33:22 compute-0 nova_compute[185603]: 
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <guest>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <os_type>hvm</os_type>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <arch name='x86_64'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <wordsize>64</wordsize>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <domain type='qemu'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <domain type='kvm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </arch>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <acpi default='on' toggle='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <apic default='on' toggle='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <cpuselection/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <deviceboot/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <disksnapshot default='on' toggle='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <externalSnapshot/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </guest>
Nov 22 07:33:22 compute-0 nova_compute[185603]: 
Nov 22 07:33:22 compute-0 nova_compute[185603]: </capabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]: 
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.043 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.060 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 22 07:33:22 compute-0 nova_compute[185603]: <domainCapabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <domain>kvm</domain>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <arch>i686</arch>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <vcpu max='4096'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <iothreads supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <os supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <enum name='firmware'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <loader supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>rom</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pflash</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='readonly'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>yes</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>no</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='secure'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>no</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </loader>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </os>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='host-passthrough' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='hostPassthroughMigratable'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>on</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>off</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='maximum' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='maximumMigratable'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>on</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>off</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='host-model' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <vendor>AMD</vendor>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='x2apic'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='hypervisor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='stibp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='overflow-recov'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='succor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='lbrv'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc-scale'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='flushbyasid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='pause-filter'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='pfthreshold'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='disable' name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='custom' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Dhyana-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Genoa'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='auto-ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='auto-ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-128'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-256'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-512'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v6'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v7'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='KnightsMill'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512er'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512pf'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='KnightsMill-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512er'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512pf'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G4-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tbm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G5-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tbm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SierraForest'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cmpccxadd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SierraForest-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cmpccxadd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='athlon'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='athlon-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='core2duo'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='core2duo-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='coreduo'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='coreduo-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='n270'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='n270-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='phenom'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='phenom-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <memoryBacking supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <enum name='sourceType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>file</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>anonymous</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>memfd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </memoryBacking>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <devices>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <disk supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='diskDevice'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>disk</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>cdrom</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>floppy</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>lun</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='bus'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>fdc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>scsi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>sata</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-non-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </disk>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <graphics supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vnc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>egl-headless</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dbus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </graphics>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <video supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='modelType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vga</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>cirrus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>none</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>bochs</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ramfb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </video>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <hostdev supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='mode'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>subsystem</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='startupPolicy'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>default</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>mandatory</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>requisite</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>optional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='subsysType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pci</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>scsi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='capsType'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='pciBackend'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </hostdev>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <rng supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-non-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>random</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>egd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>builtin</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </rng>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <filesystem supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='driverType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>path</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>handle</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtiofs</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </filesystem>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <tpm supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tpm-tis</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tpm-crb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>emulator</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>external</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendVersion'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>2.0</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </tpm>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <redirdev supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='bus'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </redirdev>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <channel supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pty</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>unix</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </channel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <crypto supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>qemu</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>builtin</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </crypto>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <interface supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>default</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>passt</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </interface>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <panic supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>isa</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>hyperv</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </panic>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <console supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>null</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pty</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dev</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>file</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pipe</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>stdio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>udp</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tcp</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>unix</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>qemu-vdagent</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dbus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </console>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </devices>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <gic supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <vmcoreinfo supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <genid supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <backingStoreInput supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <backup supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <async-teardown supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <ps2 supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <sev supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <sgx supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <hyperv supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='features'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>relaxed</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vapic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>spinlocks</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vpindex</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>runtime</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>synic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>stimer</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>reset</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vendor_id</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>frequencies</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>reenlightenment</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tlbflush</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ipi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>avic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>emsr_bitmap</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>xmm_input</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <defaults>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <spinlocks>4095</spinlocks>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <stimer_direct>on</stimer_direct>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </defaults>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </hyperv>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <launchSecurity supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='sectype'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tdx</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </launchSecurity>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </features>
Nov 22 07:33:22 compute-0 nova_compute[185603]: </domainCapabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.066 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 22 07:33:22 compute-0 nova_compute[185603]: <domainCapabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <domain>kvm</domain>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <arch>i686</arch>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <vcpu max='240'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <iothreads supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <os supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <enum name='firmware'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <loader supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>rom</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pflash</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='readonly'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>yes</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>no</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='secure'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>no</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </loader>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </os>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='host-passthrough' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='hostPassthroughMigratable'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>on</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>off</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='maximum' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='maximumMigratable'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>on</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>off</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='host-model' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <vendor>AMD</vendor>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='x2apic'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='hypervisor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='stibp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='overflow-recov'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='succor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='lbrv'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc-scale'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='flushbyasid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='pause-filter'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='pfthreshold'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='disable' name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='custom' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Dhyana-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Genoa'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='auto-ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='auto-ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-128'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-256'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-512'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v6'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v7'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='KnightsMill'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512er'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512pf'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='KnightsMill-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512er'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512pf'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G4-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tbm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G5-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tbm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SierraForest'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cmpccxadd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SierraForest-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cmpccxadd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 sudo[186284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfibnzqwtlrjwbmpnvduzcneazpztubr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796801.6768384-4318-280265460715984/AnsiballZ_podman_container.py'
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 sudo[186284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='athlon'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='athlon-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='core2duo'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='core2duo-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='coreduo'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='coreduo-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='n270'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='n270-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='phenom'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='phenom-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <memoryBacking supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <enum name='sourceType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>file</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>anonymous</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>memfd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </memoryBacking>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <devices>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <disk supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='diskDevice'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>disk</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>cdrom</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>floppy</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>lun</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='bus'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ide</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>fdc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>scsi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>sata</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-non-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </disk>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <graphics supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vnc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>egl-headless</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dbus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </graphics>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <video supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='modelType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vga</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>cirrus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>none</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>bochs</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ramfb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </video>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <hostdev supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='mode'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>subsystem</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='startupPolicy'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>default</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>mandatory</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>requisite</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>optional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='subsysType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pci</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>scsi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='capsType'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='pciBackend'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </hostdev>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <rng supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-non-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>random</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>egd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>builtin</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </rng>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <filesystem supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='driverType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>path</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>handle</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtiofs</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </filesystem>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <tpm supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tpm-tis</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tpm-crb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>emulator</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>external</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendVersion'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>2.0</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </tpm>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <redirdev supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='bus'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </redirdev>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <channel supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pty</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>unix</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </channel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <crypto supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>qemu</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>builtin</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </crypto>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <interface supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>default</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>passt</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </interface>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <panic supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>isa</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>hyperv</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </panic>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <console supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>null</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pty</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dev</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>file</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pipe</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>stdio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>udp</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tcp</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>unix</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>qemu-vdagent</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dbus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </console>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </devices>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <gic supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <vmcoreinfo supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <genid supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <backingStoreInput supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <backup supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <async-teardown supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <ps2 supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <sev supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <sgx supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <hyperv supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='features'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>relaxed</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vapic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>spinlocks</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vpindex</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>runtime</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>synic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>stimer</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>reset</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vendor_id</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>frequencies</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>reenlightenment</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tlbflush</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ipi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>avic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>emsr_bitmap</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>xmm_input</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <defaults>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <spinlocks>4095</spinlocks>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <stimer_direct>on</stimer_direct>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </defaults>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </hyperv>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <launchSecurity supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='sectype'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tdx</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </launchSecurity>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </features>
Nov 22 07:33:22 compute-0 nova_compute[185603]: </domainCapabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.095 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.098 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 22 07:33:22 compute-0 nova_compute[185603]: <domainCapabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <domain>kvm</domain>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <arch>x86_64</arch>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <vcpu max='4096'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <iothreads supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <os supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <enum name='firmware'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>efi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <loader supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>rom</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pflash</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='readonly'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>yes</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>no</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='secure'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>yes</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>no</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </loader>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </os>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='host-passthrough' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='hostPassthroughMigratable'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>on</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>off</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='maximum' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='maximumMigratable'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>on</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>off</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='host-model' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <vendor>AMD</vendor>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='x2apic'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='hypervisor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='stibp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='overflow-recov'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='succor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='lbrv'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc-scale'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='flushbyasid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='pause-filter'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='pfthreshold'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='disable' name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='custom' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Dhyana-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Genoa'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='auto-ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='auto-ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-128'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-256'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-512'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v6'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v7'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='KnightsMill'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512er'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512pf'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='KnightsMill-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512er'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512pf'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G4-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tbm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G5-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tbm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SierraForest'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cmpccxadd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SierraForest-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cmpccxadd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='athlon'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='athlon-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='core2duo'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='core2duo-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='coreduo'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='coreduo-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='n270'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='n270-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='phenom'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='phenom-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <memoryBacking supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <enum name='sourceType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>file</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>anonymous</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>memfd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </memoryBacking>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <devices>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <disk supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='diskDevice'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>disk</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>cdrom</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>floppy</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>lun</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='bus'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>fdc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>scsi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>sata</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-non-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </disk>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <graphics supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vnc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>egl-headless</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dbus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </graphics>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <video supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='modelType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vga</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>cirrus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>none</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>bochs</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ramfb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </video>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <hostdev supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='mode'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>subsystem</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='startupPolicy'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>default</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>mandatory</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>requisite</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>optional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='subsysType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pci</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>scsi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='capsType'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='pciBackend'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </hostdev>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <rng supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-non-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>random</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>egd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>builtin</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </rng>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <filesystem supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='driverType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>path</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>handle</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtiofs</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </filesystem>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <tpm supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tpm-tis</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tpm-crb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>emulator</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>external</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendVersion'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>2.0</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </tpm>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <redirdev supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='bus'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </redirdev>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <channel supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pty</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>unix</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </channel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <crypto supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>qemu</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>builtin</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </crypto>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <interface supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>default</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>passt</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </interface>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <panic supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>isa</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>hyperv</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </panic>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <console supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>null</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pty</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dev</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>file</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pipe</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>stdio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>udp</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tcp</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>unix</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>qemu-vdagent</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dbus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </console>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </devices>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <gic supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <vmcoreinfo supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <genid supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <backingStoreInput supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <backup supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <async-teardown supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <ps2 supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <sev supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <sgx supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <hyperv supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='features'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>relaxed</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vapic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>spinlocks</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vpindex</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>runtime</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>synic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>stimer</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>reset</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vendor_id</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>frequencies</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>reenlightenment</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tlbflush</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ipi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>avic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>emsr_bitmap</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>xmm_input</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <defaults>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <spinlocks>4095</spinlocks>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <stimer_direct>on</stimer_direct>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </defaults>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </hyperv>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <launchSecurity supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='sectype'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tdx</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </launchSecurity>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </features>
Nov 22 07:33:22 compute-0 nova_compute[185603]: </domainCapabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.159 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 22 07:33:22 compute-0 nova_compute[185603]: <domainCapabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <domain>kvm</domain>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <arch>x86_64</arch>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <vcpu max='240'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <iothreads supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <os supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <enum name='firmware'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <loader supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>rom</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pflash</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='readonly'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>yes</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>no</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='secure'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>no</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </loader>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </os>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='host-passthrough' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='hostPassthroughMigratable'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>on</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>off</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='maximum' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='maximumMigratable'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>on</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>off</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='host-model' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <vendor>AMD</vendor>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='x2apic'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='hypervisor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='stibp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='overflow-recov'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='succor'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='lbrv'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='tsc-scale'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='flushbyasid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='pause-filter'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='pfthreshold'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <feature policy='disable' name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <mode name='custom' supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Broadwell-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Cooperlake-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Denverton-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Dhyana-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Genoa'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='auto-ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='auto-ibrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Milan-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amd-psfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='stibp-always-on'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-Rome-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='EPYC-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='GraniteRapids-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-128'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-256'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx10-512'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='prefetchiti'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Haswell-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v6'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Icelake-Server-v7'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='IvyBridge-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='KnightsMill'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512er'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512pf'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='KnightsMill-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512er'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512pf'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G4-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tbm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Opteron_G5-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fma4'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tbm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xop'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SapphireRapids-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='amx-tile'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-bf16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-fp16'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bitalg'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrc'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fzrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='la57'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='taa-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xfd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SierraForest'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cmpccxadd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='SierraForest-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ifma'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cmpccxadd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fbsdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='fsrs'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ibrs-all'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mcdt-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pbrsb-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='psdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='serialize'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vaes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Client-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='hle'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='rtm'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Skylake-Server-v5'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512bw'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512cd'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512dq'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512f'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='avx512vl'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='invpcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pcid'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='pku'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='mpx'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v2'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v3'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='core-capability'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='split-lock-detect'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='Snowridge-v4'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='cldemote'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='erms'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='gfni'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdir64b'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='movdiri'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='xsaves'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='athlon'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='athlon-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='core2duo'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='core2duo-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='coreduo'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='coreduo-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='n270'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='n270-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='ss'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='phenom'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <blockers model='phenom-v1'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnow'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <feature name='3dnowext'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </blockers>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </mode>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <memoryBacking supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <enum name='sourceType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>file</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>anonymous</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <value>memfd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </memoryBacking>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <devices>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <disk supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='diskDevice'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>disk</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>cdrom</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>floppy</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>lun</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='bus'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ide</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>fdc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>scsi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>sata</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-non-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </disk>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <graphics supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vnc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>egl-headless</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dbus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </graphics>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <video supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='modelType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vga</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>cirrus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>none</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>bochs</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ramfb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </video>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <hostdev supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='mode'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>subsystem</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='startupPolicy'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>default</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>mandatory</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>requisite</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>optional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='subsysType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pci</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>scsi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='capsType'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='pciBackend'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </hostdev>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <rng supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtio-non-transitional</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>random</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>egd</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>builtin</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </rng>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <filesystem supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='driverType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>path</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>handle</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>virtiofs</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </filesystem>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <tpm supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tpm-tis</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tpm-crb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>emulator</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>external</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendVersion'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>2.0</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </tpm>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <redirdev supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='bus'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>usb</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </redirdev>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <channel supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pty</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>unix</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </channel>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <crypto supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>qemu</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendModel'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>builtin</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </crypto>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <interface supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='backendType'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>default</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>passt</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </interface>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <panic supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='model'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>isa</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>hyperv</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </panic>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <console supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='type'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>null</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vc</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pty</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dev</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>file</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>pipe</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>stdio</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>udp</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tcp</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>unix</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>qemu-vdagent</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>dbus</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </console>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </devices>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <features>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <gic supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <vmcoreinfo supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <genid supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <backingStoreInput supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <backup supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <async-teardown supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <ps2 supported='yes'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <sev supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <sgx supported='no'/>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <hyperv supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='features'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>relaxed</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vapic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>spinlocks</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vpindex</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>runtime</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>synic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>stimer</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>reset</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>vendor_id</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>frequencies</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>reenlightenment</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tlbflush</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>ipi</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>avic</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>emsr_bitmap</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>xmm_input</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <defaults>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <spinlocks>4095</spinlocks>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <stimer_direct>on</stimer_direct>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </defaults>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </hyperv>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     <launchSecurity supported='yes'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       <enum name='sectype'>
Nov 22 07:33:22 compute-0 nova_compute[185603]:         <value>tdx</value>
Nov 22 07:33:22 compute-0 nova_compute[185603]:       </enum>
Nov 22 07:33:22 compute-0 nova_compute[185603]:     </launchSecurity>
Nov 22 07:33:22 compute-0 nova_compute[185603]:   </features>
Nov 22 07:33:22 compute-0 nova_compute[185603]: </domainCapabilities>
Nov 22 07:33:22 compute-0 nova_compute[185603]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.231 185607 DEBUG nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.232 185607 INFO nova.virt.libvirt.host [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Secure Boot support detected
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.234 185607 INFO nova.virt.libvirt.driver [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.242 185607 DEBUG nova.virt.libvirt.driver [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 22 07:33:22 compute-0 nova_compute[185603]:   <model>Nehalem</model>
Nov 22 07:33:22 compute-0 nova_compute[185603]: </cpu>
Nov 22 07:33:22 compute-0 nova_compute[185603]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.245 185607 DEBUG nova.virt.libvirt.driver [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.360 185607 INFO nova.virt.node [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Determined node identity 0a011418-630a-4be8-ab23-41ec1c11a5ea from /var/lib/nova/compute_id
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.382 185607 WARNING nova.compute.manager [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Compute nodes ['0a011418-630a-4be8-ab23-41ec1c11a5ea'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 22 07:33:22 compute-0 python3.9[186286]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.447 185607 INFO nova.compute.manager [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 22 07:33:22 compute-0 sudo[186284]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.615 185607 WARNING nova.compute.manager [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.616 185607 DEBUG oslo_concurrency.lockutils [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.616 185607 DEBUG oslo_concurrency.lockutils [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.616 185607 DEBUG oslo_concurrency.lockutils [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.616 185607 DEBUG nova.compute.resource_tracker [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:33:22 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 22 07:33:22 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 22 07:33:22 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.912 185607 WARNING nova.virt.libvirt.driver [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.913 185607 DEBUG nova.compute.resource_tracker [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6225MB free_disk=73.66471481323242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.914 185607 DEBUG oslo_concurrency.lockutils [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.914 185607 DEBUG oslo_concurrency.lockutils [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:33:22 compute-0 nova_compute[185603]: 2025-11-22 07:33:22.926 185607 WARNING nova.compute.resource_tracker [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] No compute node record for compute-0.ctlplane.example.com:0a011418-630a-4be8-ab23-41ec1c11a5ea: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 0a011418-630a-4be8-ab23-41ec1c11a5ea could not be found.
Nov 22 07:33:23 compute-0 nova_compute[185603]: 2025-11-22 07:33:23.064 185607 INFO nova.compute.resource_tracker [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 0a011418-630a-4be8-ab23-41ec1c11a5ea
Nov 22 07:33:23 compute-0 sudo[186480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gftkxgtijgvaivpzqoiqnplyqpotiost ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796802.8494349-4342-137074062336421/AnsiballZ_systemd.py'
Nov 22 07:33:23 compute-0 sudo[186480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:23 compute-0 nova_compute[185603]: 2025-11-22 07:33:23.130 185607 DEBUG nova.compute.resource_tracker [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:33:23 compute-0 nova_compute[185603]: 2025-11-22 07:33:23.130 185607 DEBUG nova.compute.resource_tracker [None req-23e2c20b-87aa-40b5-a7b5-f2e146e4eea2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:33:23 compute-0 python3.9[186482]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:33:23 compute-0 systemd[1]: Stopping nova_compute container...
Nov 22 07:33:23 compute-0 virtqemud[186092]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 22 07:33:23 compute-0 virtqemud[186092]: hostname: compute-0
Nov 22 07:33:23 compute-0 virtqemud[186092]: End of file while reading data: Input/output error
Nov 22 07:33:23 compute-0 systemd[1]: libpod-211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c.scope: Deactivated successfully.
Nov 22 07:33:23 compute-0 systemd[1]: libpod-211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c.scope: Consumed 2.841s CPU time.
Nov 22 07:33:23 compute-0 podman[186486]: 2025-11-22 07:33:23.549312662 +0000 UTC m=+0.067092830 container died 211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute)
Nov 22 07:33:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c-userdata-shm.mount: Deactivated successfully.
Nov 22 07:33:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219-merged.mount: Deactivated successfully.
Nov 22 07:33:23 compute-0 podman[186486]: 2025-11-22 07:33:23.621507943 +0000 UTC m=+0.139288081 container cleanup 211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:33:23 compute-0 podman[186486]: nova_compute
Nov 22 07:33:23 compute-0 podman[186515]: nova_compute
Nov 22 07:33:23 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 22 07:33:23 compute-0 systemd[1]: Stopped nova_compute container.
Nov 22 07:33:23 compute-0 systemd[1]: Starting nova_compute container...
Nov 22 07:33:23 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:33:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83b77655581e7ffbdbd5af1f10357fd903c1cfc2e61e2382dff78693e705a219/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:23 compute-0 podman[186528]: 2025-11-22 07:33:23.785609766 +0000 UTC m=+0.081915759 container init 211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:33:23 compute-0 podman[186528]: 2025-11-22 07:33:23.793284882 +0000 UTC m=+0.089590845 container start 211a6e80892220336cd0af8c80ee84b02e29143ca0dbc27383e916c2955d9c0c (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Nov 22 07:33:23 compute-0 podman[186528]: nova_compute
Nov 22 07:33:23 compute-0 nova_compute[186544]: + sudo -E kolla_set_configs
Nov 22 07:33:23 compute-0 systemd[1]: Started nova_compute container.
Nov 22 07:33:23 compute-0 sudo[186480]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Validating config file
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying service configuration files
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /etc/ceph
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Creating directory /etc/ceph
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /etc/ceph
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Writing out command to execute
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 22 07:33:23 compute-0 nova_compute[186544]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 22 07:33:23 compute-0 nova_compute[186544]: ++ cat /run_command
Nov 22 07:33:23 compute-0 nova_compute[186544]: + CMD=nova-compute
Nov 22 07:33:23 compute-0 nova_compute[186544]: + ARGS=
Nov 22 07:33:23 compute-0 nova_compute[186544]: + sudo kolla_copy_cacerts
Nov 22 07:33:23 compute-0 nova_compute[186544]: + [[ ! -n '' ]]
Nov 22 07:33:23 compute-0 nova_compute[186544]: + . kolla_extend_start
Nov 22 07:33:23 compute-0 nova_compute[186544]: + echo 'Running command: '\''nova-compute'\'''
Nov 22 07:33:23 compute-0 nova_compute[186544]: Running command: 'nova-compute'
Nov 22 07:33:23 compute-0 nova_compute[186544]: + umask 0022
Nov 22 07:33:23 compute-0 nova_compute[186544]: + exec nova-compute
Nov 22 07:33:25 compute-0 sudo[186706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvtkpbpowxejafwujoezuqokqmtmyvwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796804.7595785-4369-188941370848616/AnsiballZ_podman_container.py'
Nov 22 07:33:25 compute-0 sudo[186706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:25 compute-0 python3.9[186708]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 22 07:33:25 compute-0 systemd[1]: Started libpod-conmon-370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623.scope.
Nov 22 07:33:25 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:33:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6efe350a2a2d6ef7c130899f3586b24d7cfc2423dfe7d3142158836fed5e798/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6efe350a2a2d6ef7c130899f3586b24d7cfc2423dfe7d3142158836fed5e798/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6efe350a2a2d6ef7c130899f3586b24d7cfc2423dfe7d3142158836fed5e798/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 22 07:33:25 compute-0 podman[186731]: 2025-11-22 07:33:25.522881049 +0000 UTC m=+0.133146002 container init 370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm)
Nov 22 07:33:25 compute-0 podman[186731]: 2025-11-22 07:33:25.531260873 +0000 UTC m=+0.141525806 container start 370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm)
Nov 22 07:33:25 compute-0 python3.9[186708]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Applying nova statedir ownership
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 22 07:33:25 compute-0 nova_compute_init[186753]: INFO:nova_statedir:Nova statedir ownership complete
Nov 22 07:33:25 compute-0 systemd[1]: libpod-370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623.scope: Deactivated successfully.
Nov 22 07:33:25 compute-0 podman[186754]: 2025-11-22 07:33:25.588793589 +0000 UTC m=+0.025432818 container died 370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 07:33:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623-userdata-shm.mount: Deactivated successfully.
Nov 22 07:33:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-b6efe350a2a2d6ef7c130899f3586b24d7cfc2423dfe7d3142158836fed5e798-merged.mount: Deactivated successfully.
Nov 22 07:33:25 compute-0 podman[186765]: 2025-11-22 07:33:25.759446461 +0000 UTC m=+0.159488392 container cleanup 370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init)
Nov 22 07:33:25 compute-0 systemd[1]: libpod-conmon-370ac707cc393c72c1b061ecf7af8e12270d93cf60033f69d892e8c91a4f9623.scope: Deactivated successfully.
Nov 22 07:33:25 compute-0 sudo[186706]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:25 compute-0 nova_compute[186544]: 2025-11-22 07:33:25.963 186548 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 07:33:25 compute-0 nova_compute[186544]: 2025-11-22 07:33:25.964 186548 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 07:33:25 compute-0 nova_compute[186544]: 2025-11-22 07:33:25.964 186548 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 22 07:33:25 compute-0 nova_compute[186544]: 2025-11-22 07:33:25.964 186548 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.124 186548 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.152 186548 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.153 186548 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 22 07:33:26 compute-0 sshd-session[158417]: Connection closed by 192.168.122.30 port 53640
Nov 22 07:33:26 compute-0 sshd-session[158414]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:33:26 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Nov 22 07:33:26 compute-0 systemd[1]: session-23.scope: Consumed 1min 49.523s CPU time.
Nov 22 07:33:26 compute-0 systemd-logind[821]: Session 23 logged out. Waiting for processes to exit.
Nov 22 07:33:26 compute-0 systemd-logind[821]: Removed session 23.
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.616 186548 INFO nova.virt.driver [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.743 186548 INFO nova.compute.provider_config [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.751 186548 DEBUG oslo_concurrency.lockutils [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.751 186548 DEBUG oslo_concurrency.lockutils [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.751 186548 DEBUG oslo_concurrency.lockutils [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.752 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.752 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.752 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.752 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.752 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.753 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.753 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.753 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.753 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.753 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.754 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.754 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.754 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.754 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.754 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.755 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.755 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.755 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.755 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.755 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.756 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.756 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.756 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.756 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.756 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.756 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.757 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.757 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.757 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.757 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.757 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.757 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.758 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.758 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.758 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.758 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.758 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.758 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.759 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.759 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.759 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.759 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.759 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.759 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.760 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.760 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.760 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.760 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.760 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.760 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.761 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.761 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.761 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.761 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.761 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.761 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.762 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.762 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.762 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.762 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.762 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.763 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.763 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.763 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.763 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.763 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.763 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.764 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.764 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.764 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.764 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.764 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.765 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.765 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.765 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.765 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.765 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.766 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.766 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.766 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.766 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.766 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.766 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.767 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.767 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.767 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.767 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.767 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.767 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.768 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.768 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.768 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.768 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.768 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.768 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.768 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.769 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.769 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.769 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.769 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.769 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.769 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.770 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.770 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.770 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.770 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.770 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.770 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.770 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.771 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.771 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.771 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.771 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.771 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.771 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.772 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.772 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.772 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.772 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.772 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.772 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.773 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.773 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.773 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.773 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.773 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.773 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.773 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.773 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.774 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.774 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.774 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.774 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.774 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.774 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.775 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.775 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.775 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.775 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.775 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.775 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.776 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.776 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.776 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.776 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.776 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.776 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.776 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.777 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.777 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.777 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.777 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.777 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.777 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.778 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.778 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.778 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.778 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.778 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.778 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.779 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.779 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.779 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.779 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.779 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.779 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.779 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.780 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.780 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.780 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.780 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.780 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.780 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.781 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.781 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.781 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.781 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.781 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.781 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.781 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.782 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.782 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.782 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.782 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.782 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.782 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.782 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.783 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.783 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.783 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.783 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.783 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.783 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.784 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.784 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.784 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.784 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.784 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.784 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.784 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.785 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.785 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.785 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.785 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.785 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.785 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.785 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.786 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.786 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.786 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.786 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.786 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.786 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.786 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.787 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.787 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.787 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.787 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.787 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.787 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.787 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.788 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.788 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.788 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.788 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.788 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.788 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.788 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.789 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.789 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.789 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.789 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.789 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.789 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.789 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.790 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.790 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.790 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.790 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.790 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.790 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.790 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.791 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.791 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.791 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.791 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.791 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.791 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.791 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.792 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.792 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.792 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.792 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.792 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.792 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.792 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.793 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.793 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.793 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.793 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.793 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.793 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.794 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.794 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.794 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.794 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.794 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.794 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.795 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.795 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.795 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.795 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.795 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.795 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.795 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.796 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.796 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.796 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.796 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.796 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.796 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.796 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.797 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.797 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.797 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.797 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.797 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.797 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.797 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.798 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.798 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.798 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.798 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.798 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.798 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.798 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.799 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.799 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.799 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.799 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.799 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.799 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.799 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.800 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.800 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.800 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.800 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.800 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.800 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.800 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.800 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.801 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.801 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.801 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.801 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.801 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.801 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.801 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.802 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.802 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.802 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.802 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.802 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.802 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.802 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.803 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.803 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.803 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.803 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.803 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.803 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.803 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.804 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.804 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.804 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.804 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.804 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.804 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.804 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.805 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.805 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.805 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.805 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.805 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.805 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.805 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.806 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.806 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.806 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.806 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.806 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.806 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.806 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.807 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.807 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.807 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.807 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.807 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.808 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.808 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.808 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.808 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.808 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.808 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.808 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.809 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.809 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.809 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.809 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.809 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.809 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.809 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.809 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.810 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.810 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.810 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.810 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.810 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.810 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.810 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.811 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.811 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.811 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.811 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.811 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.811 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.811 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.812 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.812 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.812 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.812 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.812 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.812 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.812 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.813 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.813 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.813 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.813 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.813 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.813 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.814 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.814 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.814 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.814 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.814 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.814 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.814 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.815 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.815 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.815 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.815 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.815 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.815 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.815 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.816 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.816 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.816 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.816 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.816 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.816 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.816 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.817 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.817 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.817 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.817 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.817 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.817 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.817 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.817 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.818 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.818 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.818 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.818 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.818 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.818 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.818 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.819 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.819 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.819 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.819 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.819 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.819 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.819 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.820 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.820 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.820 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.820 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.820 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.820 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.821 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.821 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.821 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.821 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.821 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.821 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.822 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.822 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.822 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.822 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.822 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.823 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.823 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.823 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.823 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.823 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.824 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.824 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.824 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.825 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.825 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.825 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.825 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.825 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.825 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.826 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.826 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.826 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.826 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.827 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.827 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.827 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.827 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.827 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.828 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.828 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.828 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.828 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.828 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.829 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.829 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.829 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.829 186548 WARNING oslo_config.cfg [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 22 07:33:26 compute-0 nova_compute[186544]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 22 07:33:26 compute-0 nova_compute[186544]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 22 07:33:26 compute-0 nova_compute[186544]: and ``live_migration_inbound_addr`` respectively.
Nov 22 07:33:26 compute-0 nova_compute[186544]: ).  Its value may be silently ignored in the future.
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.830 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.830 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.830 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.830 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.831 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.831 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.831 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.831 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.831 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.832 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.832 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.832 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.832 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.832 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.833 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.833 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.833 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.833 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.833 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.834 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.834 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.834 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.834 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.834 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.834 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.835 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.835 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.835 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.835 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.835 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.836 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.836 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.836 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.836 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.837 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.837 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.837 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.837 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.837 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.837 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.838 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.838 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.838 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.838 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.838 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.839 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.839 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.839 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.839 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.839 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.839 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.840 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.840 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.840 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.840 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.840 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.840 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.840 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.841 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.841 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.841 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.841 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.841 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.841 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.841 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.842 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.842 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.842 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.842 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.842 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.843 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.843 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.843 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.843 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.843 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.843 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.844 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.844 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.844 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.844 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.844 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.844 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.845 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.845 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.845 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.845 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.845 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.845 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.846 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.846 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.846 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.846 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.846 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.846 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.846 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.847 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.847 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.847 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.847 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.847 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.847 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.848 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.848 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.848 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.848 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.848 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.848 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.849 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.849 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.849 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.849 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.849 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.849 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.850 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.850 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.850 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.850 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.850 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.850 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.850 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.851 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.851 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.851 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.851 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.851 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.852 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.852 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.852 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.852 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.852 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.853 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.853 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.853 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.853 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.853 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.854 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.854 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.854 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.854 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.855 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.855 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.855 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.855 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.855 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.856 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.856 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.856 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.856 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.856 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.857 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.857 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.857 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.857 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.857 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.858 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.858 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.858 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.858 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.858 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.858 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.859 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.859 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.859 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.859 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.859 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.859 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.860 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.860 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.860 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.860 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.860 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.861 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.861 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.861 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.861 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.861 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.862 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.862 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.862 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.862 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.863 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.863 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.863 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.863 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.863 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.864 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.864 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.864 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.864 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.864 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.864 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.864 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.865 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.865 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.865 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.865 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.865 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.865 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.866 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.866 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.866 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.866 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.866 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.867 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.867 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.867 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.867 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.867 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.867 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.867 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.868 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.868 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.868 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.868 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.868 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.868 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.868 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.869 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.869 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.869 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.869 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.869 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.870 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.870 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.870 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.870 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.870 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.870 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.871 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.871 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.871 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.871 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.871 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.871 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.872 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.872 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.872 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.872 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.872 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.872 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.873 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.873 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.873 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.873 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.873 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.873 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.873 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.874 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.874 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.874 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.875 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.875 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.875 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.875 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.875 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.876 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.876 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.876 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.876 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.876 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.877 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.877 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.877 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.877 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.877 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.877 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.878 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.878 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.878 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.878 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.878 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.879 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.879 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.879 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.879 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.879 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.879 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.879 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.880 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.880 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.880 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.880 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.880 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.880 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.880 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.881 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.881 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.881 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.881 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.881 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.881 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.882 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.882 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.882 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.882 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.882 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.882 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.883 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.883 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.883 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.883 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.883 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.883 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.883 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.884 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.884 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.884 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.884 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.884 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.884 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.885 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.885 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.885 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.885 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.885 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.885 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.886 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.886 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.886 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.886 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.886 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.886 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.886 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.887 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.887 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.887 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.887 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.887 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.887 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.888 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.888 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.888 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.888 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.888 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.889 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.889 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.889 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.889 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.889 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.889 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.890 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.890 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.890 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.890 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.890 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.890 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.890 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.891 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.891 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.891 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.891 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.891 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.891 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.891 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.892 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.892 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.892 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.892 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.892 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.892 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.892 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.893 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.893 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.893 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.893 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.893 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.893 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.894 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.894 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.894 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.894 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.894 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.894 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.895 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.895 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.895 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.895 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.895 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.895 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.895 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.896 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.896 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.896 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.896 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.896 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.896 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.896 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.896 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.897 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.897 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.897 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.897 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.897 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.898 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.898 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.898 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.898 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.898 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.899 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.899 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.899 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.899 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.899 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.899 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.900 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.900 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.900 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.900 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.900 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.901 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.901 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.901 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.901 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.901 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.902 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.902 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.902 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.902 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.902 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.902 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.903 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.903 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.903 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.903 186548 DEBUG oslo_service.service [None req-4508fa56-f60b-4ddc-81de-ee495b2ba8b0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.905 186548 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.916 186548 INFO nova.virt.node [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Determined node identity 0a011418-630a-4be8-ab23-41ec1c11a5ea from /var/lib/nova/compute_id
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.917 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.918 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.918 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.918 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.936 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7feeef7d4070> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.939 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7feeef7d4070> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.940 186548 INFO nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Connection event '1' reason 'None'
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.949 186548 INFO nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Libvirt host capabilities <capabilities>
Nov 22 07:33:26 compute-0 nova_compute[186544]: 
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <host>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <uuid>44d48670-015e-4b65-9dee-fcaebe9200b4</uuid>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <cpu>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <arch>x86_64</arch>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model>EPYC-Rome-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <vendor>AMD</vendor>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <microcode version='16777317'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <signature family='23' model='49' stepping='0'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='x2apic'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='tsc-deadline'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='osxsave'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='hypervisor'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='tsc_adjust'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='spec-ctrl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='stibp'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='arch-capabilities'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='ssbd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='cmp_legacy'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='topoext'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='virt-ssbd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='lbrv'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='tsc-scale'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='vmcb-clean'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='pause-filter'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='pfthreshold'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='svme-addr-chk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='rdctl-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='skip-l1dfl-vmentry'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='mds-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature name='pschange-mc-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <pages unit='KiB' size='4'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <pages unit='KiB' size='2048'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <pages unit='KiB' size='1048576'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </cpu>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <power_management>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <suspend_mem/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <suspend_disk/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <suspend_hybrid/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </power_management>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <iommu support='no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <migration_features>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <live/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <uri_transports>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <uri_transport>tcp</uri_transport>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <uri_transport>rdma</uri_transport>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </uri_transports>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </migration_features>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <topology>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <cells num='1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <cell id='0'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:           <memory unit='KiB'>7864320</memory>
Nov 22 07:33:26 compute-0 nova_compute[186544]:           <pages unit='KiB' size='4'>1966080</pages>
Nov 22 07:33:26 compute-0 nova_compute[186544]:           <pages unit='KiB' size='2048'>0</pages>
Nov 22 07:33:26 compute-0 nova_compute[186544]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 22 07:33:26 compute-0 nova_compute[186544]:           <distances>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <sibling id='0' value='10'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:           </distances>
Nov 22 07:33:26 compute-0 nova_compute[186544]:           <cpus num='8'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:           </cpus>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         </cell>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </cells>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </topology>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <cache>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </cache>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <secmodel>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model>selinux</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <doi>0</doi>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </secmodel>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <secmodel>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model>dac</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <doi>0</doi>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </secmodel>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   </host>
Nov 22 07:33:26 compute-0 nova_compute[186544]: 
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <guest>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <os_type>hvm</os_type>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <arch name='i686'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <wordsize>32</wordsize>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <domain type='qemu'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <domain type='kvm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </arch>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <features>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <pae/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <nonpae/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <acpi default='on' toggle='yes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <apic default='on' toggle='no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <cpuselection/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <deviceboot/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <disksnapshot default='on' toggle='no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <externalSnapshot/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </features>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   </guest>
Nov 22 07:33:26 compute-0 nova_compute[186544]: 
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <guest>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <os_type>hvm</os_type>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <arch name='x86_64'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <wordsize>64</wordsize>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <domain type='qemu'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <domain type='kvm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </arch>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <features>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <acpi default='on' toggle='yes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <apic default='on' toggle='no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <cpuselection/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <deviceboot/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <disksnapshot default='on' toggle='no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <externalSnapshot/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </features>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   </guest>
Nov 22 07:33:26 compute-0 nova_compute[186544]: 
Nov 22 07:33:26 compute-0 nova_compute[186544]: </capabilities>
Nov 22 07:33:26 compute-0 nova_compute[186544]: 
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.951 186548 DEBUG nova.virt.libvirt.volume.mount [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.956 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 07:33:26 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.960 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 22 07:33:26 compute-0 nova_compute[186544]: <domainCapabilities>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <domain>kvm</domain>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <arch>i686</arch>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <vcpu max='240'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <iothreads supported='yes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <os supported='yes'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <enum name='firmware'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <loader supported='yes'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>rom</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>pflash</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <enum name='readonly'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>yes</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>no</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <enum name='secure'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>no</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </loader>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   </os>
Nov 22 07:33:26 compute-0 nova_compute[186544]:   <cpu>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <mode name='host-passthrough' supported='yes'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <enum name='hostPassthroughMigratable'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>on</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>off</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <mode name='maximum' supported='yes'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <enum name='maximumMigratable'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>on</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <value>off</value>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <mode name='host-model' supported='yes'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <vendor>AMD</vendor>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='x2apic'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='hypervisor'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='stibp'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='ssbd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='overflow-recov'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='succor'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='ibrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='lbrv'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc-scale'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='flushbyasid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='pause-filter'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='pfthreshold'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <feature policy='disable' name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:26 compute-0 nova_compute[186544]:     <mode name='custom' supported='yes'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Broadwell'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Broadwell-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Broadwell-noTSX'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cooperlake'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cooperlake-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Cooperlake-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Denverton'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Denverton-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Denverton-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Denverton-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Dhyana-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Genoa'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='auto-ibrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='auto-ibrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='EPYC-v4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx10'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx10-128'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx10-256'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx10-512'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Haswell'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Haswell-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Haswell-noTSX'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Haswell-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Haswell-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Haswell-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Haswell-v4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v5'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v6'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v7'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='IvyBridge'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='KnightsMill'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512er'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512pf'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='KnightsMill-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512er'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512pf'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Opteron_G4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Opteron_G4-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Opteron_G5'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tbm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Opteron_G5-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tbm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='SierraForest'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cmpccxadd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='SierraForest-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-ifma'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cmpccxadd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v5'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Snowridge'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v2'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v3'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v4'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='athlon'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='athlon-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='core2duo'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='core2duo-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='coreduo'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='coreduo-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='n270'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='n270-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='phenom'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <blockers model='phenom-v1'>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 07:33:26 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <memoryBacking supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <enum name='sourceType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>file</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>anonymous</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>memfd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </memoryBacking>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <disk supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='diskDevice'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>disk</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>cdrom</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>floppy</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>lun</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='bus'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ide</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>fdc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>scsi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>sata</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-non-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <graphics supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vnc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>egl-headless</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dbus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </graphics>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <video supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='modelType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vga</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>cirrus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>none</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>bochs</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ramfb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </video>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <hostdev supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='mode'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>subsystem</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='startupPolicy'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>default</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>mandatory</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>requisite</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>optional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='subsysType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pci</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>scsi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='capsType'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='pciBackend'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </hostdev>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <rng supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-non-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>random</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>egd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>builtin</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <filesystem supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='driverType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>path</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>handle</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtiofs</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </filesystem>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <tpm supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tpm-tis</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tpm-crb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>emulator</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>external</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendVersion'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>2.0</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </tpm>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <redirdev supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='bus'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </redirdev>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <channel supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pty</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>unix</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </channel>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <crypto supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>qemu</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>builtin</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </crypto>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <interface supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>default</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>passt</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <panic supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>isa</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>hyperv</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </panic>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <console supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>null</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pty</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dev</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>file</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pipe</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>stdio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>udp</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tcp</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>unix</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>qemu-vdagent</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dbus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </console>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <features>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <gic supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <vmcoreinfo supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <genid supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <backingStoreInput supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <backup supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <async-teardown supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <ps2 supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <sev supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <sgx supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <hyperv supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='features'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>relaxed</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vapic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>spinlocks</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vpindex</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>runtime</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>synic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>stimer</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>reset</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vendor_id</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>frequencies</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>reenlightenment</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tlbflush</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ipi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>avic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>emsr_bitmap</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>xmm_input</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <defaults>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <spinlocks>4095</spinlocks>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <stimer_direct>on</stimer_direct>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </defaults>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </hyperv>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <launchSecurity supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='sectype'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tdx</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </launchSecurity>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </features>
Nov 22 07:33:27 compute-0 nova_compute[186544]: </domainCapabilities>
Nov 22 07:33:27 compute-0 nova_compute[186544]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.967 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 22 07:33:27 compute-0 nova_compute[186544]: <domainCapabilities>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <domain>kvm</domain>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <arch>i686</arch>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <vcpu max='4096'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <iothreads supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <os supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <enum name='firmware'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <loader supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>rom</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pflash</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='readonly'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>yes</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>no</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='secure'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>no</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </loader>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </os>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <cpu>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='host-passthrough' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='hostPassthroughMigratable'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>on</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>off</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='maximum' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='maximumMigratable'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>on</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>off</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='host-model' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <vendor>AMD</vendor>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='x2apic'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='hypervisor'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='stibp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='overflow-recov'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='succor'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='lbrv'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc-scale'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='flushbyasid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='pause-filter'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='pfthreshold'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='disable' name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='custom' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Dhyana-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Genoa'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='auto-ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='auto-ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-128'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-256'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-512'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v6'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v7'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='KnightsMill'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512er'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512pf'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='KnightsMill-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512er'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512pf'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G4-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tbm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G5-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tbm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SierraForest'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cmpccxadd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SierraForest-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cmpccxadd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='athlon'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='athlon-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='core2duo'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='core2duo-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='coreduo'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='coreduo-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='n270'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='n270-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='phenom'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='phenom-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <memoryBacking supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <enum name='sourceType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>file</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>anonymous</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>memfd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </memoryBacking>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <disk supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='diskDevice'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>disk</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>cdrom</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>floppy</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>lun</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='bus'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>fdc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>scsi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>sata</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-non-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <graphics supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vnc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>egl-headless</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dbus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </graphics>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <video supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='modelType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vga</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>cirrus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>none</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>bochs</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ramfb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </video>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <hostdev supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='mode'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>subsystem</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='startupPolicy'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>default</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>mandatory</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>requisite</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>optional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='subsysType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pci</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>scsi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='capsType'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='pciBackend'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </hostdev>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <rng supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-non-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>random</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>egd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>builtin</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <filesystem supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='driverType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>path</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>handle</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtiofs</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </filesystem>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <tpm supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tpm-tis</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tpm-crb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>emulator</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>external</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendVersion'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>2.0</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </tpm>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <redirdev supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='bus'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </redirdev>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <channel supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pty</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>unix</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </channel>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <crypto supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>qemu</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>builtin</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </crypto>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <interface supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>default</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>passt</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <panic supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>isa</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>hyperv</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </panic>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <console supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>null</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pty</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dev</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>file</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pipe</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>stdio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>udp</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tcp</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>unix</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>qemu-vdagent</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dbus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </console>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <features>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <gic supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <vmcoreinfo supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <genid supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <backingStoreInput supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <backup supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <async-teardown supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <ps2 supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <sev supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <sgx supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <hyperv supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='features'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>relaxed</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vapic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>spinlocks</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vpindex</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>runtime</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>synic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>stimer</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>reset</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vendor_id</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>frequencies</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>reenlightenment</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tlbflush</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ipi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>avic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>emsr_bitmap</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>xmm_input</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <defaults>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <spinlocks>4095</spinlocks>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <stimer_direct>on</stimer_direct>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </defaults>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </hyperv>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <launchSecurity supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='sectype'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tdx</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </launchSecurity>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </features>
Nov 22 07:33:27 compute-0 nova_compute[186544]: </domainCapabilities>
Nov 22 07:33:27 compute-0 nova_compute[186544]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:26.997 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.001 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 22 07:33:27 compute-0 nova_compute[186544]: <domainCapabilities>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <domain>kvm</domain>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <arch>x86_64</arch>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <vcpu max='4096'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <iothreads supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <os supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <enum name='firmware'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>efi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <loader supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>rom</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pflash</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='readonly'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>yes</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>no</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='secure'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>yes</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>no</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </loader>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </os>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <cpu>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='host-passthrough' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='hostPassthroughMigratable'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>on</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>off</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='maximum' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='maximumMigratable'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>on</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>off</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='host-model' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <vendor>AMD</vendor>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='x2apic'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='hypervisor'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='stibp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='overflow-recov'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='succor'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='lbrv'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc-scale'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='flushbyasid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='pause-filter'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='pfthreshold'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='disable' name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='custom' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Dhyana-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Genoa'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='auto-ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='auto-ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-128'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-256'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-512'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v6'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v7'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='KnightsMill'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512er'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512pf'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='KnightsMill-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512er'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512pf'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G4-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tbm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G5-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tbm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SierraForest'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cmpccxadd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SierraForest-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cmpccxadd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='athlon'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='athlon-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='core2duo'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='core2duo-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='coreduo'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='coreduo-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='n270'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='n270-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='phenom'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='phenom-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <memoryBacking supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <enum name='sourceType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>file</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>anonymous</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>memfd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </memoryBacking>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <disk supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='diskDevice'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>disk</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>cdrom</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>floppy</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>lun</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='bus'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>fdc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>scsi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>sata</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-non-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <graphics supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vnc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>egl-headless</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dbus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </graphics>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <video supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='modelType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vga</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>cirrus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>none</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>bochs</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ramfb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </video>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <hostdev supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='mode'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>subsystem</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='startupPolicy'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>default</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>mandatory</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>requisite</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>optional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='subsysType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pci</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>scsi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='capsType'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='pciBackend'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </hostdev>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <rng supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-non-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>random</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>egd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>builtin</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <filesystem supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='driverType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>path</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>handle</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtiofs</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </filesystem>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <tpm supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tpm-tis</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tpm-crb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>emulator</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>external</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendVersion'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>2.0</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </tpm>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <redirdev supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='bus'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </redirdev>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <channel supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pty</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>unix</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </channel>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <crypto supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>qemu</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>builtin</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </crypto>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <interface supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>default</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>passt</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <panic supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>isa</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>hyperv</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </panic>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <console supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>null</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pty</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dev</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>file</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pipe</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>stdio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>udp</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tcp</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>unix</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>qemu-vdagent</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dbus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </console>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <features>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <gic supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <vmcoreinfo supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <genid supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <backingStoreInput supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <backup supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <async-teardown supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <ps2 supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <sev supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <sgx supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <hyperv supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='features'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>relaxed</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vapic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>spinlocks</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vpindex</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>runtime</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>synic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>stimer</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>reset</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vendor_id</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>frequencies</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>reenlightenment</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tlbflush</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ipi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>avic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>emsr_bitmap</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>xmm_input</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <defaults>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <spinlocks>4095</spinlocks>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <stimer_direct>on</stimer_direct>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </defaults>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </hyperv>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <launchSecurity supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='sectype'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tdx</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </launchSecurity>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </features>
Nov 22 07:33:27 compute-0 nova_compute[186544]: </domainCapabilities>
Nov 22 07:33:27 compute-0 nova_compute[186544]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.066 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 22 07:33:27 compute-0 nova_compute[186544]: <domainCapabilities>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <path>/usr/libexec/qemu-kvm</path>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <domain>kvm</domain>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <arch>x86_64</arch>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <vcpu max='240'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <iothreads supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <os supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <enum name='firmware'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <loader supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>rom</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pflash</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='readonly'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>yes</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>no</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='secure'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>no</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </loader>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </os>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <cpu>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='host-passthrough' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='hostPassthroughMigratable'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>on</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>off</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='maximum' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='maximumMigratable'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>on</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>off</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='host-model' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <vendor>AMD</vendor>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='x2apic'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc-deadline'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='hypervisor'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc_adjust'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='spec-ctrl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='stibp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='cmp_legacy'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='overflow-recov'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='succor'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='amd-ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='virt-ssbd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='lbrv'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='tsc-scale'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='vmcb-clean'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='flushbyasid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='pause-filter'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='pfthreshold'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='svme-addr-chk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <feature policy='disable' name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <mode name='custom' supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Broadwell-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cascadelake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Cooperlake-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Denverton-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Dhyana-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Genoa'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='auto-ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Genoa-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='auto-ibrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Milan-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amd-psfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='no-nested-data-bp'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='null-sel-clr-base'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='stibp-always-on'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-Rome-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='EPYC-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='GraniteRapids-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-128'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-256'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx10-512'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='prefetchiti'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Haswell-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-noTSX'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v6'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Icelake-Server-v7'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='IvyBridge-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='KnightsMill'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512er'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512pf'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='KnightsMill-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4fmaps'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-4vnniw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512er'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512pf'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G4-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tbm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Opteron_G5-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fma4'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tbm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xop'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SapphireRapids-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='amx-tile'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-bf16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-fp16'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512-vpopcntdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bitalg'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vbmi2'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrc'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fzrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='la57'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='taa-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='tsx-ldtrk'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xfd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SierraForest'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cmpccxadd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='SierraForest-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ifma'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-ne-convert'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx-vnni-int8'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='bus-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cmpccxadd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fbsdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='fsrs'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ibrs-all'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mcdt-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pbrsb-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='psdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='sbdr-ssdp-no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='serialize'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vaes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='vpclmulqdq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Client-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='hle'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='rtm'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Skylake-Server-v5'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512bw'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512cd'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512dq'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512f'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='avx512vl'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='invpcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pcid'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='pku'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='mpx'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v2'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v3'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='core-capability'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='split-lock-detect'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='Snowridge-v4'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='cldemote'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='erms'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='gfni'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdir64b'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='movdiri'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='xsaves'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='athlon'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='athlon-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='core2duo'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='core2duo-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='coreduo'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='coreduo-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='n270'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='n270-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='ss'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='phenom'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <blockers model='phenom-v1'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnow'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <feature name='3dnowext'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </blockers>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </mode>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <memoryBacking supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <enum name='sourceType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>file</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>anonymous</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <value>memfd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </memoryBacking>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <disk supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='diskDevice'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>disk</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>cdrom</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>floppy</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>lun</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='bus'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ide</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>fdc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>scsi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>sata</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-non-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <graphics supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vnc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>egl-headless</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dbus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </graphics>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <video supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='modelType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vga</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>cirrus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>none</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>bochs</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ramfb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </video>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <hostdev supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='mode'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>subsystem</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='startupPolicy'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>default</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>mandatory</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>requisite</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>optional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='subsysType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pci</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>scsi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='capsType'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='pciBackend'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </hostdev>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <rng supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtio-non-transitional</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>random</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>egd</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>builtin</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <filesystem supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='driverType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>path</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>handle</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>virtiofs</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </filesystem>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <tpm supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tpm-tis</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tpm-crb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>emulator</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>external</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendVersion'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>2.0</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </tpm>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <redirdev supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='bus'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>usb</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </redirdev>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <channel supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pty</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>unix</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </channel>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <crypto supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>qemu</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendModel'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>builtin</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </crypto>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <interface supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='backendType'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>default</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>passt</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <panic supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='model'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>isa</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>hyperv</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </panic>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <console supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='type'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>null</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vc</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pty</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dev</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>file</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>pipe</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>stdio</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>udp</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tcp</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>unix</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>qemu-vdagent</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>dbus</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </console>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <features>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <gic supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <vmcoreinfo supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <genid supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <backingStoreInput supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <backup supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <async-teardown supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <ps2 supported='yes'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <sev supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <sgx supported='no'/>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <hyperv supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='features'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>relaxed</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vapic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>spinlocks</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vpindex</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>runtime</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>synic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>stimer</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>reset</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>vendor_id</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>frequencies</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>reenlightenment</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tlbflush</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>ipi</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>avic</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>emsr_bitmap</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>xmm_input</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <defaults>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <spinlocks>4095</spinlocks>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <stimer_direct>on</stimer_direct>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <tlbflush_direct>on</tlbflush_direct>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <tlbflush_extended>on</tlbflush_extended>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </defaults>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </hyperv>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     <launchSecurity supported='yes'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       <enum name='sectype'>
Nov 22 07:33:27 compute-0 nova_compute[186544]:         <value>tdx</value>
Nov 22 07:33:27 compute-0 nova_compute[186544]:       </enum>
Nov 22 07:33:27 compute-0 nova_compute[186544]:     </launchSecurity>
Nov 22 07:33:27 compute-0 nova_compute[186544]:   </features>
Nov 22 07:33:27 compute-0 nova_compute[186544]: </domainCapabilities>
Nov 22 07:33:27 compute-0 nova_compute[186544]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.133 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.133 186548 INFO nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Secure Boot support detected
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.137 186548 INFO nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.137 186548 INFO nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.148 186548 DEBUG nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 22 07:33:27 compute-0 nova_compute[186544]:   <model>Nehalem</model>
Nov 22 07:33:27 compute-0 nova_compute[186544]: </cpu>
Nov 22 07:33:27 compute-0 nova_compute[186544]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.151 186548 DEBUG nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.169 186548 INFO nova.virt.node [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Determined node identity 0a011418-630a-4be8-ab23-41ec1c11a5ea from /var/lib/nova/compute_id
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.193 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Verified node 0a011418-630a-4be8-ab23-41ec1c11a5ea matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.229 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.315 186548 ERROR nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Could not retrieve compute node resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '0a011418-630a-4be8-ab23-41ec1c11a5ea' not found: No resource provider with uuid 0a011418-630a-4be8-ab23-41ec1c11a5ea found  ", "request_id": "req-42816eaf-7afd-45ff-a7ec-827f910e331c"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '0a011418-630a-4be8-ab23-41ec1c11a5ea' not found: No resource provider with uuid 0a011418-630a-4be8-ab23-41ec1c11a5ea found  ", "request_id": "req-42816eaf-7afd-45ff-a7ec-827f910e331c"}]}
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.352 186548 DEBUG oslo_concurrency.lockutils [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.353 186548 DEBUG oslo_concurrency.lockutils [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.353 186548 DEBUG oslo_concurrency.lockutils [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.353 186548 DEBUG nova.compute.resource_tracker [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.510 186548 WARNING nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.512 186548 DEBUG nova.compute.resource_tracker [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6187MB free_disk=73.66325378417969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.512 186548 DEBUG oslo_concurrency.lockutils [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.512 186548 DEBUG oslo_concurrency.lockutils [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.628 186548 ERROR nova.compute.resource_tracker [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '0a011418-630a-4be8-ab23-41ec1c11a5ea' not found: No resource provider with uuid 0a011418-630a-4be8-ab23-41ec1c11a5ea found  ", "request_id": "req-49bb6eb0-e193-4913-b60c-a181c264c5c9"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '0a011418-630a-4be8-ab23-41ec1c11a5ea' not found: No resource provider with uuid 0a011418-630a-4be8-ab23-41ec1c11a5ea found  ", "request_id": "req-49bb6eb0-e193-4913-b60c-a181c264c5c9"}]}
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.629 186548 DEBUG nova.compute.resource_tracker [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.629 186548 DEBUG nova.compute.resource_tracker [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:33:27 compute-0 nova_compute[186544]: 2025-11-22 07:33:27.886 186548 INFO nova.scheduler.client.report [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [req-d2ac0bf0-c4df-4f55-bd30-1f65a5797fff] Created resource provider record via placement API for resource provider with UUID 0a011418-630a-4be8-ab23-41ec1c11a5ea and name compute-0.ctlplane.example.com.
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.517 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 22 07:33:28 compute-0 nova_compute[186544]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.517 186548 INFO nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] kernel doesn't support AMD SEV
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.518 186548 DEBUG nova.compute.provider_tree [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.518 186548 DEBUG nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.520 186548 DEBUG nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Libvirt baseline CPU <cpu>
Nov 22 07:33:28 compute-0 nova_compute[186544]:   <arch>x86_64</arch>
Nov 22 07:33:28 compute-0 nova_compute[186544]:   <model>Nehalem</model>
Nov 22 07:33:28 compute-0 nova_compute[186544]:   <vendor>AMD</vendor>
Nov 22 07:33:28 compute-0 nova_compute[186544]:   <topology sockets="8" cores="1" threads="1"/>
Nov 22 07:33:28 compute-0 nova_compute[186544]: </cpu>
Nov 22 07:33:28 compute-0 nova_compute[186544]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.838 186548 DEBUG nova.scheduler.client.report [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Updated inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.839 186548 DEBUG nova.compute.provider_tree [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Updating resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.839 186548 DEBUG nova.compute.provider_tree [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:33:28 compute-0 nova_compute[186544]: 2025-11-22 07:33:28.982 186548 DEBUG nova.compute.provider_tree [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Updating resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 22 07:33:29 compute-0 nova_compute[186544]: 2025-11-22 07:33:29.083 186548 DEBUG nova.compute.resource_tracker [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:33:29 compute-0 nova_compute[186544]: 2025-11-22 07:33:29.083 186548 DEBUG oslo_concurrency.lockutils [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.571s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:33:29 compute-0 nova_compute[186544]: 2025-11-22 07:33:29.084 186548 DEBUG nova.service [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 22 07:33:29 compute-0 nova_compute[186544]: 2025-11-22 07:33:29.139 186548 DEBUG nova.service [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 22 07:33:29 compute-0 nova_compute[186544]: 2025-11-22 07:33:29.140 186548 DEBUG nova.servicegroup.drivers.db [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 22 07:33:31 compute-0 sshd-session[186836]: Accepted publickey for zuul from 192.168.122.30 port 38450 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 07:33:31 compute-0 systemd-logind[821]: New session 25 of user zuul.
Nov 22 07:33:31 compute-0 systemd[1]: Started Session 25 of User zuul.
Nov 22 07:33:31 compute-0 sshd-session[186836]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 07:33:32 compute-0 python3.9[186989]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 22 07:33:34 compute-0 sudo[187143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vweeqkigeqqlmjzlgomeohpsmfskpwbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796813.7144632-73-174013695465922/AnsiballZ_systemd_service.py'
Nov 22 07:33:34 compute-0 sudo[187143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:34 compute-0 python3.9[187145]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:33:34 compute-0 systemd[1]: Reloading.
Nov 22 07:33:34 compute-0 systemd-rc-local-generator[187191]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:33:34 compute-0 systemd-sysv-generator[187196]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:33:34 compute-0 podman[187147]: 2025-11-22 07:33:34.750238618 +0000 UTC m=+0.089829332 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 07:33:34 compute-0 sudo[187143]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:35 compute-0 python3.9[187355]: ansible-ansible.builtin.service_facts Invoked
Nov 22 07:33:35 compute-0 network[187372]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 22 07:33:35 compute-0 network[187373]: 'network-scripts' will be removed from distribution in near future.
Nov 22 07:33:35 compute-0 network[187374]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 22 07:33:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:33:37.300 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:33:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:33:37.301 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:33:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:33:37.301 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:33:40 compute-0 sudo[187646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyaeuponkrcamasodcmhxwozisrtchmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796820.2136626-130-180184389197440/AnsiballZ_systemd_service.py'
Nov 22 07:33:40 compute-0 sudo[187646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:40 compute-0 python3.9[187648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:33:40 compute-0 sudo[187646]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:41 compute-0 sudo[187799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apmkpwyhjeonpqwfugoftoysbmmpslei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796821.2896383-160-136036928579378/AnsiballZ_file.py'
Nov 22 07:33:41 compute-0 sudo[187799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:41 compute-0 python3.9[187801]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:41 compute-0 sudo[187799]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:41 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:33:41 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:33:42 compute-0 sudo[187952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhhclgpppbcuzdplbdhebgrmohgruadp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796822.1675153-184-90244725806214/AnsiballZ_file.py'
Nov 22 07:33:42 compute-0 sudo[187952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:42 compute-0 python3.9[187954]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:42 compute-0 sudo[187952]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:43 compute-0 sudo[188121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rswwndxsexofgocvdgoqwexvialczbrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796822.985461-211-4196630251669/AnsiballZ_command.py'
Nov 22 07:33:43 compute-0 sudo[188121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:43 compute-0 podman[188078]: 2025-11-22 07:33:43.610321583 +0000 UTC m=+0.053181282 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:33:43 compute-0 python3.9[188126]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:33:43 compute-0 sudo[188121]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:44 compute-0 python3.9[188278]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 07:33:45 compute-0 sudo[188428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gezvqnrllbkpbesimzgbyyrvbmblpcwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796825.0497696-265-90733956526619/AnsiballZ_systemd_service.py'
Nov 22 07:33:45 compute-0 sudo[188428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:45 compute-0 python3.9[188430]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:33:45 compute-0 systemd[1]: Reloading.
Nov 22 07:33:45 compute-0 systemd-rc-local-generator[188456]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:33:45 compute-0 systemd-sysv-generator[188459]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:33:45 compute-0 sudo[188428]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:46 compute-0 sudo[188615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vapusaovhxpobpmcgpgjqrccgcrjswrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796826.1655197-289-157215045833440/AnsiballZ_command.py'
Nov 22 07:33:46 compute-0 sudo[188615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:46 compute-0 python3.9[188617]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:33:46 compute-0 sudo[188615]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:47 compute-0 sudo[188779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpdcsiqaaivorpbieqcudtaeievoophk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796826.9848423-316-232152018276978/AnsiballZ_file.py'
Nov 22 07:33:47 compute-0 sudo[188779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:47 compute-0 podman[188742]: 2025-11-22 07:33:47.29912166 +0000 UTC m=+0.068106277 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 07:33:47 compute-0 python3.9[188789]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:33:47 compute-0 sudo[188779]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:48 compute-0 python3.9[188940]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:49 compute-0 python3.9[189092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:33:49 compute-0 python3.9[189213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796828.7829711-364-78327933698400/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:33:50 compute-0 sudo[189363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyfolejhwocxrqyfspeukmbzbpphmaqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796830.1472328-409-25288307841173/AnsiballZ_group.py'
Nov 22 07:33:50 compute-0 sudo[189363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:50 compute-0 python3.9[189365]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 22 07:33:50 compute-0 sudo[189363]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:51 compute-0 sudo[189515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbimnlbairsrabvebvukpkuytfdtyphk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796831.3625991-442-19158943812458/AnsiballZ_getent.py'
Nov 22 07:33:51 compute-0 sudo[189515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:51 compute-0 python3.9[189517]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 22 07:33:51 compute-0 sudo[189515]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:52 compute-0 sudo[189668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvbtnyvkpynaagdsigktstfokfvdaxxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796832.2197616-466-131590426840553/AnsiballZ_group.py'
Nov 22 07:33:52 compute-0 sudo[189668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:52 compute-0 python3.9[189670]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 22 07:33:52 compute-0 groupadd[189671]: group added to /etc/group: name=ceilometer, GID=42405
Nov 22 07:33:52 compute-0 groupadd[189671]: group added to /etc/gshadow: name=ceilometer
Nov 22 07:33:52 compute-0 groupadd[189671]: new group: name=ceilometer, GID=42405
Nov 22 07:33:52 compute-0 sudo[189668]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:53 compute-0 sudo[189826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocpsowxhtsoscrtqlxochhbqqdhybcsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796833.0349483-490-163626014163369/AnsiballZ_user.py'
Nov 22 07:33:53 compute-0 sudo[189826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:33:53 compute-0 python3.9[189828]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 22 07:33:53 compute-0 useradd[189830]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 22 07:33:53 compute-0 useradd[189830]: add 'ceilometer' to group 'libvirt'
Nov 22 07:33:53 compute-0 useradd[189830]: add 'ceilometer' to shadow group 'libvirt'
Nov 22 07:33:54 compute-0 sudo[189826]: pam_unix(sudo:session): session closed for user root
Nov 22 07:33:55 compute-0 python3.9[189986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:33:55 compute-0 python3.9[190107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796834.8564355-568-196145727573414/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:56 compute-0 python3.9[190257]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:33:56 compute-0 python3.9[190378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796836.0275536-568-153030252405889/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:57 compute-0 python3.9[190528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:33:58 compute-0 python3.9[190649]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763796837.1199918-568-125880007839408/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:33:58 compute-0 python3.9[190799]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:33:59 compute-0 python3.9[190951]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:34:00 compute-0 python3.9[191103]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:00 compute-0 python3.9[191224]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796839.9520175-745-33691486320878/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:01 compute-0 python3.9[191374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:01 compute-0 sshd-session[191451]: Connection closed by 172.105.102.42 port 9214
Nov 22 07:34:01 compute-0 sshd-session[191453]: Connection closed by 172.105.102.42 port 9262
Nov 22 07:34:01 compute-0 sshd-session[191452]: error: Protocol major versions differ: 2 vs. 1
Nov 22 07:34:01 compute-0 sshd-session[191455]: error: Protocol major versions differ: 2 vs. 1
Nov 22 07:34:01 compute-0 sshd-session[191452]: banner exchange: Connection from 172.105.102.42 port 9224: could not read protocol version
Nov 22 07:34:01 compute-0 sshd-session[191455]: banner exchange: Connection from 172.105.102.42 port 9240: could not read protocol version
Nov 22 07:34:01 compute-0 sshd-session[191454]: Unable to negotiate with 172.105.102.42 port 9230: no matching key exchange method found. Their offer: diffie-hellman-group1-sha1 [preauth]
Nov 22 07:34:01 compute-0 sshd-session[191459]: Unable to negotiate with 172.105.102.42 port 9264: no matching host key type found. Their offer: ssh-dss [preauth]
Nov 22 07:34:01 compute-0 sshd-session[191456]: Invalid user gsqjp from 172.105.102.42 port 9252
Nov 22 07:34:01 compute-0 sshd-session[191456]: Connection closed by invalid user gsqjp 172.105.102.42 port 9252 [preauth]
Nov 22 07:34:01 compute-0 sshd-session[191461]: Unable to negotiate with 172.105.102.42 port 9270: no matching host key type found. Their offer: ssh-rsa [preauth]
Nov 22 07:34:01 compute-0 python3.9[191450]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:02 compute-0 sshd-session[191463]: Connection closed by 172.105.102.42 port 9274 [preauth]
Nov 22 07:34:02 compute-0 sshd-session[191488]: Unable to negotiate with 172.105.102.42 port 9290: no matching host key type found. Their offer: ecdsa-sha2-nistp384 [preauth]
Nov 22 07:34:02 compute-0 sshd-session[191492]: Unable to negotiate with 172.105.102.42 port 9306: no matching host key type found. Their offer: ecdsa-sha2-nistp521 [preauth]
Nov 22 07:34:02 compute-0 sshd-session[191516]: Connection closed by 172.105.102.42 port 9322 [preauth]
Nov 22 07:34:02 compute-0 python3.9[191620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:03 compute-0 python3.9[191741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796842.10383-745-93325875200817/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:03 compute-0 nova_compute[186544]: 2025-11-22 07:34:03.141 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:03 compute-0 nova_compute[186544]: 2025-11-22 07:34:03.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:03 compute-0 python3.9[191891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:04 compute-0 python3.9[192012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796843.2423122-745-72111169417728/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:04 compute-0 python3.9[192162]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:05 compute-0 podman[192257]: 2025-11-22 07:34:05.269317874 +0000 UTC m=+0.117663370 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:34:05 compute-0 python3.9[192294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796844.3783193-745-218380391165114/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:06 compute-0 python3.9[192457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:06 compute-0 python3.9[192578]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796845.5359817-745-242794041326350/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:07 compute-0 python3.9[192728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:07 compute-0 python3.9[192849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796846.7350302-745-257380071885333/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:08 compute-0 python3.9[192999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:08 compute-0 python3.9[193120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796847.9595172-745-141063155665639/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:09 compute-0 python3.9[193270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:10 compute-0 python3.9[193391]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796849.0391245-745-49786097097260/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:10 compute-0 python3.9[193541]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:11 compute-0 python3.9[193662]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796850.178862-745-210038638953258/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:11 compute-0 python3.9[193812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:12 compute-0 python3.9[193933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763796851.2662234-745-220245632296130/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:13 compute-0 podman[194057]: 2025-11-22 07:34:13.857198987 +0000 UTC m=+0.057786450 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 07:34:14 compute-0 python3.9[194097]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:14 compute-0 python3.9[194177]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:15 compute-0 python3.9[194327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:15 compute-0 python3.9[194403]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:16 compute-0 python3.9[194553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:16 compute-0 python3.9[194629]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:17 compute-0 sudo[194796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bntbdkvdyhnnaklokejhidfzimpramvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796857.1171694-1312-42924377970427/AnsiballZ_file.py'
Nov 22 07:34:17 compute-0 sudo[194796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:17 compute-0 podman[194753]: 2025-11-22 07:34:17.444474472 +0000 UTC m=+0.089412645 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 07:34:17 compute-0 python3.9[194801]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:17 compute-0 sudo[194796]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:18 compute-0 sudo[194951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jomahminpwzahwamhjjnmzdakonmhbvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796857.8380125-1336-236496208503138/AnsiballZ_file.py'
Nov 22 07:34:18 compute-0 sudo[194951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:18 compute-0 python3.9[194953]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:18 compute-0 sudo[194951]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:18 compute-0 sudo[195103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoxrtdptffruxvwluxtgwynnhulcwltx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796858.5627987-1360-130179806369525/AnsiballZ_file.py'
Nov 22 07:34:18 compute-0 sudo[195103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:19 compute-0 python3.9[195105]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:34:19 compute-0 sudo[195103]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:19 compute-0 sudo[195255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xytzjuzveekfpntjcqeybwtpugocqjfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796859.2852073-1384-139921942833990/AnsiballZ_systemd_service.py'
Nov 22 07:34:19 compute-0 sudo[195255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:19 compute-0 python3.9[195257]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:34:19 compute-0 systemd[1]: Reloading.
Nov 22 07:34:20 compute-0 systemd-rc-local-generator[195285]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:34:20 compute-0 systemd-sysv-generator[195288]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:34:20 compute-0 systemd[1]: Listening on Podman API Socket.
Nov 22 07:34:20 compute-0 sudo[195255]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:21 compute-0 sudo[195445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgavtldfkqouyxgpmistqtuxgzixmrko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796860.7359347-1411-253332520613449/AnsiballZ_stat.py'
Nov 22 07:34:21 compute-0 sudo[195445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:21 compute-0 python3.9[195447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:21 compute-0 sudo[195445]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:21 compute-0 sudo[195568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oucdnaqgsxupyjytwfjjbpbdiyqmfgwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796860.7359347-1411-253332520613449/AnsiballZ_copy.py'
Nov 22 07:34:21 compute-0 sudo[195568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:21 compute-0 python3.9[195570]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796860.7359347-1411-253332520613449/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:34:21 compute-0 sudo[195568]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:22 compute-0 sudo[195644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrotcmscwaznxyrahooazfxftmyyxywk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796860.7359347-1411-253332520613449/AnsiballZ_stat.py'
Nov 22 07:34:22 compute-0 sudo[195644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:22 compute-0 python3.9[195646]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:22 compute-0 sudo[195644]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:22 compute-0 sudo[195767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crynwmnwwhhvxqrsirlakqoiyofudnru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796860.7359347-1411-253332520613449/AnsiballZ_copy.py'
Nov 22 07:34:22 compute-0 sudo[195767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:22 compute-0 python3.9[195769]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796860.7359347-1411-253332520613449/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:34:22 compute-0 sudo[195767]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:23 compute-0 sudo[195919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wamswulklyxrqpeszyjcmgrddeefpqxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796863.3524394-1495-137727440735839/AnsiballZ_container_config_data.py'
Nov 22 07:34:23 compute-0 sudo[195919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:24 compute-0 python3.9[195921]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 22 07:34:24 compute-0 sudo[195919]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:25 compute-0 sudo[196071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrsdmgjpgzjzcvkesdvqqoydktoyhdbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796864.596775-1522-247508428572151/AnsiballZ_container_config_hash.py'
Nov 22 07:34:25 compute-0 sudo[196071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:25 compute-0 python3.9[196073]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:34:25 compute-0 sudo[196071]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.166 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.182 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.183 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.183 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.183 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.183 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.184 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.184 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.184 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.215 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.216 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.216 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.216 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:34:26 compute-0 sudo[196223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hadjqutpzcumqcyytbzkdnfzlirudtpz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796865.7500322-1552-5503575635252/AnsiballZ_edpm_container_manage.py'
Nov 22 07:34:26 compute-0 sudo[196223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.400 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.402 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6196MB free_disk=73.66284942626953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.402 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.403 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.457 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.457 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.477 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.489 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.491 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:34:26 compute-0 nova_compute[186544]: 2025-11-22 07:34:26.491 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:34:26 compute-0 python3[196225]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:34:26 compute-0 podman[196262]: 2025-11-22 07:34:26.759162977 +0000 UTC m=+0.023416340 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 22 07:34:27 compute-0 podman[196262]: 2025-11-22 07:34:27.010872557 +0000 UTC m=+0.275125900 container create a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118)
Nov 22 07:34:27 compute-0 python3[196225]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 22 07:34:27 compute-0 sudo[196223]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:27 compute-0 sudo[196450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewhcspdhqabvgfqhjomjdhvgeazfhxqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796867.3298666-1576-194427904288367/AnsiballZ_stat.py'
Nov 22 07:34:27 compute-0 sudo[196450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:27 compute-0 python3.9[196452]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:34:27 compute-0 sudo[196450]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:28 compute-0 sudo[196604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqxpgborvcjsjdsuqejvxvquusncelsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796868.1496787-1603-142590410779463/AnsiballZ_file.py'
Nov 22 07:34:28 compute-0 sudo[196604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:28 compute-0 python3.9[196606]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:28 compute-0 sudo[196604]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:29 compute-0 sudo[196755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmmdsmkbmrlzomaoaofpgyjovukaoucy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796868.7354813-1603-7698880171567/AnsiballZ_copy.py'
Nov 22 07:34:29 compute-0 sudo[196755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:29 compute-0 python3.9[196757]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796868.7354813-1603-7698880171567/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:29 compute-0 sudo[196755]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:29 compute-0 sudo[196831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peaidwieiubnujnmebshzavercnqtvxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796868.7354813-1603-7698880171567/AnsiballZ_systemd.py'
Nov 22 07:34:29 compute-0 sudo[196831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:30 compute-0 python3.9[196833]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:34:30 compute-0 systemd[1]: Reloading.
Nov 22 07:34:30 compute-0 systemd-rc-local-generator[196858]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:34:30 compute-0 systemd-sysv-generator[196862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:34:30 compute-0 sudo[196831]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:30 compute-0 sudo[196942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhvowbypjgligcmratslnoonihkqahcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796868.7354813-1603-7698880171567/AnsiballZ_systemd.py'
Nov 22 07:34:30 compute-0 sudo[196942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:31 compute-0 python3.9[196944]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:34:31 compute-0 systemd[1]: Reloading.
Nov 22 07:34:31 compute-0 systemd-rc-local-generator[196974]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:34:31 compute-0 systemd-sysv-generator[196977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:34:31 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 22 07:34:31 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:34:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:31 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.
Nov 22 07:34:32 compute-0 podman[196984]: 2025-11-22 07:34:32.1298074 +0000 UTC m=+0.522347942 container init a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + sudo -E kolla_set_configs
Nov 22 07:34:32 compute-0 podman[196984]: 2025-11-22 07:34:32.155065473 +0000 UTC m=+0.547606015 container start a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Nov 22 07:34:32 compute-0 sudo[197005]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: sudo: unable to send audit message: Operation not permitted
Nov 22 07:34:32 compute-0 sudo[197005]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 07:34:32 compute-0 sudo[197005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 22 07:34:32 compute-0 podman[196984]: ceilometer_agent_compute
Nov 22 07:34:32 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Validating config file
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Copying service configuration files
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: INFO:__main__:Writing out command to execute
Nov 22 07:34:32 compute-0 sudo[197005]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: ++ cat /run_command
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + ARGS=
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + sudo kolla_copy_cacerts
Nov 22 07:34:32 compute-0 sudo[196942]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:32 compute-0 sudo[197022]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: sudo: unable to send audit message: Operation not permitted
Nov 22 07:34:32 compute-0 sudo[197022]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 07:34:32 compute-0 sudo[197022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 22 07:34:32 compute-0 sudo[197022]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + [[ ! -n '' ]]
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + . kolla_extend_start
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + umask 0022
Nov 22 07:34:32 compute-0 ceilometer_agent_compute[196999]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 22 07:34:32 compute-0 podman[197006]: 2025-11-22 07:34:32.252014777 +0000 UTC m=+0.085882761 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 07:34:32 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-187770595efc35a3.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 07:34:32 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-187770595efc35a3.service: Failed with result 'exit-code'.
Nov 22 07:34:33 compute-0 sudo[197180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbohbsrqisetoyqnkwqvesgqbbtdmrqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796872.8157635-1675-32741322715486/AnsiballZ_systemd.py'
Nov 22 07:34:33 compute-0 sudo[197180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.182 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.182 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.182 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.182 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.182 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.182 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.182 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.183 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.184 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.185 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.186 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.187 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.188 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.189 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.190 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.191 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.192 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.192 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.192 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.192 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.192 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.192 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.192 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.193 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.193 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.193 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.193 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.193 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.193 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.193 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.193 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.194 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.194 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.194 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.194 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.194 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.194 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.194 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.194 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.195 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.196 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.197 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.198 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.199 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.199 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.199 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.199 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.216 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.217 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.218 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.307 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.382 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.383 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.383 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.383 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.383 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.383 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.383 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.383 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.383 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.384 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.385 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.386 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.387 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.388 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.389 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.390 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.391 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.392 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.393 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.394 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.398 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.399 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.400 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.401 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.402 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.403 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.403 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.403 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.403 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.403 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.405 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.412 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.416 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.416 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.416 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.416 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.418 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:33 compute-0 python3.9[197182]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:34:33 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.853 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.955 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.955 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.956 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 22 07:34:33 compute-0 ceilometer_agent_compute[196999]: 2025-11-22 07:34:33.969 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 22 07:34:33 compute-0 virtqemud[186092]: End of file while reading data: Input/output error
Nov 22 07:34:33 compute-0 virtqemud[186092]: End of file while reading data: Input/output error
Nov 22 07:34:34 compute-0 systemd[1]: libpod-a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.scope: Deactivated successfully.
Nov 22 07:34:34 compute-0 systemd[1]: libpod-a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.scope: Consumed 1.506s CPU time.
Nov 22 07:34:34 compute-0 podman[197192]: 2025-11-22 07:34:34.192542666 +0000 UTC m=+0.702248216 container died a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:34:34 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-187770595efc35a3.timer: Deactivated successfully.
Nov 22 07:34:34 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.
Nov 22 07:34:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-userdata-shm.mount: Deactivated successfully.
Nov 22 07:34:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7-merged.mount: Deactivated successfully.
Nov 22 07:34:34 compute-0 podman[197192]: 2025-11-22 07:34:34.756224324 +0000 UTC m=+1.265929874 container cleanup a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 07:34:34 compute-0 podman[197192]: ceilometer_agent_compute
Nov 22 07:34:34 compute-0 podman[197219]: ceilometer_agent_compute
Nov 22 07:34:34 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 22 07:34:34 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 22 07:34:34 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 22 07:34:35 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:34:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49381b935dd6407551a95d6630bce8e2d570a0ec1b16bcc31cd58a57c5b8ebc7/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:35 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.
Nov 22 07:34:35 compute-0 podman[197232]: 2025-11-22 07:34:35.412626935 +0000 UTC m=+0.559818106 container init a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + sudo -E kolla_set_configs
Nov 22 07:34:35 compute-0 podman[197232]: 2025-11-22 07:34:35.43461519 +0000 UTC m=+0.581806331 container start a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: sudo: unable to send audit message: Operation not permitted
Nov 22 07:34:35 compute-0 sudo[197264]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 22 07:34:35 compute-0 sudo[197264]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 07:34:35 compute-0 sudo[197264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Validating config file
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Copying service configuration files
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: INFO:__main__:Writing out command to execute
Nov 22 07:34:35 compute-0 sudo[197264]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: ++ cat /run_command
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + ARGS=
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + sudo kolla_copy_cacerts
Nov 22 07:34:35 compute-0 sudo[197289]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: sudo: unable to send audit message: Operation not permitted
Nov 22 07:34:35 compute-0 sudo[197289]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 22 07:34:35 compute-0 sudo[197289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 22 07:34:35 compute-0 sudo[197289]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + [[ ! -n '' ]]
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + . kolla_extend_start
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + umask 0022
Nov 22 07:34:35 compute-0 ceilometer_agent_compute[197247]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 22 07:34:35 compute-0 podman[197232]: ceilometer_agent_compute
Nov 22 07:34:35 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 22 07:34:35 compute-0 podman[197251]: 2025-11-22 07:34:35.582048591 +0000 UTC m=+0.232556514 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:34:35 compute-0 sudo[197180]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:35 compute-0 podman[197267]: 2025-11-22 07:34:35.628037248 +0000 UTC m=+0.184889335 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:34:35 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-173a7a48cbf9c8ec.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 07:34:35 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-173a7a48cbf9c8ec.service: Failed with result 'exit-code'.
Nov 22 07:34:36 compute-0 sudo[197453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzwuyfzhgcznvwegjwfpahcpoyiykjlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796876.1098325-1699-217724098107364/AnsiballZ_stat.py'
Nov 22 07:34:36 compute-0 sudo[197453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.371 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.372 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.373 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.374 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.375 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.376 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.377 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.377 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.377 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.377 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.377 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.377 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.377 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.377 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.378 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.379 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.380 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.381 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.382 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.383 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.384 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.385 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.386 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.387 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.387 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.387 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.387 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.387 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.387 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.387 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.404 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.406 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.407 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.418 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.556 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.556 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.556 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.556 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.557 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.558 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.559 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.560 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.561 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.562 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 python3.9[197455]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.563 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.564 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.565 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.566 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.566 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.566 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.566 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.566 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.566 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.567 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.567 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.567 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.567 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.567 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.567 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.567 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.567 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.568 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.569 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.569 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.569 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.569 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.569 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.569 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.569 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.569 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.570 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.571 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.572 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.573 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.574 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.575 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.576 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.577 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.578 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.579 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.581 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.587 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 22 07:34:36 compute-0 sudo[197453]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:34:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:34:36 compute-0 sudo[197582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oypajfyjzyjkqrhrwbbgbumnajpelltd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796876.1098325-1699-217724098107364/AnsiballZ_copy.py'
Nov 22 07:34:36 compute-0 sudo[197582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:37 compute-0 python3.9[197584]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796876.1098325-1699-217724098107364/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:34:37 compute-0 sudo[197582]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:34:37.301 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:34:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:34:37.301 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:34:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:34:37.301 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:34:37 compute-0 sudo[197734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpdkpiojbncakoqzwsfielblfuafqbpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796877.734998-1750-111474935698415/AnsiballZ_container_config_data.py'
Nov 22 07:34:37 compute-0 sudo[197734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:38 compute-0 python3.9[197736]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 22 07:34:38 compute-0 sudo[197734]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:38 compute-0 sudo[197886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwdmswjqbnacttodlgysbewirjbpngym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796878.641857-1777-56693653563159/AnsiballZ_container_config_hash.py'
Nov 22 07:34:38 compute-0 sudo[197886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:39 compute-0 python3.9[197888]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:34:39 compute-0 sudo[197886]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:39 compute-0 sudo[198038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytnjfafgawrceeeqtztvkrpcahiacviw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796879.6237094-1807-153657772251667/AnsiballZ_edpm_container_manage.py'
Nov 22 07:34:39 compute-0 sudo[198038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:40 compute-0 python3[198040]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:34:40 compute-0 podman[198075]: 2025-11-22 07:34:40.365306699 +0000 UTC m=+0.019583189 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 22 07:34:41 compute-0 podman[198075]: 2025-11-22 07:34:41.158242061 +0000 UTC m=+0.812518531 container create 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:34:41 compute-0 python3[198040]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 22 07:34:41 compute-0 sudo[198038]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:41 compute-0 sudo[198261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vevuaurhvmaibqifvkxxapvazqdculfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796881.4854696-1831-101390731594758/AnsiballZ_stat.py'
Nov 22 07:34:41 compute-0 sudo[198261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:42 compute-0 python3.9[198263]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:34:42 compute-0 sudo[198261]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:42 compute-0 sudo[198415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qspybmrctfowengocxcltfbtvhvrnids ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796882.447922-1858-75812963777579/AnsiballZ_file.py'
Nov 22 07:34:42 compute-0 sudo[198415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:42 compute-0 python3.9[198417]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:42 compute-0 sudo[198415]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:43 compute-0 sudo[198566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znizlbiigqzwzshmkqhlhfsttuyiasto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796883.034945-1858-6195769063846/AnsiballZ_copy.py'
Nov 22 07:34:43 compute-0 sudo[198566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:43 compute-0 python3.9[198568]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796883.034945-1858-6195769063846/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:34:43 compute-0 sudo[198566]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:43 compute-0 sudo[198654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdhsnocpnkaetehiciifeleabjvoviiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796883.034945-1858-6195769063846/AnsiballZ_systemd.py'
Nov 22 07:34:43 compute-0 sudo[198654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:43 compute-0 podman[198616]: 2025-11-22 07:34:43.987651201 +0000 UTC m=+0.062776490 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:34:44 compute-0 python3.9[198662]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:34:44 compute-0 systemd[1]: Reloading.
Nov 22 07:34:44 compute-0 systemd-rc-local-generator[198688]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:34:44 compute-0 systemd-sysv-generator[198692]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:34:44 compute-0 sudo[198654]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:44 compute-0 sudo[198771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gioovxijmruexrwjjxmpubllbwufbjuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796883.034945-1858-6195769063846/AnsiballZ_systemd.py'
Nov 22 07:34:44 compute-0 sudo[198771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:45 compute-0 python3.9[198773]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:34:45 compute-0 systemd[1]: Reloading.
Nov 22 07:34:45 compute-0 systemd-rc-local-generator[198802]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:34:45 compute-0 systemd-sysv-generator[198806]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:34:45 compute-0 systemd[1]: Starting node_exporter container...
Nov 22 07:34:46 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:34:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d59637fc8d1c7f2fdda9ff430494fd819d05751693c2fc36ffbf219c2e8fe7b/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d59637fc8d1c7f2fdda9ff430494fd819d05751693c2fc36ffbf219c2e8fe7b/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:46 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.
Nov 22 07:34:46 compute-0 podman[198812]: 2025-11-22 07:34:46.642460305 +0000 UTC m=+0.810347948 container init 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.655Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.655Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.655Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.655Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.655Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:117 level=info collector=arp
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:117 level=info collector=bcache
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:117 level=info collector=bonding
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.656Z caller=node_exporter.go:117 level=info collector=cpu
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=edac
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=filefd
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=netclass
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=netdev
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=netstat
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=nfs
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=nvme
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=softnet
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=systemd
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=xfs
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=node_exporter.go:117 level=info collector=zfs
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.657Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 22 07:34:46 compute-0 node_exporter[198828]: ts=2025-11-22T07:34:46.658Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 22 07:34:46 compute-0 podman[198812]: 2025-11-22 07:34:46.670873413 +0000 UTC m=+0.838761076 container start 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:34:46 compute-0 podman[198812]: node_exporter
Nov 22 07:34:46 compute-0 systemd[1]: Started node_exporter container.
Nov 22 07:34:46 compute-0 sudo[198771]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:46 compute-0 podman[198837]: 2025-11-22 07:34:46.910379061 +0000 UTC m=+0.232861330 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:34:47 compute-0 sudo[199023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xryolsadoorznhfcgwrrbtqnnwqerlvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796887.3695548-1930-14916958652303/AnsiballZ_systemd.py'
Nov 22 07:34:47 compute-0 sudo[199023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:47 compute-0 podman[198985]: 2025-11-22 07:34:47.688351565 +0000 UTC m=+0.066261093 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 07:34:48 compute-0 python3.9[199030]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:34:48 compute-0 systemd[1]: Stopping node_exporter container...
Nov 22 07:34:48 compute-0 systemd[1]: libpod-0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.scope: Deactivated successfully.
Nov 22 07:34:48 compute-0 conmon[198828]: conmon 0017e37c66cba4b32b08 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.scope/container/memory.events
Nov 22 07:34:48 compute-0 podman[199034]: 2025-11-22 07:34:48.673735791 +0000 UTC m=+0.598450969 container died 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 07:34:49 compute-0 systemd[1]: 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3-79760a8ca28f5d1b.timer: Deactivated successfully.
Nov 22 07:34:49 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.
Nov 22 07:34:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3-userdata-shm.mount: Deactivated successfully.
Nov 22 07:34:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-0d59637fc8d1c7f2fdda9ff430494fd819d05751693c2fc36ffbf219c2e8fe7b-merged.mount: Deactivated successfully.
Nov 22 07:34:50 compute-0 podman[199034]: 2025-11-22 07:34:50.453140324 +0000 UTC m=+2.377855502 container cleanup 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:34:50 compute-0 podman[199034]: node_exporter
Nov 22 07:34:50 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 07:34:50 compute-0 podman[199063]: node_exporter
Nov 22 07:34:50 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 22 07:34:50 compute-0 systemd[1]: Stopped node_exporter container.
Nov 22 07:34:50 compute-0 systemd[1]: Starting node_exporter container...
Nov 22 07:34:50 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:34:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d59637fc8d1c7f2fdda9ff430494fd819d05751693c2fc36ffbf219c2e8fe7b/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d59637fc8d1c7f2fdda9ff430494fd819d05751693c2fc36ffbf219c2e8fe7b/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 07:34:51 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.
Nov 22 07:34:51 compute-0 podman[199076]: 2025-11-22 07:34:51.272805213 +0000 UTC m=+0.725118073 container init 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.284Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.285Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.285Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.286Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.286Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.286Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.286Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.286Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.286Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.286Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=arp
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=bcache
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=bonding
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=cpu
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=edac
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=filefd
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=netclass
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=netdev
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=netstat
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=nfs
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=nvme
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=softnet
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=systemd
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=xfs
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.287Z caller=node_exporter.go:117 level=info collector=zfs
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.288Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 22 07:34:51 compute-0 node_exporter[199091]: ts=2025-11-22T07:34:51.288Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 22 07:34:51 compute-0 podman[199076]: 2025-11-22 07:34:51.300570886 +0000 UTC m=+0.752883726 container start 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:34:51 compute-0 podman[199076]: node_exporter
Nov 22 07:34:51 compute-0 systemd[1]: Started node_exporter container.
Nov 22 07:34:51 compute-0 sudo[199023]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:51 compute-0 podman[199101]: 2025-11-22 07:34:51.534979132 +0000 UTC m=+0.224931881 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:34:52 compute-0 sudo[199272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dabtednrktncsueavlvskjftvmfdkqod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796891.7712803-1954-9452650847579/AnsiballZ_stat.py'
Nov 22 07:34:52 compute-0 sudo[199272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:52 compute-0 python3.9[199274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:34:52 compute-0 sudo[199272]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:52 compute-0 sudo[199395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ondtwghreefqqcdalsjrqlquopqpynph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796891.7712803-1954-9452650847579/AnsiballZ_copy.py'
Nov 22 07:34:52 compute-0 sudo[199395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:52 compute-0 python3.9[199397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796891.7712803-1954-9452650847579/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:34:52 compute-0 sudo[199395]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:53 compute-0 sudo[199547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahcstrkxwnxtjtzcxavvdgriqocfjfnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796893.5578682-2005-205243739622993/AnsiballZ_container_config_data.py'
Nov 22 07:34:53 compute-0 sudo[199547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:54 compute-0 python3.9[199549]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 22 07:34:54 compute-0 sudo[199547]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:54 compute-0 sudo[199699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxuspedezvaucsqmeostcjdrqnkfturb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796894.6917098-2032-150561417957719/AnsiballZ_container_config_hash.py'
Nov 22 07:34:54 compute-0 sudo[199699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:55 compute-0 python3.9[199701]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:34:55 compute-0 sudo[199699]: pam_unix(sudo:session): session closed for user root
Nov 22 07:34:56 compute-0 sudo[199851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-harycduaaplzzqwklxatqtbfpilixaev ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796895.7481337-2062-7737433494140/AnsiballZ_edpm_container_manage.py'
Nov 22 07:34:56 compute-0 sudo[199851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:34:56 compute-0 python3[199853]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:35:01 compute-0 podman[199866]: 2025-11-22 07:35:01.393531183 +0000 UTC m=+4.905940985 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 22 07:35:01 compute-0 podman[199962]: 2025-11-22 07:35:01.557227123 +0000 UTC m=+0.065465562 container create d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 07:35:01 compute-0 podman[199962]: 2025-11-22 07:35:01.514166267 +0000 UTC m=+0.022404736 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 22 07:35:01 compute-0 python3[199853]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 22 07:35:01 compute-0 sudo[199851]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:02 compute-0 sudo[200150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdqaekfiqwazzuduxfquzywniuuwppyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796901.905044-2086-138167915263266/AnsiballZ_stat.py'
Nov 22 07:35:02 compute-0 sudo[200150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:02 compute-0 python3.9[200152]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:35:02 compute-0 sudo[200150]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:03 compute-0 sudo[200304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqyyaiwvzjsgnvilafmhdqicgrpzyuyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796902.9314365-2113-264762319578760/AnsiballZ_file.py'
Nov 22 07:35:03 compute-0 sudo[200304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:03 compute-0 python3.9[200306]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:03 compute-0 sudo[200304]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:03 compute-0 sudo[200455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myflxtllkqlplpzrqubazsjvvvfwnpvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796903.4843059-2113-222463595883216/AnsiballZ_copy.py'
Nov 22 07:35:03 compute-0 sudo[200455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:04 compute-0 python3.9[200457]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796903.4843059-2113-222463595883216/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:04 compute-0 sudo[200455]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:04 compute-0 sudo[200531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nknswwlhyloakekyfsvwiyyapbferzvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796903.4843059-2113-222463595883216/AnsiballZ_systemd.py'
Nov 22 07:35:04 compute-0 sudo[200531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:04 compute-0 python3.9[200533]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:35:04 compute-0 systemd[1]: Reloading.
Nov 22 07:35:04 compute-0 systemd-rc-local-generator[200560]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:35:04 compute-0 systemd-sysv-generator[200563]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:35:05 compute-0 sudo[200531]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:05 compute-0 sudo[200641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbqdzrxwkekpgovfucjrmkegjbjxtyen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796903.4843059-2113-222463595883216/AnsiballZ_systemd.py'
Nov 22 07:35:05 compute-0 sudo[200641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:05 compute-0 python3.9[200643]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:35:05 compute-0 systemd[1]: Reloading.
Nov 22 07:35:05 compute-0 podman[200645]: 2025-11-22 07:35:05.823100626 +0000 UTC m=+0.052339573 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 22 07:35:05 compute-0 podman[200646]: 2025-11-22 07:35:05.855723179 +0000 UTC m=+0.085350896 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:35:05 compute-0 systemd-rc-local-generator[200713]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:35:05 compute-0 systemd-sysv-generator[200716]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:35:06 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-173a7a48cbf9c8ec.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 07:35:06 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-173a7a48cbf9c8ec.service: Failed with result 'exit-code'.
Nov 22 07:35:06 compute-0 systemd[1]: Starting podman_exporter container...
Nov 22 07:35:06 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:35:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6f48baba264db9a9bca34ada538dba5a89e35e4df0f169d1d390ff4fa174e70/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6f48baba264db9a9bca34ada538dba5a89e35e4df0f169d1d390ff4fa174e70/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:06 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9.
Nov 22 07:35:06 compute-0 podman[200724]: 2025-11-22 07:35:06.801120405 +0000 UTC m=+0.653345995 container init d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:35:06 compute-0 podman_exporter[200739]: ts=2025-11-22T07:35:06.816Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 22 07:35:06 compute-0 podman_exporter[200739]: ts=2025-11-22T07:35:06.816Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 22 07:35:06 compute-0 podman_exporter[200739]: ts=2025-11-22T07:35:06.816Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 22 07:35:06 compute-0 podman_exporter[200739]: ts=2025-11-22T07:35:06.816Z caller=handler.go:105 level=info collector=container
Nov 22 07:35:06 compute-0 podman[200724]: 2025-11-22 07:35:06.823800987 +0000 UTC m=+0.676026557 container start d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 07:35:06 compute-0 systemd[1]: Starting Podman API Service...
Nov 22 07:35:06 compute-0 systemd[1]: Started Podman API Service.
Nov 22 07:35:06 compute-0 podman[200750]: time="2025-11-22T07:35:06Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 22 07:35:06 compute-0 podman[200750]: time="2025-11-22T07:35:06Z" level=info msg="Setting parallel job count to 25"
Nov 22 07:35:06 compute-0 podman[200750]: time="2025-11-22T07:35:06Z" level=info msg="Using sqlite as database backend"
Nov 22 07:35:07 compute-0 podman[200724]: podman_exporter
Nov 22 07:35:07 compute-0 systemd[1]: Started podman_exporter container.
Nov 22 07:35:07 compute-0 podman[200750]: time="2025-11-22T07:35:07Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 22 07:35:07 compute-0 podman[200750]: time="2025-11-22T07:35:07Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 22 07:35:07 compute-0 podman[200750]: time="2025-11-22T07:35:07Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 22 07:35:07 compute-0 podman[200750]: @ - - [22/Nov/2025:07:35:07 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 22 07:35:07 compute-0 podman[200750]: time="2025-11-22T07:35:07Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 07:35:07 compute-0 sudo[200641]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:07 compute-0 podman[200750]: @ - - [22/Nov/2025:07:35:07 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19569 "" "Go-http-client/1.1"
Nov 22 07:35:07 compute-0 podman_exporter[200739]: ts=2025-11-22T07:35:07.077Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 22 07:35:07 compute-0 podman_exporter[200739]: ts=2025-11-22T07:35:07.078Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 22 07:35:07 compute-0 podman_exporter[200739]: ts=2025-11-22T07:35:07.078Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 22 07:35:07 compute-0 podman[200748]: 2025-11-22 07:35:07.124755245 +0000 UTC m=+0.291464618 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:35:08 compute-0 sudo[200936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckedfcrrfhwuwzuztcawlmaqscznhcud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796908.0917945-2185-140044685811492/AnsiballZ_systemd.py'
Nov 22 07:35:08 compute-0 sudo[200936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:08 compute-0 python3.9[200938]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:35:08 compute-0 systemd[1]: Stopping podman_exporter container...
Nov 22 07:35:09 compute-0 podman[200750]: @ - - [22/Nov/2025:07:35:07 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1448 "" "Go-http-client/1.1"
Nov 22 07:35:09 compute-0 systemd[1]: libpod-d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9.scope: Deactivated successfully.
Nov 22 07:35:09 compute-0 podman[200942]: 2025-11-22 07:35:09.09049279 +0000 UTC m=+0.210388326 container died d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 07:35:09 compute-0 systemd[1]: d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9-6de31ba226f0b1ff.timer: Deactivated successfully.
Nov 22 07:35:09 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9.
Nov 22 07:35:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9-userdata-shm.mount: Deactivated successfully.
Nov 22 07:35:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-f6f48baba264db9a9bca34ada538dba5a89e35e4df0f169d1d390ff4fa174e70-merged.mount: Deactivated successfully.
Nov 22 07:35:09 compute-0 podman[200942]: 2025-11-22 07:35:09.468545003 +0000 UTC m=+0.588440529 container cleanup d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:35:09 compute-0 podman[200942]: podman_exporter
Nov 22 07:35:09 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 07:35:09 compute-0 podman[200971]: podman_exporter
Nov 22 07:35:09 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 22 07:35:09 compute-0 systemd[1]: Stopped podman_exporter container.
Nov 22 07:35:09 compute-0 systemd[1]: Starting podman_exporter container...
Nov 22 07:35:09 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:35:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6f48baba264db9a9bca34ada538dba5a89e35e4df0f169d1d390ff4fa174e70/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6f48baba264db9a9bca34ada538dba5a89e35e4df0f169d1d390ff4fa174e70/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:09 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9.
Nov 22 07:35:09 compute-0 podman[200984]: 2025-11-22 07:35:09.926783655 +0000 UTC m=+0.380023371 container init d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:35:09 compute-0 podman_exporter[200999]: ts=2025-11-22T07:35:09.941Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 22 07:35:09 compute-0 podman_exporter[200999]: ts=2025-11-22T07:35:09.941Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 22 07:35:09 compute-0 podman_exporter[200999]: ts=2025-11-22T07:35:09.941Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 22 07:35:09 compute-0 podman_exporter[200999]: ts=2025-11-22T07:35:09.941Z caller=handler.go:105 level=info collector=container
Nov 22 07:35:09 compute-0 podman[200750]: @ - - [22/Nov/2025:07:35:09 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 22 07:35:09 compute-0 podman[200750]: time="2025-11-22T07:35:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 07:35:09 compute-0 podman[200984]: 2025-11-22 07:35:09.957733178 +0000 UTC m=+0.410972874 container start d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:35:09 compute-0 podman[200984]: podman_exporter
Nov 22 07:35:09 compute-0 podman[200750]: @ - - [22/Nov/2025:07:35:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19571 "" "Go-http-client/1.1"
Nov 22 07:35:09 compute-0 podman_exporter[200999]: ts=2025-11-22T07:35:09.969Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 22 07:35:09 compute-0 podman_exporter[200999]: ts=2025-11-22T07:35:09.969Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 22 07:35:09 compute-0 podman_exporter[200999]: ts=2025-11-22T07:35:09.970Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 22 07:35:09 compute-0 systemd[1]: Started podman_exporter container.
Nov 22 07:35:10 compute-0 sudo[200936]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:10 compute-0 podman[201010]: 2025-11-22 07:35:10.022303367 +0000 UTC m=+0.054481295 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:35:11 compute-0 sudo[201184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvminyeavdfsramzfizbiuxueqkrvntj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796910.916164-2209-146278018169307/AnsiballZ_stat.py'
Nov 22 07:35:11 compute-0 sudo[201184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:11 compute-0 python3.9[201186]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:35:11 compute-0 sudo[201184]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:11 compute-0 sudo[201307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkqqchpkjzeudlyfwlqpcukqyxbwumwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796910.916164-2209-146278018169307/AnsiballZ_copy.py'
Nov 22 07:35:11 compute-0 sudo[201307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:11 compute-0 python3.9[201309]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763796910.916164-2209-146278018169307/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 22 07:35:11 compute-0 sudo[201307]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:12 compute-0 sudo[201459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwmamogziiwetmnbopxuqoznhkjmosea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796912.4190605-2260-131503010512235/AnsiballZ_container_config_data.py'
Nov 22 07:35:12 compute-0 sudo[201459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:12 compute-0 python3.9[201461]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 22 07:35:12 compute-0 sudo[201459]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:13 compute-0 sudo[201611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyaqqcsrgtzsqirywpppojoxutpxxyhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796913.5195808-2287-186549661123240/AnsiballZ_container_config_hash.py'
Nov 22 07:35:13 compute-0 sudo[201611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:13 compute-0 python3.9[201613]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 22 07:35:13 compute-0 sudo[201611]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:14 compute-0 podman[201638]: 2025-11-22 07:35:14.404077618 +0000 UTC m=+0.051735739 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 22 07:35:14 compute-0 sudo[201783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hohbhrevjkssftpcwdnwslenllogsghb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763796914.5113916-2317-52230961443059/AnsiballZ_edpm_container_manage.py'
Nov 22 07:35:14 compute-0 sudo[201783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:15 compute-0 python3[201785]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 22 07:35:17 compute-0 podman[201797]: 2025-11-22 07:35:17.827621589 +0000 UTC m=+2.684379730 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 07:35:17 compute-0 podman[201893]: 2025-11-22 07:35:17.962497038 +0000 UTC m=+0.049108725 container create 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Nov 22 07:35:17 compute-0 podman[201893]: 2025-11-22 07:35:17.935046381 +0000 UTC m=+0.021658088 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 07:35:17 compute-0 python3[201785]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 22 07:35:18 compute-0 sudo[201783]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:18 compute-0 auditd[702]: Audit daemon rotating log files
Nov 22 07:35:18 compute-0 podman[202021]: 2025-11-22 07:35:18.411361742 +0000 UTC m=+0.061005544 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 07:35:18 compute-0 sudo[202101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urbdewsknzoeghfjgmiszrwrtahgirqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796918.2285013-2341-228554132619530/AnsiballZ_stat.py'
Nov 22 07:35:18 compute-0 sudo[202101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:18 compute-0 python3.9[202103]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:35:18 compute-0 sudo[202101]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:19 compute-0 sudo[202255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcagvnbackcnhisqpgmytlemwgdhhdmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796919.0978901-2368-25176192508725/AnsiballZ_file.py'
Nov 22 07:35:19 compute-0 sudo[202255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:19 compute-0 python3.9[202257]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:19 compute-0 sudo[202255]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:20 compute-0 sudo[202406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckaklhzgedosjtibopvvuqhlzengbqar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796919.6499813-2368-134958863336586/AnsiballZ_copy.py'
Nov 22 07:35:20 compute-0 sudo[202406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:20 compute-0 python3.9[202408]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763796919.6499813-2368-134958863336586/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:20 compute-0 sudo[202406]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:20 compute-0 sudo[202482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddbviquvekosmckrpifhujtqtavhusbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796919.6499813-2368-134958863336586/AnsiballZ_systemd.py'
Nov 22 07:35:20 compute-0 sudo[202482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:20 compute-0 python3.9[202484]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 22 07:35:20 compute-0 systemd[1]: Reloading.
Nov 22 07:35:20 compute-0 systemd-rc-local-generator[202509]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:35:20 compute-0 systemd-sysv-generator[202514]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:35:21 compute-0 sudo[202482]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:21 compute-0 sudo[202593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnfkqrdgtgthqcniturqwjmhvbzakkol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796919.6499813-2368-134958863336586/AnsiballZ_systemd.py'
Nov 22 07:35:21 compute-0 sudo[202593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:21 compute-0 python3.9[202595]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 22 07:35:21 compute-0 systemd[1]: Reloading.
Nov 22 07:35:21 compute-0 podman[202597]: 2025-11-22 07:35:21.772173858 +0000 UTC m=+0.060250356 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:35:21 compute-0 systemd-rc-local-generator[202647]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 22 07:35:21 compute-0 systemd-sysv-generator[202651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 22 07:35:22 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 22 07:35:22 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:35:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff455ae5aef5682c22e07c5ae996227d9ba06872ebc0221bed35491934871d9/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff455ae5aef5682c22e07c5ae996227d9ba06872ebc0221bed35491934871d9/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff455ae5aef5682c22e07c5ae996227d9ba06872ebc0221bed35491934871d9/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:22 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf.
Nov 22 07:35:22 compute-0 podman[202659]: 2025-11-22 07:35:22.220186391 +0000 UTC m=+0.142576988 container init 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *bridge.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *coverage.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *datapath.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *iface.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *memory.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *ovnnorthd.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *ovn.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *ovsdbserver.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *pmd_perf.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *pmd_rxq.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: INFO    07:35:22 main.go:48: registering *vswitch.Collector
Nov 22 07:35:22 compute-0 openstack_network_exporter[202674]: NOTICE  07:35:22 main.go:76: listening on https://:9105/metrics
Nov 22 07:35:22 compute-0 podman[202659]: 2025-11-22 07:35:22.248668993 +0000 UTC m=+0.171059570 container start 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 07:35:22 compute-0 podman[202659]: openstack_network_exporter
Nov 22 07:35:22 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 22 07:35:22 compute-0 sudo[202593]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:22 compute-0 podman[202684]: 2025-11-22 07:35:22.354577949 +0000 UTC m=+0.092598433 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm)
Nov 22 07:35:23 compute-0 sudo[202856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeodicvbndedesnosbhijzefjppttpbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796922.8682792-2440-55649395254545/AnsiballZ_systemd.py'
Nov 22 07:35:23 compute-0 sudo[202856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:23 compute-0 python3.9[202858]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 22 07:35:23 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Nov 22 07:35:23 compute-0 systemd[1]: libpod-26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf.scope: Deactivated successfully.
Nov 22 07:35:23 compute-0 podman[202862]: 2025-11-22 07:35:23.592685902 +0000 UTC m=+0.050606601 container died 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., 
vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 22 07:35:23 compute-0 systemd[1]: 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf-125348b096e4087b.timer: Deactivated successfully.
Nov 22 07:35:23 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf.
Nov 22 07:35:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-7ff455ae5aef5682c22e07c5ae996227d9ba06872ebc0221bed35491934871d9-merged.mount: Deactivated successfully.
Nov 22 07:35:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf-userdata-shm.mount: Deactivated successfully.
Nov 22 07:35:26 compute-0 podman[202862]: 2025-11-22 07:35:26.360057429 +0000 UTC m=+2.817978028 container cleanup 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 07:35:26 compute-0 podman[202862]: openstack_network_exporter
Nov 22 07:35:26 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 22 07:35:26 compute-0 podman[202891]: openstack_network_exporter
Nov 22 07:35:26 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 22 07:35:26 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Nov 22 07:35:26 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.484 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.511 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.511 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.511 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.524 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.525 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.525 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.525 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.525 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.557 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.558 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.558 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.558 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:35:26 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:35:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff455ae5aef5682c22e07c5ae996227d9ba06872ebc0221bed35491934871d9/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff455ae5aef5682c22e07c5ae996227d9ba06872ebc0221bed35491934871d9/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff455ae5aef5682c22e07c5ae996227d9ba06872ebc0221bed35491934871d9/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.714 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.715 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5937MB free_disk=73.49367904663086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.715 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.715 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:35:26 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf.
Nov 22 07:35:26 compute-0 podman[202904]: 2025-11-22 07:35:26.738489031 +0000 UTC m=+0.287041741 container init 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *bridge.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *coverage.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *datapath.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *iface.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *memory.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *ovnnorthd.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *ovn.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *ovsdbserver.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *pmd_perf.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *pmd_rxq.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: INFO    07:35:26 main.go:48: registering *vswitch.Collector
Nov 22 07:35:26 compute-0 openstack_network_exporter[202920]: NOTICE  07:35:26 main.go:76: listening on https://:9105/metrics
Nov 22 07:35:26 compute-0 podman[202904]: 2025-11-22 07:35:26.765889748 +0000 UTC m=+0.314442438 container start 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 07:35:26 compute-0 podman[202904]: openstack_network_exporter
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.773 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.773 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:35:26 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.793 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.806 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.808 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:35:26 compute-0 nova_compute[186544]: 2025-11-22 07:35:26.808 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:35:26 compute-0 sudo[202856]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:26 compute-0 podman[202930]: 2025-11-22 07:35:26.831315178 +0000 UTC m=+0.056619588 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 07:35:27 compute-0 nova_compute[186544]: 2025-11-22 07:35:27.447 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:27 compute-0 nova_compute[186544]: 2025-11-22 07:35:27.447 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:27 compute-0 nova_compute[186544]: 2025-11-22 07:35:27.447 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:27 compute-0 nova_compute[186544]: 2025-11-22 07:35:27.448 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:27 compute-0 sudo[203100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxekgtncvyfsardgrlayzbqkjkoacqod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796927.4097817-2464-6106670614903/AnsiballZ_find.py'
Nov 22 07:35:27 compute-0 sudo[203100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:27 compute-0 python3.9[203102]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 22 07:35:27 compute-0 sudo[203100]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:28 compute-0 nova_compute[186544]: 2025-11-22 07:35:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:35:29 compute-0 sudo[203252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqcqldavbysnouezfpblftvxhyigjuqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796928.6947184-2492-27174379166225/AnsiballZ_podman_container_info.py'
Nov 22 07:35:29 compute-0 sudo[203252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:29 compute-0 python3.9[203254]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 22 07:35:29 compute-0 sudo[203252]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:30 compute-0 sudo[203416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-japgcapxouvsmjzcxivloyibmcdhcjar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796929.599379-2500-99061488572475/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:30 compute-0 sudo[203416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:30 compute-0 python3.9[203418]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:30 compute-0 systemd[1]: Started libpod-conmon-ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f.scope.
Nov 22 07:35:30 compute-0 podman[203419]: 2025-11-22 07:35:30.639658014 +0000 UTC m=+0.211667808 container exec ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 22 07:35:30 compute-0 podman[203438]: 2025-11-22 07:35:30.744527845 +0000 UTC m=+0.090826260 container exec_died ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:35:30 compute-0 podman[203419]: 2025-11-22 07:35:30.823478355 +0000 UTC m=+0.395488149 container exec_died ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 07:35:30 compute-0 systemd[1]: libpod-conmon-ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f.scope: Deactivated successfully.
Nov 22 07:35:30 compute-0 sudo[203416]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:31 compute-0 sudo[203598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kazrrumeshlnjvwdejagfivltpxvdqhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796931.226961-2508-136968826844335/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:31 compute-0 sudo[203598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:31 compute-0 python3.9[203600]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:31 compute-0 systemd[1]: Started libpod-conmon-ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f.scope.
Nov 22 07:35:31 compute-0 podman[203601]: 2025-11-22 07:35:31.973357633 +0000 UTC m=+0.257910891 container exec ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 07:35:32 compute-0 podman[203620]: 2025-11-22 07:35:32.122484229 +0000 UTC m=+0.134960623 container exec_died ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 07:35:32 compute-0 podman[203601]: 2025-11-22 07:35:32.165377932 +0000 UTC m=+0.449931160 container exec_died ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 07:35:32 compute-0 systemd[1]: libpod-conmon-ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f.scope: Deactivated successfully.
Nov 22 07:35:32 compute-0 sudo[203598]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:32 compute-0 sudo[203781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyzbwlnqzcizirsmzviwmzbwyuhbpugw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796932.5227854-2516-128556883314236/AnsiballZ_file.py'
Nov 22 07:35:32 compute-0 sudo[203781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:33 compute-0 python3.9[203783]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:33 compute-0 sudo[203781]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:33 compute-0 sudo[203933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyvnhkytyrgrwfozzssxrjwvptjiwjwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796933.2537153-2525-106532119253291/AnsiballZ_podman_container_info.py'
Nov 22 07:35:33 compute-0 sudo[203933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:33 compute-0 python3.9[203935]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 22 07:35:33 compute-0 sudo[203933]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:34 compute-0 sudo[204099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abnfiumobekelmjmnnsgresfmaryvxbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796933.974732-2533-184016762725603/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:34 compute-0 sudo[204099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:34 compute-0 python3.9[204101]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:34 compute-0 systemd[1]: Started libpod-conmon-c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37.scope.
Nov 22 07:35:34 compute-0 podman[204102]: 2025-11-22 07:35:34.677780119 +0000 UTC m=+0.221859275 container exec c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:35:34 compute-0 podman[204102]: 2025-11-22 07:35:34.713750904 +0000 UTC m=+0.257830050 container exec_died c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 07:35:34 compute-0 systemd[1]: libpod-conmon-c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37.scope: Deactivated successfully.
Nov 22 07:35:34 compute-0 sudo[204099]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:35 compute-0 sudo[204281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijgangfpxfxiuinluegsiymdhxkghuxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796935.0738094-2541-209120753768274/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:35 compute-0 sudo[204281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:35 compute-0 python3.9[204283]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:35 compute-0 systemd[1]: Started libpod-conmon-c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37.scope.
Nov 22 07:35:35 compute-0 podman[204284]: 2025-11-22 07:35:35.906000044 +0000 UTC m=+0.105435196 container exec c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 07:35:35 compute-0 podman[204303]: 2025-11-22 07:35:35.98649333 +0000 UTC m=+0.068380674 container exec_died c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 07:35:36 compute-0 podman[204284]: 2025-11-22 07:35:36.00373729 +0000 UTC m=+0.203172422 container exec_died c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:35:36 compute-0 systemd[1]: libpod-conmon-c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37.scope: Deactivated successfully.
Nov 22 07:35:36 compute-0 sudo[204281]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:36 compute-0 podman[204392]: 2025-11-22 07:35:36.437838215 +0000 UTC m=+0.080784216 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 07:35:36 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-173a7a48cbf9c8ec.service: Main process exited, code=exited, status=1/FAILURE
Nov 22 07:35:36 compute-0 systemd[1]: a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b-173a7a48cbf9c8ec.service: Failed with result 'exit-code'.
Nov 22 07:35:36 compute-0 podman[204401]: 2025-11-22 07:35:36.453580168 +0000 UTC m=+0.095413541 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:35:36 compute-0 sudo[204511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlkvvzfxpijwoblgmfjkubvqzmeaftdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796936.2549076-2549-97454376040014/AnsiballZ_file.py'
Nov 22 07:35:36 compute-0 sudo[204511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:36 compute-0 python3.9[204513]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:36 compute-0 sudo[204511]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:37 compute-0 sudo[204663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdzrcabymcxuwafnebahnrbqyxozlizx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796936.9834461-2558-120350505411002/AnsiballZ_podman_container_info.py'
Nov 22 07:35:37 compute-0 sudo[204663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:35:37.302 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:35:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:35:37.303 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:35:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:35:37.303 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:35:37 compute-0 python3.9[204665]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 22 07:35:37 compute-0 sudo[204663]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:38 compute-0 sudo[204828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jadwcfwqkshuazehvfuetoxxzlbvdwfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796937.7403164-2566-220533558243997/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:38 compute-0 sudo[204828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:38 compute-0 python3.9[204830]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:38 compute-0 systemd[1]: Started libpod-conmon-8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f.scope.
Nov 22 07:35:38 compute-0 podman[204831]: 2025-11-22 07:35:38.396072098 +0000 UTC m=+0.063968057 container exec 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:35:38 compute-0 podman[204831]: 2025-11-22 07:35:38.426669962 +0000 UTC m=+0.094565901 container exec_died 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:35:38 compute-0 sudo[204828]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:38 compute-0 systemd[1]: libpod-conmon-8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f.scope: Deactivated successfully.
Nov 22 07:35:38 compute-0 sudo[205013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfolhifdfnixcojnxyfzshezvmplqgay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796938.6905823-2574-101845445274842/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:38 compute-0 sudo[205013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:39 compute-0 python3.9[205015]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:39 compute-0 systemd[1]: Started libpod-conmon-8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f.scope.
Nov 22 07:35:39 compute-0 podman[205016]: 2025-11-22 07:35:39.26563676 +0000 UTC m=+0.068860145 container exec 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 07:35:39 compute-0 podman[205016]: 2025-11-22 07:35:39.295809935 +0000 UTC m=+0.099033320 container exec_died 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:35:39 compute-0 systemd[1]: libpod-conmon-8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f.scope: Deactivated successfully.
Nov 22 07:35:39 compute-0 sudo[205013]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:39 compute-0 sudo[205198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxwzjwsjoqzdjgjoacwuyrsbklutnyuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796939.5193372-2582-45831244763971/AnsiballZ_file.py'
Nov 22 07:35:39 compute-0 sudo[205198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:39 compute-0 python3.9[205200]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:40 compute-0 sudo[205198]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:40 compute-0 podman[205294]: 2025-11-22 07:35:40.412076956 +0000 UTC m=+0.050900619 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:35:40 compute-0 sudo[205373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjgyvrhlamqqckrtgwlcjjxdcjjntwss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796940.23241-2591-12886009178888/AnsiballZ_podman_container_info.py'
Nov 22 07:35:40 compute-0 sudo[205373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:40 compute-0 python3.9[205375]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 22 07:35:40 compute-0 sudo[205373]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:41 compute-0 sudo[205538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmlcnexjdcndcjmjrcxrhlqokwqnfwon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796940.965012-2599-68714127214909/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:41 compute-0 sudo[205538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:41 compute-0 python3.9[205540]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:41 compute-0 systemd[1]: Started libpod-conmon-a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.scope.
Nov 22 07:35:41 compute-0 podman[205541]: 2025-11-22 07:35:41.61442433 +0000 UTC m=+0.122803257 container exec a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 07:35:41 compute-0 podman[205541]: 2025-11-22 07:35:41.650872916 +0000 UTC m=+0.159251843 container exec_died a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 07:35:41 compute-0 systemd[1]: libpod-conmon-a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.scope: Deactivated successfully.
Nov 22 07:35:41 compute-0 sudo[205538]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:42 compute-0 sudo[205722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awfrklnlvtcintlefqtecziowzstbcwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796941.8645499-2607-97462263782290/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:42 compute-0 sudo[205722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:42 compute-0 python3.9[205724]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:42 compute-0 systemd[1]: Started libpod-conmon-a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.scope.
Nov 22 07:35:42 compute-0 podman[205725]: 2025-11-22 07:35:42.48304592 +0000 UTC m=+0.090054860 container exec a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 07:35:42 compute-0 podman[205725]: 2025-11-22 07:35:42.516600136 +0000 UTC m=+0.123609036 container exec_died a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 07:35:42 compute-0 systemd[1]: libpod-conmon-a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b.scope: Deactivated successfully.
Nov 22 07:35:42 compute-0 sudo[205722]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:42 compute-0 sudo[205905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcksagzgdyrhdrpetsactbnzisfzfhsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796942.7143767-2615-259409664026949/AnsiballZ_file.py'
Nov 22 07:35:42 compute-0 sudo[205905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:43 compute-0 python3.9[205907]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:43 compute-0 sudo[205905]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:43 compute-0 sudo[206057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrancnnxezvfjvlwloltfghjmsjdpjcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796943.4783638-2624-176322512521101/AnsiballZ_podman_container_info.py'
Nov 22 07:35:43 compute-0 sudo[206057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:43 compute-0 python3.9[206059]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 22 07:35:44 compute-0 sudo[206057]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:44 compute-0 sudo[206234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyvbufvgsshalalsnjbuqhqqjnqwvcjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796944.2096236-2632-145349828007014/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:44 compute-0 sudo[206234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:44 compute-0 podman[206196]: 2025-11-22 07:35:44.516215845 +0000 UTC m=+0.061291052 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:35:44 compute-0 python3.9[206241]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:45 compute-0 systemd[1]: Started libpod-conmon-0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.scope.
Nov 22 07:35:45 compute-0 podman[206244]: 2025-11-22 07:35:45.128466472 +0000 UTC m=+0.367885326 container exec 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:35:45 compute-0 podman[206263]: 2025-11-22 07:35:45.330598716 +0000 UTC m=+0.188635787 container exec_died 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:35:45 compute-0 podman[206244]: 2025-11-22 07:35:45.569839553 +0000 UTC m=+0.809258387 container exec_died 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:35:45 compute-0 systemd[1]: libpod-conmon-0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.scope: Deactivated successfully.
Nov 22 07:35:46 compute-0 sudo[206234]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:46 compute-0 sudo[206425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvugufzkaolnllwmpxpczyngrzkphniy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796946.5177214-2640-127171830903982/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:46 compute-0 sudo[206425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:47 compute-0 python3.9[206427]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:47 compute-0 systemd[1]: Started libpod-conmon-0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.scope.
Nov 22 07:35:47 compute-0 podman[206428]: 2025-11-22 07:35:47.09715841 +0000 UTC m=+0.071469329 container exec 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:35:47 compute-0 podman[206428]: 2025-11-22 07:35:47.127986429 +0000 UTC m=+0.102297318 container exec_died 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 07:35:47 compute-0 systemd[1]: libpod-conmon-0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3.scope: Deactivated successfully.
Nov 22 07:35:47 compute-0 sudo[206425]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:47 compute-0 sudo[206609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqigucryajnzurhuztczbygqjlxcsgps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796947.3478136-2648-249075527959764/AnsiballZ_file.py'
Nov 22 07:35:47 compute-0 sudo[206609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:47 compute-0 python3.9[206611]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:47 compute-0 sudo[206609]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:48 compute-0 sudo[206761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxupqkokakaewtqxmyyonjjvxpnsfeyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796948.0429356-2657-132422349107975/AnsiballZ_podman_container_info.py'
Nov 22 07:35:48 compute-0 sudo[206761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:48 compute-0 python3.9[206763]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 22 07:35:48 compute-0 sudo[206761]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:49 compute-0 sudo[206940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxwetjhdtntfhhklhxpiggmmnxoeupyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796948.7745245-2665-39799826724138/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:49 compute-0 sudo[206940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:49 compute-0 podman[206900]: 2025-11-22 07:35:49.100776116 +0000 UTC m=+0.057699734 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 07:35:49 compute-0 python3.9[206948]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:49 compute-0 systemd[1]: Started libpod-conmon-d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9.scope.
Nov 22 07:35:49 compute-0 podman[206949]: 2025-11-22 07:35:49.397236594 +0000 UTC m=+0.075246880 container exec d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:35:49 compute-0 podman[206969]: 2025-11-22 07:35:49.461533877 +0000 UTC m=+0.052878597 container exec_died d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:35:49 compute-0 podman[206949]: 2025-11-22 07:35:49.468065796 +0000 UTC m=+0.146076072 container exec_died d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:35:49 compute-0 systemd[1]: libpod-conmon-d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9.scope: Deactivated successfully.
Nov 22 07:35:49 compute-0 sudo[206940]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:50 compute-0 sudo[207131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tisiwztarrlqvbutrczvvkvgsgnoixil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796949.6707702-2673-8706440117639/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:50 compute-0 sudo[207131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:50 compute-0 python3.9[207133]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:51 compute-0 systemd[1]: Started libpod-conmon-d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9.scope.
Nov 22 07:35:51 compute-0 podman[207134]: 2025-11-22 07:35:51.073950023 +0000 UTC m=+0.070301761 container exec d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:35:51 compute-0 podman[207134]: 2025-11-22 07:35:51.109621709 +0000 UTC m=+0.105973427 container exec_died d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:35:51 compute-0 systemd[1]: libpod-conmon-d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9.scope: Deactivated successfully.
Nov 22 07:35:51 compute-0 sudo[207131]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:51 compute-0 sudo[207314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwteogjwnrgqdegahhjsxxfumyogbfsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796951.3271956-2681-227780986344751/AnsiballZ_file.py'
Nov 22 07:35:51 compute-0 sudo[207314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:51 compute-0 python3.9[207316]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:51 compute-0 sudo[207314]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:52 compute-0 sudo[207479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oadqmufttmzramjtircmtmiudjfvzwyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796952.0872965-2690-177769995588703/AnsiballZ_podman_container_info.py'
Nov 22 07:35:52 compute-0 sudo[207479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:52 compute-0 podman[207440]: 2025-11-22 07:35:52.379210018 +0000 UTC m=+0.052239101 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:35:52 compute-0 python3.9[207490]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 22 07:35:52 compute-0 sudo[207479]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:53 compute-0 sudo[207652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muwlfvubhazxncvcenedsrmugeelnxax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796953.234665-2698-166889088862973/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:53 compute-0 sudo[207652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:53 compute-0 python3.9[207654]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:53 compute-0 systemd[1]: Started libpod-conmon-26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf.scope.
Nov 22 07:35:53 compute-0 podman[207655]: 2025-11-22 07:35:53.784046826 +0000 UTC m=+0.073336654 container exec 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Nov 22 07:35:53 compute-0 podman[207655]: 2025-11-22 07:35:53.818715759 +0000 UTC m=+0.108005597 container exec_died 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, version=9.6, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git)
Nov 22 07:35:53 compute-0 systemd[1]: libpod-conmon-26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf.scope: Deactivated successfully.
Nov 22 07:35:53 compute-0 sudo[207652]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:54 compute-0 sudo[207833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vedjvwsagyhkmqpjskvhnjdabmwghemv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796954.0295115-2706-74787592386452/AnsiballZ_podman_container_exec.py'
Nov 22 07:35:54 compute-0 sudo[207833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:54 compute-0 python3.9[207835]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 22 07:35:54 compute-0 systemd[1]: Started libpod-conmon-26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf.scope.
Nov 22 07:35:54 compute-0 podman[207836]: 2025-11-22 07:35:54.578605676 +0000 UTC m=+0.067401170 container exec 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 07:35:54 compute-0 podman[207836]: 2025-11-22 07:35:54.614829637 +0000 UTC m=+0.103625141 container exec_died 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 22 07:35:54 compute-0 systemd[1]: libpod-conmon-26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf.scope: Deactivated successfully.
Nov 22 07:35:54 compute-0 sudo[207833]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:55 compute-0 sudo[208018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkywvisfqopiaibfgslqxymjjbngfqok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763796954.835388-2714-266114097529591/AnsiballZ_file.py'
Nov 22 07:35:55 compute-0 sudo[208018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:35:55 compute-0 python3.9[208020]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:35:55 compute-0 sudo[208018]: pam_unix(sudo:session): session closed for user root
Nov 22 07:35:57 compute-0 podman[208045]: 2025-11-22 07:35:57.407139394 +0000 UTC m=+0.057621277 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 07:36:01 compute-0 anacron[30858]: Job `cron.daily' started
Nov 22 07:36:01 compute-0 anacron[30858]: Job `cron.daily' terminated
Nov 22 07:36:07 compute-0 podman[208067]: 2025-11-22 07:36:07.41178918 +0000 UTC m=+0.058200049 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 07:36:07 compute-0 podman[208068]: 2025-11-22 07:36:07.437136421 +0000 UTC m=+0.080176938 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 22 07:36:11 compute-0 podman[208113]: 2025-11-22 07:36:11.421225445 +0000 UTC m=+0.073583115 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:36:15 compute-0 podman[208137]: 2025-11-22 07:36:15.412212218 +0000 UTC m=+0.060200068 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Nov 22 07:36:19 compute-0 podman[208156]: 2025-11-22 07:36:19.413181395 +0000 UTC m=+0.066414879 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 22 07:36:23 compute-0 podman[208176]: 2025-11-22 07:36:23.397662528 +0000 UTC m=+0.051473165 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:36:26 compute-0 nova_compute[186544]: 2025-11-22 07:36:26.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:26 compute-0 nova_compute[186544]: 2025-11-22 07:36:26.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:36:26 compute-0 nova_compute[186544]: 2025-11-22 07:36:26.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:36:26 compute-0 nova_compute[186544]: 2025-11-22 07:36:26.178 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:36:26 compute-0 nova_compute[186544]: 2025-11-22 07:36:26.178 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:26 compute-0 nova_compute[186544]: 2025-11-22 07:36:26.178 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:36:27 compute-0 nova_compute[186544]: 2025-11-22 07:36:27.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.194 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.196 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:36:28 compute-0 podman[208200]: 2025-11-22 07:36:28.41328122 +0000 UTC m=+0.059477169 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, vcs-type=git)
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.447 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.449 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6049MB free_disk=73.49763870239258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.449 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.449 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.519 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.520 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.543 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.557 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.559 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:36:28 compute-0 nova_compute[186544]: 2025-11-22 07:36:28.560 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:36:29 compute-0 nova_compute[186544]: 2025-11-22 07:36:29.555 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:29 compute-0 nova_compute[186544]: 2025-11-22 07:36:29.558 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.588 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:36:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:36:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:36:37.304 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:36:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:36:37.305 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:36:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:36:37.305 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:36:38 compute-0 podman[208221]: 2025-11-22 07:36:38.404848354 +0000 UTC m=+0.056622620 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:36:38 compute-0 podman[208222]: 2025-11-22 07:36:38.445338617 +0000 UTC m=+0.091864504 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:36:42 compute-0 podman[208264]: 2025-11-22 07:36:42.399040276 +0000 UTC m=+0.047477256 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:36:46 compute-0 podman[208289]: 2025-11-22 07:36:46.398971329 +0000 UTC m=+0.052453738 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 07:36:49 compute-0 sudo[208434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awhljzcwoavvizwwiateypfxkhvryawr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797009.0116198-3187-39806182745332/AnsiballZ_file.py'
Nov 22 07:36:49 compute-0 sudo[208434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:49 compute-0 python3.9[208436]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:36:49 compute-0 sudo[208434]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:50 compute-0 sudo[208599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdvbjxquclgdznprcwkphyqghwphnpwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797009.729-3211-271075537184749/AnsiballZ_stat.py'
Nov 22 07:36:50 compute-0 sudo[208599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:50 compute-0 podman[208560]: 2025-11-22 07:36:50.033037448 +0000 UTC m=+0.068999583 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 22 07:36:50 compute-0 python3.9[208607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:36:50 compute-0 sudo[208599]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:50 compute-0 sudo[208731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgzlhhzcswiphcbbswbqegomdhhnvmaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797009.729-3211-271075537184749/AnsiballZ_copy.py'
Nov 22 07:36:50 compute-0 sudo[208731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:50 compute-0 python3.9[208733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763797009.729-3211-271075537184749/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:36:50 compute-0 sudo[208731]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:51 compute-0 sudo[208883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahtqclhgqoyriwgpiotaljplklkmqljk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797011.2942061-3259-261839031401747/AnsiballZ_file.py'
Nov 22 07:36:51 compute-0 sudo[208883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:51 compute-0 python3.9[208885]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:36:51 compute-0 sudo[208883]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:52 compute-0 sudo[209035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezdxbhvneugirphoicnxdcbjvsxzwozi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797012.0278342-3283-137537761536344/AnsiballZ_stat.py'
Nov 22 07:36:52 compute-0 sudo[209035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:52 compute-0 python3.9[209037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:36:52 compute-0 sudo[209035]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:52 compute-0 sudo[209113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iavlklzpocfphvoxvofmzawimurtcleo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797012.0278342-3283-137537761536344/AnsiballZ_file.py'
Nov 22 07:36:52 compute-0 sudo[209113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:52 compute-0 python3.9[209115]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:36:52 compute-0 sudo[209113]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:53 compute-0 podman[209239]: 2025-11-22 07:36:53.540349067 +0000 UTC m=+0.051204156 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:36:53 compute-0 sudo[209282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkmsqtibchwmhxwgoyewioladmnpbwlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797013.225049-3319-213477066671522/AnsiballZ_stat.py'
Nov 22 07:36:53 compute-0 sudo[209282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:53 compute-0 python3.9[209291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:36:53 compute-0 sudo[209282]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:53 compute-0 sudo[209367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phldnphijsycwxoxrzkbliqgfqayocyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797013.225049-3319-213477066671522/AnsiballZ_file.py'
Nov 22 07:36:53 compute-0 sudo[209367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:54 compute-0 python3.9[209369]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.dw71k7jg recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:36:54 compute-0 sudo[209367]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:54 compute-0 sudo[209519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzmivnxrxxrmxgxctjmoftamuxzyfotm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797014.4460795-3355-75893966291187/AnsiballZ_stat.py'
Nov 22 07:36:54 compute-0 sudo[209519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:54 compute-0 python3.9[209521]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:36:54 compute-0 sudo[209519]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:55 compute-0 sudo[209597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdrfovnrhsastqukahgupmpegdbudiex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797014.4460795-3355-75893966291187/AnsiballZ_file.py'
Nov 22 07:36:55 compute-0 sudo[209597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:55 compute-0 python3.9[209599]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:36:55 compute-0 sudo[209597]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:55 compute-0 sudo[209749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kisonbueqyrpyjvarbbrzixlthsalgfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797015.7052755-3394-112841108094047/AnsiballZ_command.py'
Nov 22 07:36:55 compute-0 sudo[209749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:56 compute-0 python3.9[209751]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:36:56 compute-0 sudo[209749]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:56 compute-0 sudo[209902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxwvicsehnhdowlywjabupwomflndmed ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763797016.4305394-3418-263265128767172/AnsiballZ_edpm_nftables_from_files.py'
Nov 22 07:36:56 compute-0 sudo[209902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:57 compute-0 python3[209904]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 22 07:36:57 compute-0 sudo[209902]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:57 compute-0 sudo[210054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqhlsjfeqgouiakrepricbxezhjtmyxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797017.3459942-3442-121349964855201/AnsiballZ_stat.py'
Nov 22 07:36:57 compute-0 sudo[210054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:57 compute-0 python3.9[210056]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:36:57 compute-0 sudo[210054]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:58 compute-0 sudo[210132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnbgohpfrvrtsehwvswuohlnnrbxyfyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797017.3459942-3442-121349964855201/AnsiballZ_file.py'
Nov 22 07:36:58 compute-0 sudo[210132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:58 compute-0 python3.9[210134]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:36:58 compute-0 sudo[210132]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:58 compute-0 sudo[210297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lukfxrifnyiexomsvyherkhqkjnkiqhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797018.5388582-3478-274842850238537/AnsiballZ_stat.py'
Nov 22 07:36:58 compute-0 sudo[210297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:58 compute-0 podman[210258]: 2025-11-22 07:36:58.885102832 +0000 UTC m=+0.063894279 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, distribution-scope=public, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 07:36:59 compute-0 python3.9[210305]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:36:59 compute-0 sudo[210297]: pam_unix(sudo:session): session closed for user root
Nov 22 07:36:59 compute-0 sudo[210383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuosztxmjoeaqzlfvmezoarinpdyzzsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797018.5388582-3478-274842850238537/AnsiballZ_file.py'
Nov 22 07:36:59 compute-0 sudo[210383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:36:59 compute-0 python3.9[210385]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:36:59 compute-0 sudo[210383]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:00 compute-0 sudo[210535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hifdwwfainhegcyyokgwnpueyicfhfoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797019.8171165-3514-264815505015305/AnsiballZ_stat.py'
Nov 22 07:37:00 compute-0 sudo[210535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:00 compute-0 python3.9[210537]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:37:00 compute-0 sudo[210535]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:00 compute-0 sudo[210613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljofrusptwamvcmujgmcnryzromnmxzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797019.8171165-3514-264815505015305/AnsiballZ_file.py'
Nov 22 07:37:00 compute-0 sudo[210613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:00 compute-0 python3.9[210615]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:37:00 compute-0 sudo[210613]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:01 compute-0 sudo[210765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaiqlvkgwicumawjfweaaxhsqgvnifid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797021.0685003-3550-45597917715886/AnsiballZ_stat.py'
Nov 22 07:37:01 compute-0 sudo[210765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:01 compute-0 python3.9[210767]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:37:01 compute-0 sudo[210765]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:01 compute-0 sudo[210843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iknmgnuapgjkwyacwilciqmxkvuywdfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797021.0685003-3550-45597917715886/AnsiballZ_file.py'
Nov 22 07:37:01 compute-0 sudo[210843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:02 compute-0 python3.9[210845]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:37:02 compute-0 sudo[210843]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:02 compute-0 sudo[210995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njatxsyeoppvjpldnpncvxnopvmglnsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797022.26269-3586-148155160147037/AnsiballZ_stat.py'
Nov 22 07:37:02 compute-0 sudo[210995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:02 compute-0 python3.9[210997]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 22 07:37:02 compute-0 sudo[210995]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:03 compute-0 sudo[211120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkqudwbyxsdwujlupncfmwavdsdltjeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797022.26269-3586-148155160147037/AnsiballZ_copy.py'
Nov 22 07:37:03 compute-0 sudo[211120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:03 compute-0 python3.9[211122]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763797022.26269-3586-148155160147037/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:37:03 compute-0 sudo[211120]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:03 compute-0 sudo[211272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxwlfccxrlsjrohlfeaeumpmtsaiqaou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797023.6749177-3631-7816376340783/AnsiballZ_file.py'
Nov 22 07:37:03 compute-0 sudo[211272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:04 compute-0 python3.9[211274]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:37:04 compute-0 sudo[211272]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:04 compute-0 sudo[211424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-livcyjyhhywqreftkprhsbqthopqocxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797024.4087074-3655-85761522918980/AnsiballZ_command.py'
Nov 22 07:37:04 compute-0 sudo[211424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:04 compute-0 python3.9[211426]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:37:04 compute-0 sudo[211424]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:05 compute-0 sudo[211579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czurcmrhfqfwkcoonkxepoctaomxczgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797025.154229-3679-87069335490877/AnsiballZ_blockinfile.py'
Nov 22 07:37:05 compute-0 sudo[211579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:05 compute-0 python3.9[211581]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:37:05 compute-0 sudo[211579]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:06 compute-0 sudo[211731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmouwbjqehfoatdxxxjcsmdaxtegviav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797026.211341-3706-59858874622261/AnsiballZ_command.py'
Nov 22 07:37:06 compute-0 sudo[211731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:06 compute-0 python3.9[211733]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:37:06 compute-0 sudo[211731]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:07 compute-0 sudo[211884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arlnmdzblgrxpmxoaeprwsafpabszctx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797026.9692447-3730-210128382491795/AnsiballZ_stat.py'
Nov 22 07:37:07 compute-0 sudo[211884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:07 compute-0 python3.9[211886]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 22 07:37:07 compute-0 sudo[211884]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:07 compute-0 sudo[212038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgikyrdzmrwatwwiidjmilurzuneuiqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797027.7412605-3754-72632158008571/AnsiballZ_command.py'
Nov 22 07:37:07 compute-0 sudo[212038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:08 compute-0 python3.9[212040]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 22 07:37:08 compute-0 sudo[212038]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:08 compute-0 sudo[212224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-latckuwgmauitzmfygydtltefuqrgpfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763797028.4685247-3778-155226859257482/AnsiballZ_file.py'
Nov 22 07:37:08 compute-0 sudo[212224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 07:37:08 compute-0 podman[212167]: 2025-11-22 07:37:08.767545148 +0000 UTC m=+0.058066682 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 07:37:08 compute-0 podman[212168]: 2025-11-22 07:37:08.792165683 +0000 UTC m=+0.078180081 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 07:37:08 compute-0 python3.9[212234]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 22 07:37:08 compute-0 sudo[212224]: pam_unix(sudo:session): session closed for user root
Nov 22 07:37:09 compute-0 sshd-session[186839]: Connection closed by 192.168.122.30 port 38450
Nov 22 07:37:09 compute-0 sshd-session[186836]: pam_unix(sshd:session): session closed for user zuul
Nov 22 07:37:09 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Nov 22 07:37:09 compute-0 systemd[1]: session-25.scope: Consumed 1min 36.843s CPU time.
Nov 22 07:37:09 compute-0 systemd-logind[821]: Session 25 logged out. Waiting for processes to exit.
Nov 22 07:37:09 compute-0 systemd-logind[821]: Removed session 25.
Nov 22 07:37:13 compute-0 podman[212266]: 2025-11-22 07:37:13.407220167 +0000 UTC m=+0.053145695 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 07:37:17 compute-0 podman[212291]: 2025-11-22 07:37:17.396233772 +0000 UTC m=+0.048202257 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 07:37:20 compute-0 podman[212310]: 2025-11-22 07:37:20.411215596 +0000 UTC m=+0.060557205 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 22 07:37:24 compute-0 podman[212331]: 2025-11-22 07:37:24.428308588 +0000 UTC m=+0.075706098 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:37:26 compute-0 nova_compute[186544]: 2025-11-22 07:37:26.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:26 compute-0 nova_compute[186544]: 2025-11-22 07:37:26.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:37:26 compute-0 nova_compute[186544]: 2025-11-22 07:37:26.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:37:26 compute-0 nova_compute[186544]: 2025-11-22 07:37:26.179 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:37:26 compute-0 nova_compute[186544]: 2025-11-22 07:37:26.179 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:26 compute-0 nova_compute[186544]: 2025-11-22 07:37:26.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:37:27 compute-0 nova_compute[186544]: 2025-11-22 07:37:27.167 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.167 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.168 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.193 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.338 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.338 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6059MB free_disk=73.49714660644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.339 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.339 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.414 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.414 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.440 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.453 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.455 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:37:28 compute-0 nova_compute[186544]: 2025-11-22 07:37:28.455 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:37:29 compute-0 podman[212355]: 2025-11-22 07:37:29.405375397 +0000 UTC m=+0.058022521 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Nov 22 07:37:29 compute-0 nova_compute[186544]: 2025-11-22 07:37:29.451 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:30 compute-0 nova_compute[186544]: 2025-11-22 07:37:30.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:30 compute-0 nova_compute[186544]: 2025-11-22 07:37:30.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:30 compute-0 nova_compute[186544]: 2025-11-22 07:37:30.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:31 compute-0 nova_compute[186544]: 2025-11-22 07:37:31.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:37:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:37:37.304 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:37:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:37:37.305 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:37:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:37:37.305 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:37:39 compute-0 podman[212377]: 2025-11-22 07:37:39.410389899 +0000 UTC m=+0.054641643 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:37:39 compute-0 podman[212378]: 2025-11-22 07:37:39.464218801 +0000 UTC m=+0.105267992 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 22 07:37:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:37:40.074 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:37:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:37:40.075 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:37:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:37:40.075 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:37:44 compute-0 podman[212426]: 2025-11-22 07:37:44.413994114 +0000 UTC m=+0.055069424 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:37:48 compute-0 podman[212451]: 2025-11-22 07:37:48.404867727 +0000 UTC m=+0.049476689 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 07:37:51 compute-0 podman[212471]: 2025-11-22 07:37:51.409619137 +0000 UTC m=+0.061642614 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 22 07:37:55 compute-0 podman[212491]: 2025-11-22 07:37:55.399293089 +0000 UTC m=+0.049375898 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:38:00 compute-0 podman[212516]: 2025-11-22 07:38:00.403762015 +0000 UTC m=+0.058025890 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public)
Nov 22 07:38:10 compute-0 podman[212537]: 2025-11-22 07:38:10.402072602 +0000 UTC m=+0.052853279 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 07:38:10 compute-0 podman[212538]: 2025-11-22 07:38:10.430103539 +0000 UTC m=+0.076998773 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 07:38:15 compute-0 podman[212583]: 2025-11-22 07:38:15.397238107 +0000 UTC m=+0.047910000 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 07:38:19 compute-0 podman[212607]: 2025-11-22 07:38:19.397030159 +0000 UTC m=+0.046380965 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 07:38:22 compute-0 podman[212627]: 2025-11-22 07:38:22.411083546 +0000 UTC m=+0.055064661 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 07:38:26 compute-0 nova_compute[186544]: 2025-11-22 07:38:26.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:26 compute-0 nova_compute[186544]: 2025-11-22 07:38:26.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 07:38:26 compute-0 nova_compute[186544]: 2025-11-22 07:38:26.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 07:38:26 compute-0 nova_compute[186544]: 2025-11-22 07:38:26.182 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:26 compute-0 nova_compute[186544]: 2025-11-22 07:38:26.182 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 07:38:26 compute-0 nova_compute[186544]: 2025-11-22 07:38:26.204 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:26 compute-0 podman[212648]: 2025-11-22 07:38:26.408164093 +0000 UTC m=+0.055709577 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:38:27 compute-0 nova_compute[186544]: 2025-11-22 07:38:27.238 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:27 compute-0 nova_compute[186544]: 2025-11-22 07:38:27.239 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.182 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.182 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.182 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.211 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.212 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.212 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.212 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.366 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.367 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6092MB free_disk=73.49714660644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.368 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.368 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.532 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.532 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.601 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.664 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.665 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.682 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.711 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.734 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.751 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.753 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:38:28 compute-0 nova_compute[186544]: 2025-11-22 07:38:28.753 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:38:29 compute-0 nova_compute[186544]: 2025-11-22 07:38:29.734 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:30 compute-0 nova_compute[186544]: 2025-11-22 07:38:30.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:30 compute-0 nova_compute[186544]: 2025-11-22 07:38:30.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:30 compute-0 nova_compute[186544]: 2025-11-22 07:38:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:31 compute-0 podman[212672]: 2025-11-22 07:38:31.398464742 +0000 UTC m=+0.053605556 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Nov 22 07:38:32 compute-0 nova_compute[186544]: 2025-11-22 07:38:32.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:38:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:38:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:38:37.305 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:38:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:38:37.306 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:38:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:38:37.306 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:38:41 compute-0 podman[212693]: 2025-11-22 07:38:41.404376844 +0000 UTC m=+0.058135234 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:38:41 compute-0 podman[212694]: 2025-11-22 07:38:41.436043468 +0000 UTC m=+0.086360056 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 07:38:46 compute-0 podman[212738]: 2025-11-22 07:38:46.406158427 +0000 UTC m=+0.055757327 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 07:38:50 compute-0 podman[212762]: 2025-11-22 07:38:50.403586183 +0000 UTC m=+0.053771061 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 07:38:53 compute-0 podman[212781]: 2025-11-22 07:38:53.404721591 +0000 UTC m=+0.049404076 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 07:38:57 compute-0 podman[212801]: 2025-11-22 07:38:57.394062495 +0000 UTC m=+0.046672741 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 07:39:02 compute-0 podman[212825]: 2025-11-22 07:39:02.405393406 +0000 UTC m=+0.054768214 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 07:39:12 compute-0 podman[212845]: 2025-11-22 07:39:12.433472192 +0000 UTC m=+0.084621131 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 07:39:12 compute-0 podman[212846]: 2025-11-22 07:39:12.466523398 +0000 UTC m=+0.113764073 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 07:39:17 compute-0 podman[212891]: 2025-11-22 07:39:17.401126984 +0000 UTC m=+0.051183615 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:39:21 compute-0 podman[212915]: 2025-11-22 07:39:21.436155968 +0000 UTC m=+0.083001551 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 07:39:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:39:22.513 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:39:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:39:22.514 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:39:24 compute-0 podman[212935]: 2025-11-22 07:39:24.429228212 +0000 UTC m=+0.079471466 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.188 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.188 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.188 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.327 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.327 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6097MB free_disk=73.49742126464844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.328 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.328 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.391 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.391 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:39:28 compute-0 podman[212957]: 2025-11-22 07:39:28.393776458 +0000 UTC m=+0.045425377 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.422 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.435 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.437 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:39:28 compute-0 nova_compute[186544]: 2025-11-22 07:39:28.437 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:29 compute-0 nova_compute[186544]: 2025-11-22 07:39:29.437 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:29 compute-0 nova_compute[186544]: 2025-11-22 07:39:29.438 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:39:29 compute-0 nova_compute[186544]: 2025-11-22 07:39:29.438 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:39:29 compute-0 nova_compute[186544]: 2025-11-22 07:39:29.457 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:39:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:39:29.516 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:39:30 compute-0 nova_compute[186544]: 2025-11-22 07:39:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:30 compute-0 nova_compute[186544]: 2025-11-22 07:39:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:30 compute-0 nova_compute[186544]: 2025-11-22 07:39:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:32 compute-0 nova_compute[186544]: 2025-11-22 07:39:32.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:32 compute-0 nova_compute[186544]: 2025-11-22 07:39:32.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:33 compute-0 podman[212978]: 2025-11-22 07:39:33.407862598 +0000 UTC m=+0.057136498 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Nov 22 07:39:36 compute-0 nova_compute[186544]: 2025-11-22 07:39:36.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:39:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:39:37.308 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:39:37.308 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:39:37.308 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.328 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "5bdfb56f-9385-45bb-918d-0662112228b2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.328 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "5bdfb56f-9385-45bb-918d-0662112228b2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.358 186548 DEBUG nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.537 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.538 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.543 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.544 186548 INFO nova.compute.claims [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.655 186548 DEBUG nova.compute.provider_tree [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.666 186548 DEBUG nova.scheduler.client.report [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.685 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.685 186548 DEBUG nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.737 186548 DEBUG nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.751 186548 INFO nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:39:39 compute-0 nova_compute[186544]: 2025-11-22 07:39:39.862 186548 DEBUG nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:39:40 compute-0 nova_compute[186544]: 2025-11-22 07:39:40.063 186548 DEBUG nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:39:40 compute-0 nova_compute[186544]: 2025-11-22 07:39:40.064 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:39:40 compute-0 nova_compute[186544]: 2025-11-22 07:39:40.064 186548 INFO nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Creating image(s)
Nov 22 07:39:40 compute-0 nova_compute[186544]: 2025-11-22 07:39:40.065 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "/var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:40 compute-0 nova_compute[186544]: 2025-11-22 07:39:40.065 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "/var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:40 compute-0 nova_compute[186544]: 2025-11-22 07:39:40.066 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "/var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:40 compute-0 nova_compute[186544]: 2025-11-22 07:39:40.066 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:40 compute-0 nova_compute[186544]: 2025-11-22 07:39:40.067 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:42 compute-0 nova_compute[186544]: 2025-11-22 07:39:42.412 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:42 compute-0 nova_compute[186544]: 2025-11-22 07:39:42.465 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:42 compute-0 nova_compute[186544]: 2025-11-22 07:39:42.466 186548 DEBUG nova.virt.images [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] eb6eb4ac-7956-4021-b3a0-d612ae61d38c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 22 07:39:42 compute-0 nova_compute[186544]: 2025-11-22 07:39:42.468 186548 DEBUG nova.privsep.utils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:39:42 compute-0 nova_compute[186544]: 2025-11-22 07:39:42.469 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:43 compute-0 nova_compute[186544]: 2025-11-22 07:39:43.417 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.part /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted" returned: 0 in 0.948s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:43 compute-0 nova_compute[186544]: 2025-11-22 07:39:43.422 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:43 compute-0 podman[213010]: 2025-11-22 07:39:43.446491802 +0000 UTC m=+0.089239082 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 07:39:43 compute-0 podman[213011]: 2025-11-22 07:39:43.497693085 +0000 UTC m=+0.137966815 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:39:43 compute-0 nova_compute[186544]: 2025-11-22 07:39:43.497 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726.converted --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:43 compute-0 nova_compute[186544]: 2025-11-22 07:39:43.500 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.433s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:43 compute-0 nova_compute[186544]: 2025-11-22 07:39:43.515 186548 INFO oslo.privsep.daemon [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpd_sqr5ry/privsep.sock']
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.164 186548 INFO oslo.privsep.daemon [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Spawned new privsep daemon via rootwrap
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.043 213059 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.048 213059 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.050 213059 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.050 213059 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213059
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.239 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.288 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.289 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.290 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.305 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.361 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.362 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.417 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.418 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.418 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.473 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.474 186548 DEBUG nova.virt.disk.api [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Checking if we can resize image /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.474 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.526 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.528 186548 DEBUG nova.virt.disk.api [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Cannot resize image /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.528 186548 DEBUG nova.objects.instance [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'migration_context' on Instance uuid 5bdfb56f-9385-45bb-918d-0662112228b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.545 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.546 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Ensure instance console log exists: /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.546 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.547 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.547 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.550 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.556 186548 WARNING nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.561 186548 DEBUG nova.virt.libvirt.host [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.561 186548 DEBUG nova.virt.libvirt.host [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.564 186548 DEBUG nova.virt.libvirt.host [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.565 186548 DEBUG nova.virt.libvirt.host [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.566 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.567 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.567 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.567 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.567 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.568 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.568 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.568 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.568 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.569 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.569 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.569 186548 DEBUG nova.virt.hardware [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.573 186548 DEBUG nova.privsep.utils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.574 186548 DEBUG nova.objects.instance [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5bdfb56f-9385-45bb-918d-0662112228b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.591 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <uuid>5bdfb56f-9385-45bb-918d-0662112228b2</uuid>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <name>instance-00000002</name>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <nova:name>tempest-AutoAllocateNetworkTest-server-1910697718</nova:name>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:39:44</nova:creationTime>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:39:44 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:39:44 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:39:44 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:39:44 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:39:44 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:39:44 compute-0 nova_compute[186544]:         <nova:user uuid="12b223a79f8b4927861908eb11663fb5">tempest-AutoAllocateNetworkTest-83498172-project-member</nova:user>
Nov 22 07:39:44 compute-0 nova_compute[186544]:         <nova:project uuid="98627e04b62e4ce4bf9650377c674f73">tempest-AutoAllocateNetworkTest-83498172</nova:project>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <system>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <entry name="serial">5bdfb56f-9385-45bb-918d-0662112228b2</entry>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <entry name="uuid">5bdfb56f-9385-45bb-918d-0662112228b2</entry>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     </system>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <os>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   </os>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <features>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   </features>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk.config"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/console.log" append="off"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <video>
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     </video>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:39:44 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:39:44 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:39:44 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:39:44 compute-0 nova_compute[186544]: </domain>
Nov 22 07:39:44 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.647 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.648 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:39:44 compute-0 nova_compute[186544]: 2025-11-22 07:39:44.648 186548 INFO nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Using config drive
Nov 22 07:39:45 compute-0 nova_compute[186544]: 2025-11-22 07:39:45.760 186548 INFO nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Creating config drive at /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk.config
Nov 22 07:39:45 compute-0 nova_compute[186544]: 2025-11-22 07:39:45.765 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8ew1yt1u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:45 compute-0 nova_compute[186544]: 2025-11-22 07:39:45.885 186548 DEBUG oslo_concurrency.processutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8ew1yt1u" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:46 compute-0 systemd-machined[152872]: New machine qemu-1-instance-00000002.
Nov 22 07:39:46 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.244 186548 DEBUG nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.245 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.249 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797186.2487702, 5bdfb56f-9385-45bb-918d-0662112228b2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.249 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] VM Resumed (Lifecycle Event)
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.253 186548 INFO nova.virt.libvirt.driver [-] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Instance spawned successfully.
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.254 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.591 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.594 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.594 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.595 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.595 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.595 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.596 186548 DEBUG nova.virt.libvirt.driver [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.599 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.634 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.635 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797186.2517822, 5bdfb56f-9385-45bb-918d-0662112228b2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.635 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] VM Started (Lifecycle Event)
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.654 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.657 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.691 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.707 186548 INFO nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Took 6.64 seconds to spawn the instance on the hypervisor.
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.708 186548 DEBUG nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.791 186548 INFO nova.compute.manager [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Took 7.34 seconds to build instance.
Nov 22 07:39:46 compute-0 nova_compute[186544]: 2025-11-22 07:39:46.853 186548 DEBUG oslo_concurrency.lockutils [None req-958fe321-4686-4389-bda4-00bc94d6d439 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "5bdfb56f-9385-45bb-918d-0662112228b2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:48 compute-0 podman[213103]: 2025-11-22 07:39:48.404964972 +0000 UTC m=+0.051320669 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:39:52 compute-0 podman[213127]: 2025-11-22 07:39:52.396494158 +0000 UTC m=+0.047931636 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 07:39:53 compute-0 nova_compute[186544]: 2025-11-22 07:39:53.715 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:53 compute-0 nova_compute[186544]: 2025-11-22 07:39:53.715 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:53 compute-0 nova_compute[186544]: 2025-11-22 07:39:53.740 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:39:53 compute-0 nova_compute[186544]: 2025-11-22 07:39:53.840 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:53 compute-0 nova_compute[186544]: 2025-11-22 07:39:53.840 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:53 compute-0 nova_compute[186544]: 2025-11-22 07:39:53.848 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:39:53 compute-0 nova_compute[186544]: 2025-11-22 07:39:53.848 186548 INFO nova.compute.claims [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.147 186548 DEBUG nova.compute.provider_tree [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.185 186548 ERROR nova.scheduler.client.report [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [req-36344fe5-ac60-4fec-81a3-287f869a1f21] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 0a011418-630a-4be8-ab23-41ec1c11a5ea.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-36344fe5-ac60-4fec-81a3-287f869a1f21"}]}
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.209 186548 DEBUG nova.scheduler.client.report [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.241 186548 DEBUG nova.scheduler.client.report [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.242 186548 DEBUG nova.compute.provider_tree [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.254 186548 DEBUG nova.scheduler.client.report [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.277 186548 DEBUG nova.scheduler.client.report [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.358 186548 DEBUG nova.compute.provider_tree [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.482 186548 DEBUG nova.scheduler.client.report [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Updated inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with generation 8 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.482 186548 DEBUG nova.compute.provider_tree [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Updating resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea generation from 8 to 9 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.483 186548 DEBUG nova.compute.provider_tree [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.507 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.508 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.603 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.603 186548 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.638 186548 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.660 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.759 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.760 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.761 186548 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Creating image(s)
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.762 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "/var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.762 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "/var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.763 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "/var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.775 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.831 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.832 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.833 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.844 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.896 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.897 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.954 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.955 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:54 compute-0 nova_compute[186544]: 2025-11-22 07:39:54.956 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.009 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.011 186548 DEBUG nova.virt.disk.api [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Checking if we can resize image /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.011 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.032 186548 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Automatically allocating a network for project 98627e04b62e4ce4bf9650377c674f73. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.085 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.086 186548 DEBUG nova.virt.disk.api [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Cannot resize image /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.086 186548 DEBUG nova.objects.instance [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'migration_context' on Instance uuid 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.097 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.098 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Ensure instance console log exists: /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.099 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.099 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:39:55 compute-0 nova_compute[186544]: 2025-11-22 07:39:55.100 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:39:55 compute-0 podman[213161]: 2025-11-22 07:39:55.406532451 +0000 UTC m=+0.058527032 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 07:39:59 compute-0 podman[213181]: 2025-11-22 07:39:59.396086639 +0000 UTC m=+0.041281256 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:40:04 compute-0 podman[213224]: 2025-11-22 07:40:04.397135636 +0000 UTC m=+0.049322140 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public)
Nov 22 07:40:07 compute-0 nova_compute[186544]: 2025-11-22 07:40:07.674 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:07 compute-0 nova_compute[186544]: 2025-11-22 07:40:07.675 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:07 compute-0 nova_compute[186544]: 2025-11-22 07:40:07.732 186548 DEBUG nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:40:07 compute-0 nova_compute[186544]: 2025-11-22 07:40:07.866 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:07 compute-0 nova_compute[186544]: 2025-11-22 07:40:07.867 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:07 compute-0 nova_compute[186544]: 2025-11-22 07:40:07.873 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:40:07 compute-0 nova_compute[186544]: 2025-11-22 07:40:07.873 186548 INFO nova.compute.claims [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.103 186548 DEBUG nova.compute.provider_tree [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.133 186548 DEBUG nova.scheduler.client.report [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.170 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.171 186548 DEBUG nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.263 186548 DEBUG nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.264 186548 DEBUG nova.network.neutron [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.302 186548 INFO nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.343 186548 DEBUG nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.524 186548 DEBUG nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.526 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.526 186548 INFO nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Creating image(s)
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.527 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "/var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.527 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "/var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.528 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "/var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.540 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.601 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.603 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.603 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.614 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.692 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.694 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.794 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk 1073741824" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.795 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.796 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.849 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.850 186548 DEBUG nova.virt.disk.api [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Checking if we can resize image /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.850 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.902 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.903 186548 DEBUG nova.virt.disk.api [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Cannot resize image /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.904 186548 DEBUG nova.objects.instance [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lazy-loading 'migration_context' on Instance uuid 7dac5f01-e074-40c1-9f9a-5d514359dd0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.942 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.943 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Ensure instance console log exists: /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.943 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.944 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:08 compute-0 nova_compute[186544]: 2025-11-22 07:40:08.945 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.419 186548 DEBUG nova.network.neutron [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.420 186548 DEBUG nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.422 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.426 186548 WARNING nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.432 186548 DEBUG nova.virt.libvirt.host [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.433 186548 DEBUG nova.virt.libvirt.host [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.436 186548 DEBUG nova.virt.libvirt.host [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.436 186548 DEBUG nova.virt.libvirt.host [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.437 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.438 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.438 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.438 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.438 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.439 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.439 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.439 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.439 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.440 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.440 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.440 186548 DEBUG nova.virt.hardware [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.443 186548 DEBUG nova.objects.instance [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lazy-loading 'pci_devices' on Instance uuid 7dac5f01-e074-40c1-9f9a-5d514359dd0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.460 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <uuid>7dac5f01-e074-40c1-9f9a-5d514359dd0e</uuid>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <name>instance-00000009</name>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <nova:name>tempest-LiveMigrationNegativeTest-server-1715284301</nova:name>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:40:09</nova:creationTime>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:40:09 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:40:09 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:40:09 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:40:09 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:40:09 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:40:09 compute-0 nova_compute[186544]:         <nova:user uuid="d7697c5198974e5ca1152e4c64815e29">tempest-LiveMigrationNegativeTest-555543298-project-member</nova:user>
Nov 22 07:40:09 compute-0 nova_compute[186544]:         <nova:project uuid="463b8a0ac0be4ebbb7491f91038a890f">tempest-LiveMigrationNegativeTest-555543298</nova:project>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <system>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <entry name="serial">7dac5f01-e074-40c1-9f9a-5d514359dd0e</entry>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <entry name="uuid">7dac5f01-e074-40c1-9f9a-5d514359dd0e</entry>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     </system>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <os>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   </os>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <features>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   </features>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk.config"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/console.log" append="off"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <video>
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     </video>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:40:09 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:40:09 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:40:09 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:40:09 compute-0 nova_compute[186544]: </domain>
Nov 22 07:40:09 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.577 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.578 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.578 186548 INFO nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Using config drive
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.890 186548 INFO nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Creating config drive at /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk.config
Nov 22 07:40:09 compute-0 nova_compute[186544]: 2025-11-22 07:40:09.895 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaiqg9t6u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:10 compute-0 nova_compute[186544]: 2025-11-22 07:40:10.023 186548 DEBUG oslo_concurrency.processutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaiqg9t6u" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:10 compute-0 systemd-machined[152872]: New machine qemu-2-instance-00000009.
Nov 22 07:40:10 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000009.
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.093 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797211.0918236, 7dac5f01-e074-40c1-9f9a-5d514359dd0e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.094 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] VM Resumed (Lifecycle Event)
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.096 186548 DEBUG nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.096 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.100 186548 INFO nova.virt.libvirt.driver [-] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Instance spawned successfully.
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.101 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.134 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.140 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.143 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.143 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.143 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.144 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.144 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.144 186548 DEBUG nova.virt.libvirt.driver [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.196 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.196 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797211.0935218, 7dac5f01-e074-40c1-9f9a-5d514359dd0e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.197 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] VM Started (Lifecycle Event)
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.252 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.255 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.303 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.314 186548 INFO nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Took 2.79 seconds to spawn the instance on the hypervisor.
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.315 186548 DEBUG nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.400 186548 INFO nova.compute.manager [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Took 3.59 seconds to build instance.
Nov 22 07:40:11 compute-0 nova_compute[186544]: 2025-11-22 07:40:11.422 186548 DEBUG oslo_concurrency.lockutils [None req-605e17e0-67b8-4a12-a967-c15779b26fd4 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:13 compute-0 nova_compute[186544]: 2025-11-22 07:40:13.762 186548 DEBUG nova.objects.instance [None req-22eb6c3c-573f-4aa4-8cdd-f76d44972ce5 f247f4272a41405a9ceba3b34e898209 3cb41ee119514e32aecf0e39d953e5e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7dac5f01-e074-40c1-9f9a-5d514359dd0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:40:13 compute-0 nova_compute[186544]: 2025-11-22 07:40:13.799 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797213.7990396, 7dac5f01-e074-40c1-9f9a-5d514359dd0e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:40:13 compute-0 nova_compute[186544]: 2025-11-22 07:40:13.799 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] VM Paused (Lifecycle Event)
Nov 22 07:40:13 compute-0 nova_compute[186544]: 2025-11-22 07:40:13.830 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:13 compute-0 nova_compute[186544]: 2025-11-22 07:40:13.835 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:40:13 compute-0 nova_compute[186544]: 2025-11-22 07:40:13.860 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 22 07:40:14 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 22 07:40:14 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000009.scope: Consumed 3.764s CPU time.
Nov 22 07:40:14 compute-0 systemd-machined[152872]: Machine qemu-2-instance-00000009 terminated.
Nov 22 07:40:14 compute-0 podman[213292]: 2025-11-22 07:40:14.276157412 +0000 UTC m=+0.063481482 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118)
Nov 22 07:40:14 compute-0 podman[213293]: 2025-11-22 07:40:14.329331342 +0000 UTC m=+0.108614778 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 22 07:40:14 compute-0 nova_compute[186544]: 2025-11-22 07:40:14.395 186548 DEBUG nova.compute.manager [None req-22eb6c3c-573f-4aa4-8cdd-f76d44972ce5 f247f4272a41405a9ceba3b34e898209 3cb41ee119514e32aecf0e39d953e5e9 - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.178 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.179 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.179 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.180 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.180 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.186 186548 INFO nova.compute.manager [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Terminating instance
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.191 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "refresh_cache-7dac5f01-e074-40c1-9f9a-5d514359dd0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.192 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquired lock "refresh_cache-7dac5f01-e074-40c1-9f9a-5d514359dd0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.192 186548 DEBUG nova.network.neutron [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:40:19 compute-0 podman[213347]: 2025-11-22 07:40:19.394430353 +0000 UTC m=+0.047362311 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:40:19 compute-0 nova_compute[186544]: 2025-11-22 07:40:19.635 186548 DEBUG nova.network.neutron [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.607 186548 DEBUG nova.network.neutron [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.627 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Releasing lock "refresh_cache-7dac5f01-e074-40c1-9f9a-5d514359dd0e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.627 186548 DEBUG nova.compute.manager [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.634 186548 INFO nova.virt.libvirt.driver [-] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Instance destroyed successfully.
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.634 186548 DEBUG nova.objects.instance [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lazy-loading 'resources' on Instance uuid 7dac5f01-e074-40c1-9f9a-5d514359dd0e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.650 186548 INFO nova.virt.libvirt.driver [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Deleting instance files /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e_del
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.651 186548 INFO nova.virt.libvirt.driver [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Deletion of /var/lib/nova/instances/7dac5f01-e074-40c1-9f9a-5d514359dd0e_del complete
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.750 186548 DEBUG nova.virt.libvirt.host [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.752 186548 INFO nova.virt.libvirt.host [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] UEFI support detected
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.759 186548 INFO nova.compute.manager [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Took 0.13 seconds to destroy the instance on the hypervisor.
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.760 186548 DEBUG oslo.service.loopingcall [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.760 186548 DEBUG nova.compute.manager [-] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:40:20 compute-0 nova_compute[186544]: 2025-11-22 07:40:20.760 186548 DEBUG nova.network.neutron [-] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.436 186548 DEBUG nova.network.neutron [-] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.467 186548 DEBUG nova.network.neutron [-] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.490 186548 INFO nova.compute.manager [-] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Took 0.73 seconds to deallocate network for instance.
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.632 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.633 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.832 186548 DEBUG nova.compute.provider_tree [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.845 186548 DEBUG nova.scheduler.client.report [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.868 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.890 186548 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Automatically allocated network: {'id': 'cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'name': 'auto_allocated_network', 'tenant_id': '98627e04b62e4ce4bf9650377c674f73', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['68dcf6dc-373a-4168-81d5-f04abc5d8ac8', '826b0fb5-b3d0-49b5-b40e-079f62557646'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-11-22T07:39:55Z', 'updated_at': '2025-11-22T07:40:10Z', 'revision_number': 4, 'project_id': '98627e04b62e4ce4bf9650377c674f73'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.900 186548 WARNING oslo_policy.policy [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.900 186548 WARNING oslo_policy.policy [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.902 186548 DEBUG nova.policy [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:40:21 compute-0 nova_compute[186544]: 2025-11-22 07:40:21.905 186548 INFO nova.scheduler.client.report [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Deleted allocations for instance 7dac5f01-e074-40c1-9f9a-5d514359dd0e
Nov 22 07:40:22 compute-0 nova_compute[186544]: 2025-11-22 07:40:22.041 186548 DEBUG oslo_concurrency.lockutils [None req-a383ca1b-fcd6-472e-9191-a548f3831ff1 d7697c5198974e5ca1152e4c64815e29 463b8a0ac0be4ebbb7491f91038a890f - - default default] Lock "7dac5f01-e074-40c1-9f9a-5d514359dd0e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.862s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:23.223 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:40:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:23.225 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:40:23 compute-0 podman[213372]: 2025-11-22 07:40:23.432003066 +0000 UTC m=+0.071056416 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:40:24 compute-0 nova_compute[186544]: 2025-11-22 07:40:24.180 186548 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Successfully created port: ff14ba4b-c046-4907-aa37-9db05ed22278 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:40:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:25.227 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:26 compute-0 podman[213393]: 2025-11-22 07:40:26.433639513 +0000 UTC m=+0.082147081 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 07:40:27 compute-0 nova_compute[186544]: 2025-11-22 07:40:27.497 186548 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Successfully updated port: ff14ba4b-c046-4907-aa37-9db05ed22278 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:40:27 compute-0 nova_compute[186544]: 2025-11-22 07:40:27.648 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "refresh_cache-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:40:27 compute-0 nova_compute[186544]: 2025-11-22 07:40:27.649 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquired lock "refresh_cache-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:40:27 compute-0 nova_compute[186544]: 2025-11-22 07:40:27.649 186548 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:40:27 compute-0 nova_compute[186544]: 2025-11-22 07:40:27.988 186548 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.139 186548 DEBUG nova.compute.manager [req-9dcf37ac-110b-4d0f-b182-97390de7696e req-05474d51-06d8-469b-a29d-f0a7e4b84adf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received event network-changed-ff14ba4b-c046-4907-aa37-9db05ed22278 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.140 186548 DEBUG nova.compute.manager [req-9dcf37ac-110b-4d0f-b182-97390de7696e req-05474d51-06d8-469b-a29d-f0a7e4b84adf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Refreshing instance network info cache due to event network-changed-ff14ba4b-c046-4907-aa37-9db05ed22278. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.140 186548 DEBUG oslo_concurrency.lockutils [req-9dcf37ac-110b-4d0f-b182-97390de7696e req-05474d51-06d8-469b-a29d-f0a7e4b84adf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.185 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.280 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.336 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.338 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.394 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.562 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.563 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5874MB free_disk=73.43431854248047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.563 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.564 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.675 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 5bdfb56f-9385-45bb-918d-0662112228b2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.676 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.676 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.677 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.754 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.776 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.801 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:40:28 compute-0 nova_compute[186544]: 2025-11-22 07:40:28.802 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:29 compute-0 nova_compute[186544]: 2025-11-22 07:40:29.397 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797214.3956113, 7dac5f01-e074-40c1-9f9a-5d514359dd0e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:40:29 compute-0 nova_compute[186544]: 2025-11-22 07:40:29.398 186548 INFO nova.compute.manager [-] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] VM Stopped (Lifecycle Event)
Nov 22 07:40:29 compute-0 nova_compute[186544]: 2025-11-22 07:40:29.419 186548 DEBUG nova.compute.manager [None req-a25c825c-9534-42d8-aa6d-3421d739f67e - - - - - -] [instance: 7dac5f01-e074-40c1-9f9a-5d514359dd0e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:30 compute-0 podman[213422]: 2025-11-22 07:40:30.41841166 +0000 UTC m=+0.063796057 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:40:30 compute-0 nova_compute[186544]: 2025-11-22 07:40:30.798 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:30 compute-0 nova_compute[186544]: 2025-11-22 07:40:30.799 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:30 compute-0 nova_compute[186544]: 2025-11-22 07:40:30.800 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:40:30 compute-0 nova_compute[186544]: 2025-11-22 07:40:30.800 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:40:30 compute-0 nova_compute[186544]: 2025-11-22 07:40:30.833 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 07:40:31 compute-0 nova_compute[186544]: 2025-11-22 07:40:31.850 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-5bdfb56f-9385-45bb-918d-0662112228b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:40:31 compute-0 nova_compute[186544]: 2025-11-22 07:40:31.850 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-5bdfb56f-9385-45bb-918d-0662112228b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:40:31 compute-0 nova_compute[186544]: 2025-11-22 07:40:31.851 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:40:31 compute-0 nova_compute[186544]: 2025-11-22 07:40:31.851 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5bdfb56f-9385-45bb-918d-0662112228b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.734 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.781 186548 DEBUG nova.network.neutron [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Updating instance_info_cache with network_info: [{"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.932 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Releasing lock "refresh_cache-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.933 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Instance network_info: |[{"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.933 186548 DEBUG oslo_concurrency.lockutils [req-9dcf37ac-110b-4d0f-b182-97390de7696e req-05474d51-06d8-469b-a29d-f0a7e4b84adf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.934 186548 DEBUG nova.network.neutron [req-9dcf37ac-110b-4d0f-b182-97390de7696e req-05474d51-06d8-469b-a29d-f0a7e4b84adf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Refreshing network info cache for port ff14ba4b-c046-4907-aa37-9db05ed22278 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.936 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Start _get_guest_xml network_info=[{"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.943 186548 WARNING nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.947 186548 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.948 186548 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.953 186548 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.953 186548 DEBUG nova.virt.libvirt.host [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.955 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.955 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.956 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.956 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.956 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.957 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.957 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.957 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.958 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.958 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.958 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.958 186548 DEBUG nova.virt.hardware [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.962 186548 DEBUG nova.virt.libvirt.vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:39:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1346960213-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1346960213-2',id=5,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98627e04b62e4ce4bf9650377c674f73',ramdisk_id='',reservation_id='r-qng9bmzp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-83498172',owner_user_name='tempest-AutoAllocateNetworkTest-83498172-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:39:54Z,user_data=None,user_id='12b223a79f8b4927861908eb11663fb5',uuid=3ed60b2a-f4cf-4b63-81bb-099f566d7e6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.963 186548 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converting VIF {"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.963 186548 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:6a:39,bridge_name='br-int',has_traffic_filtering=True,id=ff14ba4b-c046-4907-aa37-9db05ed22278,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff14ba4b-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.967 186548 DEBUG nova.objects.instance [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.979 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <uuid>3ed60b2a-f4cf-4b63-81bb-099f566d7e6c</uuid>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <name>instance-00000005</name>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <nova:name>tempest-tempest.common.compute-instance-1346960213-2</nova:name>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:40:32</nova:creationTime>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:40:32 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:40:32 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:40:32 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:40:32 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:40:32 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:40:32 compute-0 nova_compute[186544]:         <nova:user uuid="12b223a79f8b4927861908eb11663fb5">tempest-AutoAllocateNetworkTest-83498172-project-member</nova:user>
Nov 22 07:40:32 compute-0 nova_compute[186544]:         <nova:project uuid="98627e04b62e4ce4bf9650377c674f73">tempest-AutoAllocateNetworkTest-83498172</nova:project>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:40:32 compute-0 nova_compute[186544]:         <nova:port uuid="ff14ba4b-c046-4907-aa37-9db05ed22278">
Nov 22 07:40:32 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="fdfe:381f:8400::53" ipVersion="6"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.1.0.45" ipVersion="4"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <system>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <entry name="serial">3ed60b2a-f4cf-4b63-81bb-099f566d7e6c</entry>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <entry name="uuid">3ed60b2a-f4cf-4b63-81bb-099f566d7e6c</entry>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </system>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <os>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   </os>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <features>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   </features>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.config"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:82:6a:39"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <target dev="tapff14ba4b-c0"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/console.log" append="off"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <video>
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </video>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:40:32 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:40:32 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:40:32 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:40:32 compute-0 nova_compute[186544]: </domain>
Nov 22 07:40:32 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.981 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Preparing to wait for external event network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.981 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.981 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.981 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.982 186548 DEBUG nova.virt.libvirt.vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:39:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1346960213-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1346960213-2',id=5,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98627e04b62e4ce4bf9650377c674f73',ramdisk_id='',reservation_id='r-qng9bmzp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-83498172',owner_user_name='tempest-AutoAllocateNetworkTest-83498172-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:39:54Z,user_data=None,user_id='12b223a79f8b4927861908eb11663fb5',uuid=3ed60b2a-f4cf-4b63-81bb-099f566d7e6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.982 186548 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converting VIF {"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.983 186548 DEBUG nova.network.os_vif_util [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:6a:39,bridge_name='br-int',has_traffic_filtering=True,id=ff14ba4b-c046-4907-aa37-9db05ed22278,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff14ba4b-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:40:32 compute-0 nova_compute[186544]: 2025-11-22 07:40:32.983 186548 DEBUG os_vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:6a:39,bridge_name='br-int',has_traffic_filtering=True,id=ff14ba4b-c046-4907-aa37-9db05ed22278,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff14ba4b-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.016 186548 DEBUG ovsdbapp.backend.ovs_idl [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.017 186548 DEBUG ovsdbapp.backend.ovs_idl [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.017 186548 DEBUG ovsdbapp.backend.ovs_idl [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.017 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.018 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [POLLOUT] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.018 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.019 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.020 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.023 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.032 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.033 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.033 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.034 186548 INFO oslo.privsep.daemon [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp9s58dfoh/privsep.sock']
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.737 186548 INFO oslo.privsep.daemon [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Spawned new privsep daemon via rootwrap
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.603 213451 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.607 213451 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.609 213451 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 22 07:40:33 compute-0 nova_compute[186544]: 2025-11-22 07:40:33.609 213451 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213451
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.042 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.043 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff14ba4b-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.043 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapff14ba4b-c0, col_values=(('external_ids', {'iface-id': 'ff14ba4b-c046-4907-aa37-9db05ed22278', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:6a:39', 'vm-uuid': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.045 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:34 compute-0 NetworkManager[55036]: <info>  [1763797234.0461] manager: (tapff14ba4b-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.048 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.051 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.052 186548 INFO os_vif [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:6a:39,bridge_name='br-int',has_traffic_filtering=True,id=ff14ba4b-c046-4907-aa37-9db05ed22278,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff14ba4b-c0')
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.128 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.128 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.128 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] No VIF found with MAC fa:16:3e:82:6a:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.129 186548 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Using config drive
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.374 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.398 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-5bdfb56f-9385-45bb-918d-0662112228b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.398 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.399 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.399 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.400 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.400 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:40:34 compute-0 nova_compute[186544]: 2025-11-22 07:40:34.793 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.063 186548 INFO nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Creating config drive at /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.config
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.068 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv9fbfm_f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.190 186548 DEBUG oslo_concurrency.processutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv9fbfm_f" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:40:35 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 22 07:40:35 compute-0 kernel: tapff14ba4b-c0: entered promiscuous mode
Nov 22 07:40:35 compute-0 NetworkManager[55036]: <info>  [1763797235.2778] manager: (tapff14ba4b-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Nov 22 07:40:35 compute-0 ovn_controller[94843]: 2025-11-22T07:40:35Z|00027|binding|INFO|Claiming lport ff14ba4b-c046-4907-aa37-9db05ed22278 for this chassis.
Nov 22 07:40:35 compute-0 ovn_controller[94843]: 2025-11-22T07:40:35Z|00028|binding|INFO|ff14ba4b-c046-4907-aa37-9db05ed22278: Claiming fa:16:3e:82:6a:39 10.1.0.45 fdfe:381f:8400::53
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.279 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.283 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:35 compute-0 systemd-udevd[213493]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:40:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:35.301 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:6a:39 10.1.0.45 fdfe:381f:8400::53'], port_security=['fa:16:3e:82:6a:39 10.1.0.45 fdfe:381f:8400::53'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.45/26 fdfe:381f:8400::53/64', 'neutron:device_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98627e04b62e4ce4bf9650377c674f73', 'neutron:revision_number': '2', 'neutron:security_group_ids': '931bf7c3-500b-4034-8d8e-f18219ff1b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6120d3e5-4a9e-45cc-93a1-87b92bf94714, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=ff14ba4b-c046-4907-aa37-9db05ed22278) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:40:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:35.302 103805 INFO neutron.agent.ovn.metadata.agent [-] Port ff14ba4b-c046-4907-aa37-9db05ed22278 in datapath cd94b117-ddd2-457a-a1e9-a1e03ac67322 bound to our chassis
Nov 22 07:40:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:35.304 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd94b117-ddd2-457a-a1e9-a1e03ac67322
Nov 22 07:40:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:35.305 103805 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpnhp8ltg2/privsep.sock']
Nov 22 07:40:35 compute-0 NetworkManager[55036]: <info>  [1763797235.3237] device (tapff14ba4b-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:40:35 compute-0 NetworkManager[55036]: <info>  [1763797235.3246] device (tapff14ba4b-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:40:35 compute-0 podman[213467]: 2025-11-22 07:40:35.330212769 +0000 UTC m=+0.070955395 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible)
Nov 22 07:40:35 compute-0 systemd-machined[152872]: New machine qemu-3-instance-00000005.
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.375 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:35 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Nov 22 07:40:35 compute-0 ovn_controller[94843]: 2025-11-22T07:40:35Z|00029|binding|INFO|Setting lport ff14ba4b-c046-4907-aa37-9db05ed22278 ovn-installed in OVS
Nov 22 07:40:35 compute-0 ovn_controller[94843]: 2025-11-22T07:40:35Z|00030|binding|INFO|Setting lport ff14ba4b-c046-4907-aa37-9db05ed22278 up in Southbound
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.385 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.695 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797235.6949182, 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.696 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] VM Started (Lifecycle Event)
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.718 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.723 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797235.698619, 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.724 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] VM Paused (Lifecycle Event)
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.746 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.750 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:40:35 compute-0 nova_compute[186544]: 2025-11-22 07:40:35.774 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:36.092 103805 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:36.093 103805 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpnhp8ltg2/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:35.922 213522 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:35.928 213522 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:35.930 213522 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:35.931 213522 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213522
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:36.097 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bc2aa2d2-9b0f-45d8-a5e9-79276c8dbc19]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:36.686 213522 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:36.686 213522 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:36.686 213522 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:37 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:37.200 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}759fdb23b70b4c3cdb60c99d9c5ba42769d1b8be5932a6fa01987fe919cac381" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.208 186548 DEBUG nova.compute.manager [req-e140ca59-cb71-499f-ac69-c6d04298c5c9 req-82808a4f-853f-4e6a-a8ba-f486b5275f7c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received event network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.208 186548 DEBUG oslo_concurrency.lockutils [req-e140ca59-cb71-499f-ac69-c6d04298c5c9 req-82808a4f-853f-4e6a-a8ba-f486b5275f7c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.210 186548 DEBUG oslo_concurrency.lockutils [req-e140ca59-cb71-499f-ac69-c6d04298c5c9 req-82808a4f-853f-4e6a-a8ba-f486b5275f7c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.210 186548 DEBUG oslo_concurrency.lockutils [req-e140ca59-cb71-499f-ac69-c6d04298c5c9 req-82808a4f-853f-4e6a-a8ba-f486b5275f7c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.210 186548 DEBUG nova.compute.manager [req-e140ca59-cb71-499f-ac69-c6d04298c5c9 req-82808a4f-853f-4e6a-a8ba-f486b5275f7c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Processing event network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.212 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.216 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.217 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797237.2165718, 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.218 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] VM Resumed (Lifecycle Event)
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.308 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.310 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.310 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.326 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b9813c39-4db5-427e-b826-9c5946051f3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.328 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd94b117-d1 in ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.328 186548 INFO nova.virt.libvirt.driver [-] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Instance spawned successfully.
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.329 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.330 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd94b117-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.330 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[062d6903-3f1f-4db8-b574-a4fb771e2185]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.333 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4b2cf9-2f1b-431e-b79d-abe6052566ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.362 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.362 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.363 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.363 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.363 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.364 186548 DEBUG nova.virt.libvirt.driver [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.367 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.368 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[731ef47b-d397-4670-abcd-7982af861eef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.369 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.387 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[94134fb3-277f-4eef-bc15-f9496f881178]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.389 103805 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpuru5h9lt/privsep.sock']
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.430 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.485 186548 INFO nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Took 42.73 seconds to spawn the instance on the hypervisor.
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.485 186548 DEBUG nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:40:37 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:37.587 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1183 Content-Type: application/json Date: Sat, 22 Nov 2025 07:40:37 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-18477595-31f7-4559-82f9-7c36ea2e9796 x-openstack-request-id: req-18477595-31f7-4559-82f9-7c36ea2e9796 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 07:40:37 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:37.587 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1890587748", "name": "tempest-flavor_with_ephemeral_0-22804361", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1890587748"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1890587748"}]}, {"id": "1c351edf-5b2d-477d-93d0-c380bdae83e7", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1c351edf-5b2d-477d-93d0-c380bdae83e7"}]}, {"id": "31612188-3cd6-428b-9166-9568f0affd4a", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}]}, {"id": "962835561", "name": "tempest-flavor_with_ephemeral_1-2110312460", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/962835561"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/962835561"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 07:40:37 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:37.587 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-18477595-31f7-4559-82f9-7c36ea2e9796 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 07:40:37 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:37.589 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}759fdb23b70b4c3cdb60c99d9c5ba42769d1b8be5932a6fa01987fe919cac381" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.638 186548 INFO nova.compute.manager [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Took 43.83 seconds to build instance.
Nov 22 07:40:37 compute-0 nova_compute[186544]: 2025-11-22 07:40:37.698 186548 DEBUG oslo_concurrency.lockutils [None req-d79bdbf3-6a21-41c1-9288-2019f5e78cdd 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 43.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:38.115 103805 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:38.115 103805 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpuru5h9lt/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.986 213536 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.991 213536 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.993 213536 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:37.993 213536 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213536
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:38.118 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b926dc5e-be21-4765-808e-5aa33e7bc8e7]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.432 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 22 Nov 2025 07:40:37 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-9991f9cc-d2b5-488e-9080-bf6bb728d91f x-openstack-request-id: req-9991f9cc-d2b5-488e-9080-bf6bb728d91f _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.432 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "31612188-3cd6-428b-9166-9568f0affd4a", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/31612188-3cd6-428b-9166-9568f0affd4a"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.432 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/31612188-3cd6-428b-9166-9568f0affd4a used request id req-9991f9cc-d2b5-488e-9080-bf6bb728d91f request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.433 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98627e04b62e4ce4bf9650377c674f73', 'user_id': '12b223a79f8b4927861908eb11663fb5', 'hostId': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.436 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'name': 'tempest-tempest.common.compute-instance-1346960213-2', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98627e04b62e4ce4bf9650377c674f73', 'user_id': '12b223a79f8b4927861908eb11663fb5', 'hostId': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.436 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.441 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c / tapff14ba4b-c0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.442 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f469c757-4309-4c7a-a6c5-bb8ba627c233', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.436883', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a097530-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': '8f686bddae09e9b9435889e447c7414cff3cbf56e37b94c20cf15ffd10fde652'}]}, 'timestamp': '2025-11-22 07:40:38.443250', '_unique_id': 'dede5ccc7bc54ee6a32b9400389ee353'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.449 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.451 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.451 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1910697718>, <NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1910697718>, <NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-2>]
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.452 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1a5af51-a7c8-4734-a401-73a4f54ede62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.452109', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a0ae780-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': 'b74eff10c3bbc1f8e3336c0cffe558dd0ec17eebca05c01d78a42d17274f4fbc'}]}, 'timestamp': '2025-11-22 07:40:38.452609', '_unique_id': '57758c5fcefa479f861c6ced506ed93b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.481 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.read.bytes volume: 30845440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.482 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.508 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.read.bytes volume: 9892352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.509 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd565b0e-cce2-422b-b26f-7bc13f90c579', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30845440, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.454004', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a0f7a34-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '65f7c338beb80a95a6b9d0c995067eb5187c308c1c05b54b5ed58c3ed11d08f0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.454004', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a0f877c-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '4ea715ca6f0294f64ec89da977516d5a6b5a53dd529355c9e62ebc05b45ecd26'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9892352, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.454004', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a139cfe-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': '6712dc8e31149b308b4269bf9784955d2aa09226ce7075881c01c420e2f4ec1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.454004', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a13acb2-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': '5c5e4d2c600cd203a811c6958317d8973eaf45b05074d67b37e7e9fffaee2488'}]}, 'timestamp': '2025-11-22 07:40:38.510074', '_unique_id': '4e63d5d26d9a4e359bd57921b4917148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.511 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.512 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.525 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.526 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.545 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.545 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb9602e9-8b88-4747-81c4-ca0761170461', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.512798', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a161c54-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.204789185, 'message_signature': '4f8bb28d226387224eaaed279a31485bd77e65caa301ec9f2c365f0fde3a1a2f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 
'5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.512798', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a1629f6-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.204789185, 'message_signature': '8eaec57e39a98304052a42c80f3a7283b2317cef3c30e22398e181706b325888'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.512798', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a19223c-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.218132124, 'message_signature': '8a2a0bd3081764147a105648038a0e5c0bc7bc4a26e1b51efb740fd737f8b282'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.512798', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a192dfe-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.218132124, 'message_signature': '648cf4d329b5046fd5472cca340e6f0272ab9706f1302e4ef48b4c1035a3b9c9'}]}, 'timestamp': '2025-11-22 07:40:38.546122', '_unique_id': '1af41b351a4247c3a286d89b98cbc9ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.547 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.548 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.565 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/cpu volume: 12830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.583 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/cpu volume: 1280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5261c31a-10de-4e2f-9aae-1aa7e80c4573', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12830000000, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'timestamp': '2025-11-22T07:40:38.548490', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8a1c255e-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.256516283, 'message_signature': 'ed004c2bdd1519f09919c8ed0ebc9903a413cf54219552f801ebfdd8d23a2765'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1280000000, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'timestamp': '2025-11-22T07:40:38.548490', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8a1ee582-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.274705072, 'message_signature': '7ef49544688cf0e72a0303791b44688f0161776e55af23ee71c1816942aeca7a'}]}, 'timestamp': '2025-11-22 07:40:38.583644', '_unique_id': '19e874d6e904438b88d93a78a5dfd02f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.584 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.585 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.585 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1910697718>, <NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1910697718>, <NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-2>]
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.585 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '032daa0e-9c31-49f1-8797-a66e93d19a5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.585805', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a1f48c4-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': '8954304ede89f371d4bd2ddae874a4f2100085e16104c5007881281afa85b197'}]}, 'timestamp': '2025-11-22 07:40:38.586105', '_unique_id': '1d0318c28392484f9460d9d8555a9f7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.586 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.587 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d059f2f-bd5c-47e8-9574-84c333dfceae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.587465', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a1f896a-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': 'f65b28b7314e124b46c941a21384796bb48710f7d0b90612aecd8ee7504b34d3'}]}, 'timestamp': '2025-11-22 07:40:38.587797', '_unique_id': '457c70ac9794460f8b2384855eca691f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.588 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.589 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.589 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.589 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.589 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d5c4ebc-058f-4294-b22f-89d78289fe41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.589007', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a1fc51a-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.204789185, 'message_signature': '3674662a51d09991ff763061ed733dc4c99305e2bbde7a68938b98766b9ee41c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 
'5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.589007', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a1fd08c-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.204789185, 'message_signature': 'c2a17e2aef7c60437fcf18398cff4693125f6687371db465280a8e6218b1c983'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.589007', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a1fd94c-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.218132124, 'message_signature': '6fe87c9fe3b0001d04b9e9aea132d623f0a301e7e0d5a03a468f50b9ae3dfd92'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.589007', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a1fe342-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.218132124, 'message_signature': '86b6a4e755db8bca94c3f20568e42fe057ec9e491704808e6158eb3dbdd48888'}]}, 'timestamp': '2025-11-22 07:40:38.590051', '_unique_id': '38ce81820e67493abcc8111c35cd066b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.590 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.591 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.591 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.read.latency volume: 1226830598 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.591 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.read.latency volume: 192328633 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.591 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.read.latency volume: 572822472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.read.latency volume: 6653034 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd402eab9-9412-4dc3-b41e-9c67a2bafe64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1226830598, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.591375', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a2021a4-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': 'ae4fda1c038dd33a814ff38afadb68bf185d5f1ac1eaf1569446ba26a8bd266c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 192328633, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': 
None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.591375', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a202c6c-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '0e04803efa6acd8bc916dc416151a4c57e7c4e5288c37c2b478727e447669205'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 572822472, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.591375', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a203676-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': 'f99ccf0e643f0d9973953abf527d98b3d455a1e1c5f763c8609f4409113fa818'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6653034, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.591375', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a20403a-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': '8e32ed34a723168d6cc9d0b8fdeb49d3c89bf14cc8e859af44c1fbe7bff6e280'}]}, 'timestamp': '2025-11-22 07:40:38.592427', '_unique_id': '0cf0a5e144434ba9bf736dc7ebbb1d89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.592 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.593 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.write.latency volume: 60540638624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.593 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.594 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.594 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a204bc7-dac9-4f7c-a046-535278943ee7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60540638624, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.593659', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a207b18-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '86aa35eb5c340c4327ff9b9645254066a3140d38dee5ac4bbf5fbf6d5b4fe1a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.593659', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a208676-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': 'e5154c6768f0f90d8572d1cfded9be4f3167ff61cefd77955193b42ec12938ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.593659', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a2091f2-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': '890b3547ff610564335b287fd391e430d9a7b3318c95e8f551a311193f8a8f83'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.593659', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a209ada-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': '2d5f83f4876eac0703f42dd2ca714b4067ccb35f8b817759abca88f4d344e762'}]}, 'timestamp': '2025-11-22 07:40:38.594756', '_unique_id': '7b98792e05a64ebea7f0f84df7cfcdd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.595 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b23a5b63-fcf5-4839-84b6-47f3d92434af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.595991', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a20d626-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': '459885d7195c08dab80aa646511ea0572201b0a3bbcf9da33781dc3dc661aa56'}]}, 'timestamp': '2025-11-22 07:40:38.596321', '_unique_id': '3c9b01a2f5e944e4ac5c6598e2bde039'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.596 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.597 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.597 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfd6ea39-0c08-4ab7-87cb-f0b2104ff125', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.597463', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a210f56-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': '53b6c3177f3addb0192c1d416cad5d550f68c9a3ae6ca70cd13fcbbb793d8bad'}]}, 'timestamp': '2025-11-22 07:40:38.597765', '_unique_id': 'ed7b5f53bb99405b8099fb87620f544d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.598 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.599 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.599 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.599 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1adf8c9b-097f-44d2-9441-b67a7776ccaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.598868', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a2145f2-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.204789185, 'message_signature': '2be816428ff9f63d62e32f6223f33ed98d136951cb481e3e97c7fbe85f2e0e32'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.598868', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a215a24-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.204789185, 'message_signature': 'bad1a4e974e37db2b79ae0019130cd90e9611e5c373a51ed2a8e4f69bb738db7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.598868', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a2163d4-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.218132124, 'message_signature': 'eb9d237d4f5f3f82d1d5169b4a5994bd77e6362bfe7bf5cff2f52a47e6bf46ad'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.598868', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a216f0a-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.218132124, 'message_signature': '5caff55fcfca4e0be4282c0dfd0be41e5d213434fef60cf4e6609484e80649f1'}]}, 'timestamp': '2025-11-22 07:40:38.600185', '_unique_id': '31e23de984a6461f9578cbb00a5008a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.601 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f2bc92f-2639-4f5d-9535-72afffea29c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.601521', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a21ae84-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': '0d6a1295bae8cb0f778d1becb2c6e2916f981f778f0a6ae46bf1f5b896a1fc21'}]}, 'timestamp': '2025-11-22 07:40:38.601840', '_unique_id': 'b63203fd599847b48f02b5aa88dbb202'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.602 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.603 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.read.requests volume: 1120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.603 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.603 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.read.requests volume: 323 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.603 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8f67033-fb10-4950-9abe-c35a489435ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1120, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.602986', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a21e70a-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '4d3729cf7e61226e10361fd84ce7fbeb3a5c657d8edf5ad564baaa1b2c1d6580'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.602986', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a21effc-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '36053de8a264b5c813c7589fac2cd29f1d7e353c9a4c3143346db6acc0c61ec7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 323, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.602986', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a21f93e-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': '0e6f326e3dae81044c3d2fb78eee3840dffaf4ea154fa1b3382c2589ee26fd3a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.602986', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a220168-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': 'b25dadc731f8eb43cb78a7d0201ba3a3bb7f1db70413a4be670de3c2f655b9ee'}]}, 'timestamp': '2025-11-22 07:40:38.603907', '_unique_id': '1285f7e99385419499b089cba114601d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.604 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.605 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.605 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.write.requests volume: 286 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.605 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.605 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.605 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9edc3df-8d54-4e12-8447-c0cc5d52aff6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 286, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.605171', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a223bd8-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '392f8ec161b0e61bfd35adf3c60f53d104b23cbe5fbe6167601c69fb063e3a09'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': 
None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.605171', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a2243da-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '9ea351a3c6f6b286389f73c886c018b4bb8775e9c9dd5316463738d5a2a635e8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.605171', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a224d6c-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': '321151fc14ea1a19e604b9c4167125ff52a6f7ccf25837720c568ecde284869a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.605171', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a2256ae-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': 'e329d04a037684578964ebc55890ee520c0e847916a80e51d22dffa9a8b98626'}]}, 'timestamp': '2025-11-22 07:40:38.606141', '_unique_id': '8d3a2c23ff1b44dab33bcf29149e755f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.606 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.607 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.607 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.607 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.607 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1d2d5ae-dbe2-4150-8efc-4718cfb5e2f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72953856, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-vda', 'timestamp': '2025-11-22T07:40:38.607486', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a22956a-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '8aa0d236fe6f1b1f2bdd447973bc41ce3cc8375c94ed30aa681e1d0b943e7867'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 
'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2-sda', 'timestamp': '2025-11-22T07:40:38.607486', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a229e3e-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.145885269, 'message_signature': '21a14b8f399154b7f4e0e9f79df0e85f8aa5ce5ca3ee717c9fb0560409f21b94'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-vda', 'timestamp': '2025-11-22T07:40:38.607486', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a22a5fa-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': 'd5338a93ead4c6215624440bc81cbe632032c8571758ceddea8f09637b89984b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-sda', 'timestamp': '2025-11-22T07:40:38.607486', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'instance-00000005', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a22ad48-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.174624589, 'message_signature': '1c0e74fcd6b5d3e606efa31ff8bab629467a4f938d269ba5c3f1bc0f870530be'}]}, 'timestamp': '2025-11-22 07:40:38.608321', '_unique_id': '3af108d52b8549eca103c475e3650015'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.609 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.609 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c974ae9-dd72-4076-bb87-615223c9ce46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.609471', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a22e326-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': '19d5398a935b2bfc036d6ce3c97c882dee66e4407ad39e9d55dfd979faeb0681'}]}, 'timestamp': '2025-11-22 07:40:38.609700', '_unique_id': 'daf56198a8194a42b6ac40dcfcaaaaad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.610 12 DEBUG ceilometer.compute.pollsters [-] 5bdfb56f-9385-45bb-918d-0662112228b2/memory.usage volume: 44.8828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c: ceilometer.compute.pollsters.NoVolumeException
Nov 22 07:40:38 compute-0 nova_compute[186544]: 2025-11-22 07:40:38.611 186548 DEBUG nova.network.neutron [req-9dcf37ac-110b-4d0f-b182-97390de7696e req-05474d51-06d8-469b-a29d-f0a7e4b84adf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Updated VIF entry in instance network info cache for port ff14ba4b-c046-4907-aa37-9db05ed22278. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:40:38 compute-0 nova_compute[186544]: 2025-11-22 07:40:38.611 186548 DEBUG nova.network.neutron [req-9dcf37ac-110b-4d0f-b182-97390de7696e req-05474d51-06d8-469b-a29d-f0a7e4b84adf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Updating instance_info_cache with network_info: [{"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '401d794a-e9b0-4866-b7dd-01ca8338acf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.8828125, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'timestamp': '2025-11-22T07:40:38.610847', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1910697718', 'name': 'instance-00000002', 'instance_id': '5bdfb56f-9385-45bb-918d-0662112228b2', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '8a231be8-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.256516283, 'message_signature': '793aca4c7496b77cb16e082f671a404421535f592f361ff65331eeabad692b9e'}]}, 'timestamp': '2025-11-22 07:40:38.611383', '_unique_id': 'b540a7b09d504c9781b232e334dc2f2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.611 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.613 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.613 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.613 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1910697718>, <NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1910697718>, <NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-2>]
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.613 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.613 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.613 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1910697718>, <NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1910697718>, <NovaLikeServer: tempest-tempest.common.compute-instance-1346960213-2>]
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.613 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.614 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6324dd06-b9e2-4846-ae40-3a8083fa21be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.614023', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a239d3e-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': 'ea20f9fb48623e8078c3eea9a7b5987a4bd4c3ad2d6a7093ff523e244c2ef906'}]}, 'timestamp': '2025-11-22 07:40:38.614512', '_unique_id': '57b44efedfba46229cde6fd6751fcd3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 DEBUG ceilometer.compute.pollsters [-] 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82b38232-ce35-4876-92b4-7892677d3a54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '12b223a79f8b4927861908eb11663fb5', 'user_name': None, 'project_id': '98627e04b62e4ce4bf9650377c674f73', 'project_name': None, 'resource_id': 'instance-00000005-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-tapff14ba4b-c0', 'timestamp': '2025-11-22T07:40:38.616032', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1346960213-2', 'name': 'tapff14ba4b-c0', 'instance_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'instance_type': 'm1.nano', 'host': '6256e3530c12fbc1fe9b188a07a33a9780cb122dd211a13a1665f16c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:6a:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapff14ba4b-c0'}, 'message_id': '8a23e488-c776-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 3979.131318309, 'message_signature': 'd5733847c03fdef9122ab522fef19986fe913405ef930e637456be924918b8ec'}]}, 'timestamp': '2025-11-22 07:40:38.616348', '_unique_id': '766835906082472d8d4b193f61dbfbfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:40:38 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:40:38.616 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:40:38 compute-0 nova_compute[186544]: 2025-11-22 07:40:38.637 186548 DEBUG oslo_concurrency.lockutils [req-9dcf37ac-110b-4d0f-b182-97390de7696e req-05474d51-06d8-469b-a29d-f0a7e4b84adf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:38.650 213536 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:38.650 213536 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:38.650 213536 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:38 compute-0 rsyslogd[1008]: imjournal from <np0005531886:ceilometer_agent_compute>: begin to drop messages due to rate-limiting
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.046 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.330 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2767627a-1cf3-4180-ba66-cc3c954a3cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 NetworkManager[55036]: <info>  [1763797239.3535] manager: (tapcd94b117-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.352 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[91eca823-22cd-4d9d-97fa-0b6201f3f192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 systemd-udevd[213548]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.386 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a44cd120-733a-49b5-8c6a-aff5031209ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.391 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c09a4e7d-53dd-49e6-b960-3dcb1dde9362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 NetworkManager[55036]: <info>  [1763797239.4236] device (tapcd94b117-d0): carrier: link connected
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.428 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f4faea-cb87-4f01-9be1-79d582ef6957]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.447 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2b9474-6d43-4619-9354-df21d99098c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd94b117-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:df:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398006, 'reachable_time': 17710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213566, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.463 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bbcb8479-cf01-4984-ba46-7e1254766b17]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:dfb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398006, 'tstamp': 398006}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213567, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.480 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[531e4405-4d40-44d6-b3a9-ff7c9da76009]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd94b117-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:df:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398006, 'reachable_time': 17710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213568, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.494 186548 DEBUG nova.compute.manager [req-a7532c2a-5dde-4d96-801a-22c281b57c35 req-ef72e022-dc87-4f19-acd1-d03d0209fae7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received event network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.494 186548 DEBUG oslo_concurrency.lockutils [req-a7532c2a-5dde-4d96-801a-22c281b57c35 req-ef72e022-dc87-4f19-acd1-d03d0209fae7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.494 186548 DEBUG oslo_concurrency.lockutils [req-a7532c2a-5dde-4d96-801a-22c281b57c35 req-ef72e022-dc87-4f19-acd1-d03d0209fae7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.494 186548 DEBUG oslo_concurrency.lockutils [req-a7532c2a-5dde-4d96-801a-22c281b57c35 req-ef72e022-dc87-4f19-acd1-d03d0209fae7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.494 186548 DEBUG nova.compute.manager [req-a7532c2a-5dde-4d96-801a-22c281b57c35 req-ef72e022-dc87-4f19-acd1-d03d0209fae7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] No waiting events found dispatching network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.495 186548 WARNING nova.compute.manager [req-a7532c2a-5dde-4d96-801a-22c281b57c35 req-ef72e022-dc87-4f19-acd1-d03d0209fae7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received unexpected event network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 for instance with vm_state active and task_state None.
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.509 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ece575-44ce-413c-9fc1-34654dfcdd74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.569 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[194c7d65-15f6-4e41-8413-ede78bf0a5ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.571 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd94b117-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.571 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.572 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd94b117-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:39 compute-0 kernel: tapcd94b117-d0: entered promiscuous mode
Nov 22 07:40:39 compute-0 NetworkManager[55036]: <info>  [1763797239.5763] manager: (tapcd94b117-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.574 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.582 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd94b117-d0, col_values=(('external_ids', {'iface-id': 'f15694ec-11c8-44d4-a18a-7277c1308d45'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:39 compute-0 ovn_controller[94843]: 2025-11-22T07:40:39Z|00031|binding|INFO|Releasing lport f15694ec-11c8-44d4-a18a-7277c1308d45 from this chassis (sb_readonly=0)
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.583 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.597 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.598 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.599 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1eae0d1c-b394-4786-81ca-6d0da8668fb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.600 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-cd94b117-ddd2-457a-a1e9-a1e03ac67322
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/cd94b117-ddd2-457a-a1e9-a1e03ac67322.pid.haproxy
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID cd94b117-ddd2-457a-a1e9-a1e03ac67322
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:40:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:39.601 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'env', 'PROCESS_TAG=haproxy-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd94b117-ddd2-457a-a1e9-a1e03ac67322.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:40:39 compute-0 nova_compute[186544]: 2025-11-22 07:40:39.796 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:40 compute-0 podman[213601]: 2025-11-22 07:40:39.949255225 +0000 UTC m=+0.027078411 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:40:40 compute-0 podman[213601]: 2025-11-22 07:40:40.078442007 +0000 UTC m=+0.156265163 container create 1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:40:40 compute-0 systemd[1]: Started libpod-conmon-1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547.scope.
Nov 22 07:40:40 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:40:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/172e43981677e4a882ca8668101a5c1dc492cf76637931127dd3ff9454f162f1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:40:40 compute-0 podman[213601]: 2025-11-22 07:40:40.232142244 +0000 UTC m=+0.309965420 container init 1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 07:40:40 compute-0 podman[213601]: 2025-11-22 07:40:40.239885665 +0000 UTC m=+0.317708821 container start 1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 07:40:40 compute-0 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213617]: [NOTICE]   (213621) : New worker (213623) forked
Nov 22 07:40:40 compute-0 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213617]: [NOTICE]   (213621) : Loading success.
Nov 22 07:40:44 compute-0 nova_compute[186544]: 2025-11-22 07:40:44.048 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:44 compute-0 podman[213632]: 2025-11-22 07:40:44.4095608 +0000 UTC m=+0.061191334 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 07:40:44 compute-0 podman[213633]: 2025-11-22 07:40:44.439376986 +0000 UTC m=+0.084591581 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:40:44 compute-0 nova_compute[186544]: 2025-11-22 07:40:44.798 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:49 compute-0 nova_compute[186544]: 2025-11-22 07:40:49.050 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:49 compute-0 nova_compute[186544]: 2025-11-22 07:40:49.803 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:50 compute-0 podman[213680]: 2025-11-22 07:40:50.42721342 +0000 UTC m=+0.063337626 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:40:52 compute-0 ovn_controller[94843]: 2025-11-22T07:40:52Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:6a:39 10.1.0.45
Nov 22 07:40:52 compute-0 ovn_controller[94843]: 2025-11-22T07:40:52Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:6a:39 10.1.0.45
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.881 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.882 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.882 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.883 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.883 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.891 186548 INFO nova.compute.manager [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Terminating instance
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.900 186548 DEBUG nova.compute.manager [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:40:53 compute-0 kernel: tapff14ba4b-c0 (unregistering): left promiscuous mode
Nov 22 07:40:53 compute-0 NetworkManager[55036]: <info>  [1763797253.9223] device (tapff14ba4b-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:53 compute-0 ovn_controller[94843]: 2025-11-22T07:40:53Z|00032|binding|INFO|Releasing lport ff14ba4b-c046-4907-aa37-9db05ed22278 from this chassis (sb_readonly=0)
Nov 22 07:40:53 compute-0 ovn_controller[94843]: 2025-11-22T07:40:53Z|00033|binding|INFO|Setting lport ff14ba4b-c046-4907-aa37-9db05ed22278 down in Southbound
Nov 22 07:40:53 compute-0 ovn_controller[94843]: 2025-11-22T07:40:53Z|00034|binding|INFO|Removing iface tapff14ba4b-c0 ovn-installed in OVS
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:53 compute-0 nova_compute[186544]: 2025-11-22 07:40:53.954 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:53 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 22 07:40:53 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 14.770s CPU time.
Nov 22 07:40:53 compute-0 systemd-machined[152872]: Machine qemu-3-instance-00000005 terminated.
Nov 22 07:40:54 compute-0 podman[213711]: 2025-11-22 07:40:54.00818796 +0000 UTC m=+0.059520892 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.052 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.150 186548 INFO nova.virt.libvirt.driver [-] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Instance destroyed successfully.
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.151 186548 DEBUG nova.objects.instance [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'resources' on Instance uuid 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.161 186548 DEBUG nova.virt.libvirt.vif [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:39:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1346960213-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1346960213-2',id=5,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-11-22T07:40:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98627e04b62e4ce4bf9650377c674f73',ramdisk_id='',reservation_id='r-qng9bmzp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-83498172',owner_user_name='tempest-AutoAllocateNetworkTest-83498172-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:40:37Z,user_data=None,user_id='12b223a79f8b4927861908eb11663fb5',uuid=3ed60b2a-f4cf-4b63-81bb-099f566d7e6c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.162 186548 DEBUG nova.network.os_vif_util [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converting VIF {"id": "ff14ba4b-c046-4907-aa37-9db05ed22278", "address": "fa:16:3e:82:6a:39", "network": {"id": "cd94b117-ddd2-457a-a1e9-a1e03ac67322", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::53", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98627e04b62e4ce4bf9650377c674f73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff14ba4b-c0", "ovs_interfaceid": "ff14ba4b-c046-4907-aa37-9db05ed22278", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.163 186548 DEBUG nova.network.os_vif_util [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:6a:39,bridge_name='br-int',has_traffic_filtering=True,id=ff14ba4b-c046-4907-aa37-9db05ed22278,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff14ba4b-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.163 186548 DEBUG os_vif [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:6a:39,bridge_name='br-int',has_traffic_filtering=True,id=ff14ba4b-c046-4907-aa37-9db05ed22278,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff14ba4b-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.165 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.165 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff14ba4b-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.168 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.169 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.171 186548 INFO os_vif [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:6a:39,bridge_name='br-int',has_traffic_filtering=True,id=ff14ba4b-c046-4907-aa37-9db05ed22278,network=Network(cd94b117-ddd2-457a-a1e9-a1e03ac67322),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff14ba4b-c0')
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.172 186548 INFO nova.virt.libvirt.driver [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Deleting instance files /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c_del
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.173 186548 INFO nova.virt.libvirt.driver [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Deletion of /var/lib/nova/instances/3ed60b2a-f4cf-4b63-81bb-099f566d7e6c_del complete
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.365 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:6a:39 10.1.0.45 fdfe:381f:8400::53'], port_security=['fa:16:3e:82:6a:39 10.1.0.45 fdfe:381f:8400::53'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.45/26 fdfe:381f:8400::53/64', 'neutron:device_id': '3ed60b2a-f4cf-4b63-81bb-099f566d7e6c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98627e04b62e4ce4bf9650377c674f73', 'neutron:revision_number': '4', 'neutron:security_group_ids': '931bf7c3-500b-4034-8d8e-f18219ff1b58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6120d3e5-4a9e-45cc-93a1-87b92bf94714, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=ff14ba4b-c046-4907-aa37-9db05ed22278) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.366 103805 INFO neutron.agent.ovn.metadata.agent [-] Port ff14ba4b-c046-4907-aa37-9db05ed22278 in datapath cd94b117-ddd2-457a-a1e9-a1e03ac67322 unbound from our chassis
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.367 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd94b117-ddd2-457a-a1e9-a1e03ac67322, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.368 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb81861-547c-4511-9e90-c5ab79eaef26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.369 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 namespace which is not needed anymore
Nov 22 07:40:54 compute-0 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213617]: [NOTICE]   (213621) : haproxy version is 2.8.14-c23fe91
Nov 22 07:40:54 compute-0 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213617]: [NOTICE]   (213621) : path to executable is /usr/sbin/haproxy
Nov 22 07:40:54 compute-0 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213617]: [WARNING]  (213621) : Exiting Master process...
Nov 22 07:40:54 compute-0 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213617]: [WARNING]  (213621) : Exiting Master process...
Nov 22 07:40:54 compute-0 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213617]: [ALERT]    (213621) : Current worker (213623) exited with code 143 (Terminated)
Nov 22 07:40:54 compute-0 neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322[213617]: [WARNING]  (213621) : All workers exited. Exiting... (0)
Nov 22 07:40:54 compute-0 systemd[1]: libpod-1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547.scope: Deactivated successfully.
Nov 22 07:40:54 compute-0 podman[213772]: 2025-11-22 07:40:54.605898917 +0000 UTC m=+0.156850356 container died 1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:40:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547-userdata-shm.mount: Deactivated successfully.
Nov 22 07:40:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-172e43981677e4a882ca8668101a5c1dc492cf76637931127dd3ff9454f162f1-merged.mount: Deactivated successfully.
Nov 22 07:40:54 compute-0 podman[213772]: 2025-11-22 07:40:54.725585183 +0000 UTC m=+0.276536622 container cleanup 1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:40:54 compute-0 systemd[1]: libpod-conmon-1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547.scope: Deactivated successfully.
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.807 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.863 186548 INFO nova.compute.manager [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Took 0.96 seconds to destroy the instance on the hypervisor.
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.864 186548 DEBUG oslo.service.loopingcall [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.865 186548 DEBUG nova.compute.manager [-] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.865 186548 DEBUG nova.network.neutron [-] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:40:54 compute-0 podman[213804]: 2025-11-22 07:40:54.890027456 +0000 UTC m=+0.135134049 container remove 1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.895 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[01db382d-0227-449d-ac7f-432dce38849b]: (4, ('Sat Nov 22 07:40:54 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 (1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547)\n1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547\nSat Nov 22 07:40:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 (1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547)\n1b312cdee0e5a6ac9c966fdf652670dfa8b5b3f325b290f14bec83e274f15547\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.896 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7379f9ff-cc90-4abd-9d1b-a1e815e0781a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.897 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd94b117-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:40:54 compute-0 kernel: tapcd94b117-d0: left promiscuous mode
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.899 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:54 compute-0 nova_compute[186544]: 2025-11-22 07:40:54.941 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.943 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b6964474-73e4-449a-82c5-75c697829353]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.964 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2c50e1a5-062a-4e3b-9003-149dea9b170a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.966 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c07fa20d-1f87-41ba-9fcf-f531a4510f77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.979 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3e866f3f-6d32-4026-ab0d-6aed5df844f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397996, 'reachable_time': 37796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213819, 'error': None, 'target': 'ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.987 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd94b117-ddd2-457a-a1e9-a1e03ac67322 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:40:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:40:54.988 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[9262b037-981d-4509-808d-fa5dc8d6e097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:40:54 compute-0 systemd[1]: run-netns-ovnmeta\x2dcd94b117\x2dddd2\x2d457a\x2da1e9\x2da1e03ac67322.mount: Deactivated successfully.
Nov 22 07:40:57 compute-0 nova_compute[186544]: 2025-11-22 07:40:57.234 186548 DEBUG nova.compute.manager [req-cddc38d9-6192-4799-a56c-840bd2910b4d req-72ee595c-6283-4700-8b23-6daaa4c23f92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received event network-vif-unplugged-ff14ba4b-c046-4907-aa37-9db05ed22278 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:40:57 compute-0 nova_compute[186544]: 2025-11-22 07:40:57.235 186548 DEBUG oslo_concurrency.lockutils [req-cddc38d9-6192-4799-a56c-840bd2910b4d req-72ee595c-6283-4700-8b23-6daaa4c23f92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:40:57 compute-0 nova_compute[186544]: 2025-11-22 07:40:57.235 186548 DEBUG oslo_concurrency.lockutils [req-cddc38d9-6192-4799-a56c-840bd2910b4d req-72ee595c-6283-4700-8b23-6daaa4c23f92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:40:57 compute-0 nova_compute[186544]: 2025-11-22 07:40:57.235 186548 DEBUG oslo_concurrency.lockutils [req-cddc38d9-6192-4799-a56c-840bd2910b4d req-72ee595c-6283-4700-8b23-6daaa4c23f92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:40:57 compute-0 nova_compute[186544]: 2025-11-22 07:40:57.235 186548 DEBUG nova.compute.manager [req-cddc38d9-6192-4799-a56c-840bd2910b4d req-72ee595c-6283-4700-8b23-6daaa4c23f92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] No waiting events found dispatching network-vif-unplugged-ff14ba4b-c046-4907-aa37-9db05ed22278 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:40:57 compute-0 nova_compute[186544]: 2025-11-22 07:40:57.236 186548 DEBUG nova.compute.manager [req-cddc38d9-6192-4799-a56c-840bd2910b4d req-72ee595c-6283-4700-8b23-6daaa4c23f92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received event network-vif-unplugged-ff14ba4b-c046-4907-aa37-9db05ed22278 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:40:57 compute-0 podman[213822]: 2025-11-22 07:40:57.400312063 +0000 UTC m=+0.053719127 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 07:40:59 compute-0 nova_compute[186544]: 2025-11-22 07:40:59.168 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:40:59 compute-0 nova_compute[186544]: 2025-11-22 07:40:59.808 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:00 compute-0 nova_compute[186544]: 2025-11-22 07:41:00.560 186548 DEBUG nova.network.neutron [-] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:41:00 compute-0 nova_compute[186544]: 2025-11-22 07:41:00.699 186548 INFO nova.compute.manager [-] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Took 5.83 seconds to deallocate network for instance.
Nov 22 07:41:00 compute-0 nova_compute[186544]: 2025-11-22 07:41:00.862 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:41:00 compute-0 nova_compute[186544]: 2025-11-22 07:41:00.862 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:41:00 compute-0 nova_compute[186544]: 2025-11-22 07:41:00.880 186548 DEBUG nova.compute.manager [req-b36f31e6-3ae6-46a9-9358-78de27b06e66 req-f6d4287d-196e-433d-a15d-f40c3ac6341d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received event network-vif-deleted-ff14ba4b-c046-4907-aa37-9db05ed22278 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.121 186548 DEBUG nova.compute.provider_tree [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.139 186548 DEBUG nova.scheduler.client.report [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:41:01 compute-0 podman[213841]: 2025-11-22 07:41:01.401156388 +0000 UTC m=+0.050269793 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.670 186548 DEBUG nova.compute.manager [req-30d99050-ad82-431b-b052-b8b31d12dda9 req-4d87050f-6da9-4cd7-b197-8df6c33d3a08 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received event network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.671 186548 DEBUG oslo_concurrency.lockutils [req-30d99050-ad82-431b-b052-b8b31d12dda9 req-4d87050f-6da9-4cd7-b197-8df6c33d3a08 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.671 186548 DEBUG oslo_concurrency.lockutils [req-30d99050-ad82-431b-b052-b8b31d12dda9 req-4d87050f-6da9-4cd7-b197-8df6c33d3a08 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.671 186548 DEBUG oslo_concurrency.lockutils [req-30d99050-ad82-431b-b052-b8b31d12dda9 req-4d87050f-6da9-4cd7-b197-8df6c33d3a08 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.671 186548 DEBUG nova.compute.manager [req-30d99050-ad82-431b-b052-b8b31d12dda9 req-4d87050f-6da9-4cd7-b197-8df6c33d3a08 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] No waiting events found dispatching network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.672 186548 WARNING nova.compute.manager [req-30d99050-ad82-431b-b052-b8b31d12dda9 req-4d87050f-6da9-4cd7-b197-8df6c33d3a08 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Received unexpected event network-vif-plugged-ff14ba4b-c046-4907-aa37-9db05ed22278 for instance with vm_state deleted and task_state None.
Nov 22 07:41:01 compute-0 nova_compute[186544]: 2025-11-22 07:41:01.767 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:02 compute-0 nova_compute[186544]: 2025-11-22 07:41:02.265 186548 INFO nova.scheduler.client.report [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Deleted allocations for instance 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c
Nov 22 07:41:02 compute-0 nova_compute[186544]: 2025-11-22 07:41:02.491 186548 DEBUG oslo_concurrency.lockutils [None req-9774a15d-4358-41ea-923e-4ad79a1789ac 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "3ed60b2a-f4cf-4b63-81bb-099f566d7e6c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:04 compute-0 nova_compute[186544]: 2025-11-22 07:41:04.171 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:04 compute-0 nova_compute[186544]: 2025-11-22 07:41:04.871 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:06 compute-0 podman[213865]: 2025-11-22 07:41:06.402142892 +0000 UTC m=+0.050694195 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 07:41:09 compute-0 nova_compute[186544]: 2025-11-22 07:41:09.149 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797254.1488695, 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:41:09 compute-0 nova_compute[186544]: 2025-11-22 07:41:09.150 186548 INFO nova.compute.manager [-] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] VM Stopped (Lifecycle Event)
Nov 22 07:41:09 compute-0 nova_compute[186544]: 2025-11-22 07:41:09.173 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:09 compute-0 nova_compute[186544]: 2025-11-22 07:41:09.182 186548 DEBUG nova.compute.manager [None req-0596274f-e0ea-4d1b-9972-5e0aef516e9f - - - - - -] [instance: 3ed60b2a-f4cf-4b63-81bb-099f566d7e6c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:41:09 compute-0 nova_compute[186544]: 2025-11-22 07:41:09.872 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:14 compute-0 nova_compute[186544]: 2025-11-22 07:41:14.176 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:14 compute-0 nova_compute[186544]: 2025-11-22 07:41:14.876 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:15 compute-0 podman[213886]: 2025-11-22 07:41:15.412322434 +0000 UTC m=+0.058522107 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:41:15 compute-0 podman[213887]: 2025-11-22 07:41:15.436014359 +0000 UTC m=+0.079347351 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 07:41:19 compute-0 nova_compute[186544]: 2025-11-22 07:41:19.180 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:19 compute-0 nova_compute[186544]: 2025-11-22 07:41:19.876 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:21 compute-0 nova_compute[186544]: 2025-11-22 07:41:21.303 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:21 compute-0 podman[213930]: 2025-11-22 07:41:21.397466252 +0000 UTC m=+0.050482218 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:41:24 compute-0 nova_compute[186544]: 2025-11-22 07:41:24.182 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:24 compute-0 podman[213954]: 2025-11-22 07:41:24.398941135 +0000 UTC m=+0.051608996 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:41:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:41:24.670 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:41:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:41:24.671 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:41:24 compute-0 nova_compute[186544]: 2025-11-22 07:41:24.671 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:24 compute-0 nova_compute[186544]: 2025-11-22 07:41:24.878 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:41:25.674 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.190 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.245 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:41:28 compute-0 podman[213974]: 2025-11-22 07:41:28.310949325 +0000 UTC m=+0.077184447 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.324 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.325 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.379 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.506 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.507 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5642MB free_disk=73.43434143066406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.507 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.508 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.586 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 5bdfb56f-9385-45bb-918d-0662112228b2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.587 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.587 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.620 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.631 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.747 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:41:28 compute-0 nova_compute[186544]: 2025-11-22 07:41:28.747 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:29 compute-0 nova_compute[186544]: 2025-11-22 07:41:29.184 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:29 compute-0 nova_compute[186544]: 2025-11-22 07:41:29.879 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.241 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "5bdfb56f-9385-45bb-918d-0662112228b2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.241 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "5bdfb56f-9385-45bb-918d-0662112228b2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.242 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "5bdfb56f-9385-45bb-918d-0662112228b2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.242 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "5bdfb56f-9385-45bb-918d-0662112228b2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.242 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "5bdfb56f-9385-45bb-918d-0662112228b2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.248 186548 INFO nova.compute.manager [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Terminating instance
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.255 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "refresh_cache-5bdfb56f-9385-45bb-918d-0662112228b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.255 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquired lock "refresh_cache-5bdfb56f-9385-45bb-918d-0662112228b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.255 186548 DEBUG nova.network.neutron [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.746 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.747 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:30 compute-0 nova_compute[186544]: 2025-11-22 07:41:30.747 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:41:31 compute-0 nova_compute[186544]: 2025-11-22 07:41:31.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:31 compute-0 nova_compute[186544]: 2025-11-22 07:41:31.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:41:31 compute-0 nova_compute[186544]: 2025-11-22 07:41:31.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:41:31 compute-0 nova_compute[186544]: 2025-11-22 07:41:31.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Nov 22 07:41:31 compute-0 nova_compute[186544]: 2025-11-22 07:41:31.182 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:41:32 compute-0 nova_compute[186544]: 2025-11-22 07:41:32.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:32 compute-0 nova_compute[186544]: 2025-11-22 07:41:32.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:32 compute-0 nova_compute[186544]: 2025-11-22 07:41:32.261 186548 DEBUG nova.network.neutron [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:41:32 compute-0 podman[214000]: 2025-11-22 07:41:32.401328591 +0000 UTC m=+0.053624056 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.072 186548 DEBUG nova.network.neutron [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.095 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Releasing lock "refresh_cache-5bdfb56f-9385-45bb-918d-0662112228b2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.096 186548 DEBUG nova.compute.manager [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:41:33 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 22 07:41:33 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 17.412s CPU time.
Nov 22 07:41:33 compute-0 systemd-machined[152872]: Machine qemu-1-instance-00000002 terminated.
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.338 186548 INFO nova.virt.libvirt.driver [-] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Instance destroyed successfully.
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.338 186548 DEBUG nova.objects.instance [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lazy-loading 'resources' on Instance uuid 5bdfb56f-9385-45bb-918d-0662112228b2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.365 186548 INFO nova.virt.libvirt.driver [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Deleting instance files /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2_del
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.366 186548 INFO nova.virt.libvirt.driver [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Deletion of /var/lib/nova/instances/5bdfb56f-9385-45bb-918d-0662112228b2_del complete
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.485 186548 INFO nova.compute.manager [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.486 186548 DEBUG oslo.service.loopingcall [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.486 186548 DEBUG nova.compute.manager [-] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.486 186548 DEBUG nova.network.neutron [-] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.762 186548 DEBUG nova.network.neutron [-] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.779 186548 DEBUG nova.network.neutron [-] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.811 186548 INFO nova.compute.manager [-] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Took 0.32 seconds to deallocate network for instance.
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.920 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:41:33 compute-0 nova_compute[186544]: 2025-11-22 07:41:33.921 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.019 186548 DEBUG nova.compute.provider_tree [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.043 186548 DEBUG nova.scheduler.client.report [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.075 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.128 186548 INFO nova.scheduler.client.report [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Deleted allocations for instance 5bdfb56f-9385-45bb-918d-0662112228b2
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.187 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.279 186548 DEBUG oslo_concurrency.lockutils [None req-068ee591-7d85-4507-8609-629fbe694782 12b223a79f8b4927861908eb11663fb5 98627e04b62e4ce4bf9650377c674f73 - - default default] Lock "5bdfb56f-9385-45bb-918d-0662112228b2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:34 compute-0 nova_compute[186544]: 2025-11-22 07:41:34.880 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:35 compute-0 nova_compute[186544]: 2025-11-22 07:41:35.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:37 compute-0 nova_compute[186544]: 2025-11-22 07:41:37.160 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:41:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:41:37.309 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:41:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:41:37.310 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:41:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:41:37.310 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:41:37 compute-0 podman[214035]: 2025-11-22 07:41:37.459089667 +0000 UTC m=+0.096322681 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=)
Nov 22 07:41:39 compute-0 nova_compute[186544]: 2025-11-22 07:41:39.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:39 compute-0 nova_compute[186544]: 2025-11-22 07:41:39.881 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:44 compute-0 nova_compute[186544]: 2025-11-22 07:41:44.215 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:44 compute-0 nova_compute[186544]: 2025-11-22 07:41:44.887 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:46 compute-0 podman[214057]: 2025-11-22 07:41:46.411184313 +0000 UTC m=+0.058768093 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 07:41:46 compute-0 podman[214058]: 2025-11-22 07:41:46.43699932 +0000 UTC m=+0.080299444 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:41:48 compute-0 nova_compute[186544]: 2025-11-22 07:41:48.337 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797293.336517, 5bdfb56f-9385-45bb-918d-0662112228b2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:41:48 compute-0 nova_compute[186544]: 2025-11-22 07:41:48.338 186548 INFO nova.compute.manager [-] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] VM Stopped (Lifecycle Event)
Nov 22 07:41:48 compute-0 nova_compute[186544]: 2025-11-22 07:41:48.407 186548 DEBUG nova.compute.manager [None req-eec10ccc-cf3c-4e42-bead-8f170f9a3bfc - - - - - -] [instance: 5bdfb56f-9385-45bb-918d-0662112228b2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:41:49 compute-0 nova_compute[186544]: 2025-11-22 07:41:49.219 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:49 compute-0 nova_compute[186544]: 2025-11-22 07:41:49.887 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:52 compute-0 podman[214104]: 2025-11-22 07:41:52.423141881 +0000 UTC m=+0.076969602 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:41:52 compute-0 ovn_controller[94843]: 2025-11-22T07:41:52Z|00035|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 22 07:41:54 compute-0 nova_compute[186544]: 2025-11-22 07:41:54.222 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:54 compute-0 rsyslogd[1008]: imjournal: 439 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Nov 22 07:41:54 compute-0 nova_compute[186544]: 2025-11-22 07:41:54.888 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:55 compute-0 podman[214128]: 2025-11-22 07:41:55.404624481 +0000 UTC m=+0.052908508 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 07:41:58 compute-0 podman[214148]: 2025-11-22 07:41:58.412495613 +0000 UTC m=+0.065168801 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:41:59 compute-0 nova_compute[186544]: 2025-11-22 07:41:59.225 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:41:59 compute-0 nova_compute[186544]: 2025-11-22 07:41:59.889 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:03 compute-0 podman[214168]: 2025-11-22 07:42:03.403058817 +0000 UTC m=+0.050827717 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:42:04 compute-0 nova_compute[186544]: 2025-11-22 07:42:04.228 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:04 compute-0 nova_compute[186544]: 2025-11-22 07:42:04.891 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:08 compute-0 podman[214192]: 2025-11-22 07:42:08.403693902 +0000 UTC m=+0.058028314 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-type=git)
Nov 22 07:42:09 compute-0 nova_compute[186544]: 2025-11-22 07:42:09.273 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:09 compute-0 nova_compute[186544]: 2025-11-22 07:42:09.892 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:14 compute-0 nova_compute[186544]: 2025-11-22 07:42:14.274 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:14 compute-0 nova_compute[186544]: 2025-11-22 07:42:14.894 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:15 compute-0 nova_compute[186544]: 2025-11-22 07:42:15.991 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:15 compute-0 nova_compute[186544]: 2025-11-22 07:42:15.991 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.166 186548 DEBUG nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.283 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.284 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.291 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.292 186548 INFO nova.compute.claims [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.490 186548 DEBUG nova.compute.provider_tree [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.548 186548 DEBUG nova.scheduler.client.report [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.670 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.671 186548 DEBUG nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.896 186548 DEBUG nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.897 186548 DEBUG nova.network.neutron [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.936 186548 INFO nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:42:16 compute-0 nova_compute[186544]: 2025-11-22 07:42:16.988 186548 DEBUG nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.304 186548 DEBUG nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.305 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.305 186548 INFO nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Creating image(s)
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.305 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "/var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.306 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "/var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.306 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "/var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.319 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.378 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.379 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.379 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.390 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:42:17 compute-0 podman[214214]: 2025-11-22 07:42:17.397074218 +0000 UTC m=+0.048190281 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 07:42:17 compute-0 podman[214215]: 2025-11-22 07:42:17.43679408 +0000 UTC m=+0.080679005 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.455 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.456 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.508 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.508 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.509 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.560 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.561 186548 DEBUG nova.virt.disk.api [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Checking if we can resize image /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.561 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.611 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.612 186548 DEBUG nova.virt.disk.api [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Cannot resize image /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.613 186548 DEBUG nova.objects.instance [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lazy-loading 'migration_context' on Instance uuid e6c2518b-54f7-4cfa-99bf-c237bff88edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.619 186548 DEBUG nova.network.neutron [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.619 186548 DEBUG nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.631 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.631 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Ensure instance console log exists: /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.632 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.632 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.632 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.634 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.636 186548 WARNING nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.642 186548 DEBUG nova.virt.libvirt.host [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.642 186548 DEBUG nova.virt.libvirt.host [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.646 186548 DEBUG nova.virt.libvirt.host [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.647 186548 DEBUG nova.virt.libvirt.host [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.648 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.648 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.648 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.649 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.649 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.649 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.649 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.649 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.649 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.650 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.650 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.650 186548 DEBUG nova.virt.hardware [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.653 186548 DEBUG nova.objects.instance [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lazy-loading 'pci_devices' on Instance uuid e6c2518b-54f7-4cfa-99bf-c237bff88edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.676 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <uuid>e6c2518b-54f7-4cfa-99bf-c237bff88edb</uuid>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <name>instance-0000000c</name>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerExternalEventsTest-server-1949983540</nova:name>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:42:17</nova:creationTime>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:42:17 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:42:17 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:42:17 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:42:17 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:42:17 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:42:17 compute-0 nova_compute[186544]:         <nova:user uuid="0a9d3bc3192649f9aa6e6f5155fec180">tempest-ServerExternalEventsTest-111434087-project-member</nova:user>
Nov 22 07:42:17 compute-0 nova_compute[186544]:         <nova:project uuid="5ea490c83a804eebbee19b8d539fbb4d">tempest-ServerExternalEventsTest-111434087</nova:project>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <system>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <entry name="serial">e6c2518b-54f7-4cfa-99bf-c237bff88edb</entry>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <entry name="uuid">e6c2518b-54f7-4cfa-99bf-c237bff88edb</entry>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     </system>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <os>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   </os>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <features>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   </features>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk.config"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/console.log" append="off"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <video>
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     </video>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:42:17 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:42:17 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:42:17 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:42:17 compute-0 nova_compute[186544]: </domain>
Nov 22 07:42:17 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.744 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.745 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:42:17 compute-0 nova_compute[186544]: 2025-11-22 07:42:17.745 186548 INFO nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Using config drive
Nov 22 07:42:18 compute-0 nova_compute[186544]: 2025-11-22 07:42:18.659 186548 INFO nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Creating config drive at /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk.config
Nov 22 07:42:18 compute-0 nova_compute[186544]: 2025-11-22 07:42:18.663 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7l2cwmoq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:42:18 compute-0 nova_compute[186544]: 2025-11-22 07:42:18.785 186548 DEBUG oslo_concurrency.processutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7l2cwmoq" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:42:18 compute-0 systemd-machined[152872]: New machine qemu-4-instance-0000000c.
Nov 22 07:42:18 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-0000000c.
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.260 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797339.2605317, e6c2518b-54f7-4cfa-99bf-c237bff88edb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.261 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] VM Resumed (Lifecycle Event)
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.264 186548 DEBUG nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.264 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.267 186548 INFO nova.virt.libvirt.driver [-] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Instance spawned successfully.
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.267 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.277 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.283 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.289 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.291 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.292 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.292 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.292 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.293 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.293 186548 DEBUG nova.virt.libvirt.driver [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.337 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.337 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797339.2615614, e6c2518b-54f7-4cfa-99bf-c237bff88edb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.337 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] VM Started (Lifecycle Event)
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.384 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.388 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.420 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.435 186548 INFO nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Took 2.13 seconds to spawn the instance on the hypervisor.
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.435 186548 DEBUG nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.570 186548 INFO nova.compute.manager [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Took 3.33 seconds to build instance.
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.631 186548 DEBUG oslo_concurrency.lockutils [None req-d822c82a-94c2-4f1c-848b-52ca56701005 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:19 compute-0 nova_compute[186544]: 2025-11-22 07:42:19.924 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:21 compute-0 nova_compute[186544]: 2025-11-22 07:42:21.555 186548 DEBUG nova.compute.manager [None req-c7e3f54a-09a1-477e-a58f-c3ba2c2a1192 98a1442b9a404b7f920e321cdd780eb2 d8f18ef45514429b89386002ff34c659 - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:42:21 compute-0 nova_compute[186544]: 2025-11-22 07:42:21.555 186548 DEBUG nova.compute.manager [None req-c7e3f54a-09a1-477e-a58f-c3ba2c2a1192 98a1442b9a404b7f920e321cdd780eb2 d8f18ef45514429b89386002ff34c659 - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:42:21 compute-0 nova_compute[186544]: 2025-11-22 07:42:21.556 186548 DEBUG oslo_concurrency.lockutils [None req-c7e3f54a-09a1-477e-a58f-c3ba2c2a1192 98a1442b9a404b7f920e321cdd780eb2 d8f18ef45514429b89386002ff34c659 - - default default] Acquiring lock "refresh_cache-e6c2518b-54f7-4cfa-99bf-c237bff88edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:42:21 compute-0 nova_compute[186544]: 2025-11-22 07:42:21.556 186548 DEBUG oslo_concurrency.lockutils [None req-c7e3f54a-09a1-477e-a58f-c3ba2c2a1192 98a1442b9a404b7f920e321cdd780eb2 d8f18ef45514429b89386002ff34c659 - - default default] Acquired lock "refresh_cache-e6c2518b-54f7-4cfa-99bf-c237bff88edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:42:21 compute-0 nova_compute[186544]: 2025-11-22 07:42:21.556 186548 DEBUG nova.network.neutron [None req-c7e3f54a-09a1-477e-a58f-c3ba2c2a1192 98a1442b9a404b7f920e321cdd780eb2 d8f18ef45514429b89386002ff34c659 - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:42:22 compute-0 nova_compute[186544]: 2025-11-22 07:42:22.114 186548 DEBUG nova.network.neutron [None req-c7e3f54a-09a1-477e-a58f-c3ba2c2a1192 98a1442b9a404b7f920e321cdd780eb2 d8f18ef45514429b89386002ff34c659 - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:42:22 compute-0 nova_compute[186544]: 2025-11-22 07:42:22.455 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:22 compute-0 nova_compute[186544]: 2025-11-22 07:42:22.455 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:22 compute-0 nova_compute[186544]: 2025-11-22 07:42:22.456 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:22 compute-0 nova_compute[186544]: 2025-11-22 07:42:22.456 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:22 compute-0 nova_compute[186544]: 2025-11-22 07:42:22.456 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:22 compute-0 nova_compute[186544]: 2025-11-22 07:42:22.463 186548 INFO nova.compute.manager [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Terminating instance
Nov 22 07:42:22 compute-0 nova_compute[186544]: 2025-11-22 07:42:22.469 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "refresh_cache-e6c2518b-54f7-4cfa-99bf-c237bff88edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:42:23 compute-0 nova_compute[186544]: 2025-11-22 07:42:23.139 186548 DEBUG nova.network.neutron [None req-c7e3f54a-09a1-477e-a58f-c3ba2c2a1192 98a1442b9a404b7f920e321cdd780eb2 d8f18ef45514429b89386002ff34c659 - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:42:23 compute-0 nova_compute[186544]: 2025-11-22 07:42:23.179 186548 DEBUG oslo_concurrency.lockutils [None req-c7e3f54a-09a1-477e-a58f-c3ba2c2a1192 98a1442b9a404b7f920e321cdd780eb2 d8f18ef45514429b89386002ff34c659 - - default default] Releasing lock "refresh_cache-e6c2518b-54f7-4cfa-99bf-c237bff88edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:42:23 compute-0 nova_compute[186544]: 2025-11-22 07:42:23.180 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquired lock "refresh_cache-e6c2518b-54f7-4cfa-99bf-c237bff88edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:42:23 compute-0 nova_compute[186544]: 2025-11-22 07:42:23.180 186548 DEBUG nova.network.neutron [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:42:23 compute-0 podman[214305]: 2025-11-22 07:42:23.398188599 +0000 UTC m=+0.049957055 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 07:42:23 compute-0 nova_compute[186544]: 2025-11-22 07:42:23.945 186548 DEBUG nova.network.neutron [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:42:24 compute-0 nova_compute[186544]: 2025-11-22 07:42:24.279 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:24 compute-0 nova_compute[186544]: 2025-11-22 07:42:24.926 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:24 compute-0 nova_compute[186544]: 2025-11-22 07:42:24.991 186548 DEBUG nova.network.neutron [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.033 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Releasing lock "refresh_cache-e6c2518b-54f7-4cfa-99bf-c237bff88edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.033 186548 DEBUG nova.compute.manager [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:42:25 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 22 07:42:25 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Consumed 6.195s CPU time.
Nov 22 07:42:25 compute-0 systemd-machined[152872]: Machine qemu-4-instance-0000000c terminated.
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.270 186548 INFO nova.virt.libvirt.driver [-] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Instance destroyed successfully.
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.271 186548 DEBUG nova.objects.instance [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lazy-loading 'resources' on Instance uuid e6c2518b-54f7-4cfa-99bf-c237bff88edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.282 186548 INFO nova.virt.libvirt.driver [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Deleting instance files /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb_del
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.283 186548 INFO nova.virt.libvirt.driver [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Deletion of /var/lib/nova/instances/e6c2518b-54f7-4cfa-99bf-c237bff88edb_del complete
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.372 186548 INFO nova.compute.manager [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.372 186548 DEBUG oslo.service.loopingcall [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.373 186548 DEBUG nova.compute.manager [-] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.373 186548 DEBUG nova.network.neutron [-] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.949 186548 DEBUG nova.network.neutron [-] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.962 186548 DEBUG nova.network.neutron [-] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:42:25 compute-0 nova_compute[186544]: 2025-11-22 07:42:25.985 186548 INFO nova.compute.manager [-] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Took 0.61 seconds to deallocate network for instance.
Nov 22 07:42:26 compute-0 nova_compute[186544]: 2025-11-22 07:42:26.098 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:26 compute-0 nova_compute[186544]: 2025-11-22 07:42:26.099 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:26 compute-0 nova_compute[186544]: 2025-11-22 07:42:26.197 186548 DEBUG nova.compute.provider_tree [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:42:26 compute-0 podman[214340]: 2025-11-22 07:42:26.409136772 +0000 UTC m=+0.056215894 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 07:42:26 compute-0 nova_compute[186544]: 2025-11-22 07:42:26.481 186548 DEBUG nova.scheduler.client.report [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:42:26 compute-0 nova_compute[186544]: 2025-11-22 07:42:26.539 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:26 compute-0 nova_compute[186544]: 2025-11-22 07:42:26.611 186548 INFO nova.scheduler.client.report [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Deleted allocations for instance e6c2518b-54f7-4cfa-99bf-c237bff88edb
Nov 22 07:42:26 compute-0 nova_compute[186544]: 2025-11-22 07:42:26.706 186548 DEBUG oslo_concurrency.lockutils [None req-430fa037-246a-441f-95a1-81145fb84c0d 0a9d3bc3192649f9aa6e6f5155fec180 5ea490c83a804eebbee19b8d539fbb4d - - default default] Lock "e6c2518b-54f7-4cfa-99bf-c237bff88edb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.183 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.184 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.343 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.344 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5774MB free_disk=73.46304702758789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.345 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.345 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.628 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.628 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.645 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.659 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.704 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:42:28 compute-0 nova_compute[186544]: 2025-11-22 07:42:28.704 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:29 compute-0 nova_compute[186544]: 2025-11-22 07:42:29.282 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:29 compute-0 podman[214360]: 2025-11-22 07:42:29.399093567 +0000 UTC m=+0.049493929 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 22 07:42:29 compute-0 nova_compute[186544]: 2025-11-22 07:42:29.927 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:31 compute-0 nova_compute[186544]: 2025-11-22 07:42:31.705 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:31 compute-0 nova_compute[186544]: 2025-11-22 07:42:31.705 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:31 compute-0 nova_compute[186544]: 2025-11-22 07:42:31.705 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:42:32 compute-0 nova_compute[186544]: 2025-11-22 07:42:32.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:33 compute-0 nova_compute[186544]: 2025-11-22 07:42:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:33 compute-0 nova_compute[186544]: 2025-11-22 07:42:33.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:42:33 compute-0 nova_compute[186544]: 2025-11-22 07:42:33.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:42:33 compute-0 nova_compute[186544]: 2025-11-22 07:42:33.188 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:42:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:42:34.023 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:42:34 compute-0 nova_compute[186544]: 2025-11-22 07:42:34.024 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:42:34.024 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:42:34 compute-0 nova_compute[186544]: 2025-11-22 07:42:34.184 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:34 compute-0 nova_compute[186544]: 2025-11-22 07:42:34.283 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:34 compute-0 podman[214380]: 2025-11-22 07:42:34.404865007 +0000 UTC m=+0.058764428 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:42:34 compute-0 nova_compute[186544]: 2025-11-22 07:42:34.929 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:35 compute-0 nova_compute[186544]: 2025-11-22 07:42:35.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:36 compute-0 nova_compute[186544]: 2025-11-22 07:42:36.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:36 compute-0 nova_compute[186544]: 2025-11-22 07:42:36.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:42:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:42:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:42:37.311 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:42:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:42:37.311 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:42:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:42:37.311 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:42:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:42:39.026 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:42:39 compute-0 nova_compute[186544]: 2025-11-22 07:42:39.286 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:39 compute-0 podman[214404]: 2025-11-22 07:42:39.395063303 +0000 UTC m=+0.049501299 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 07:42:39 compute-0 nova_compute[186544]: 2025-11-22 07:42:39.930 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:40 compute-0 nova_compute[186544]: 2025-11-22 07:42:40.270 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797345.268605, e6c2518b-54f7-4cfa-99bf-c237bff88edb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:42:40 compute-0 nova_compute[186544]: 2025-11-22 07:42:40.271 186548 INFO nova.compute.manager [-] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] VM Stopped (Lifecycle Event)
Nov 22 07:42:40 compute-0 nova_compute[186544]: 2025-11-22 07:42:40.305 186548 DEBUG nova.compute.manager [None req-355d6bf7-b5bc-4348-a266-3cf074f5a9bd - - - - - -] [instance: e6c2518b-54f7-4cfa-99bf-c237bff88edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:42:43 compute-0 ovn_controller[94843]: 2025-11-22T07:42:43Z|00036|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 07:42:44 compute-0 nova_compute[186544]: 2025-11-22 07:42:44.288 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:44 compute-0 nova_compute[186544]: 2025-11-22 07:42:44.932 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:47 compute-0 nova_compute[186544]: 2025-11-22 07:42:47.795 186548 DEBUG oslo_concurrency.processutils [None req-2f03d2cb-b34c-488e-b2c3-5e51a1136aa8 0ae8171360de485c9542d048ca53f706 2619d1f3d21c4828bf1dc9e23586c45f - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:42:47 compute-0 nova_compute[186544]: 2025-11-22 07:42:47.827 186548 DEBUG oslo_concurrency.processutils [None req-2f03d2cb-b34c-488e-b2c3-5e51a1136aa8 0ae8171360de485c9542d048ca53f706 2619d1f3d21c4828bf1dc9e23586c45f - - default default] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:42:48 compute-0 podman[214427]: 2025-11-22 07:42:48.414068963 +0000 UTC m=+0.058295036 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 22 07:42:48 compute-0 podman[214428]: 2025-11-22 07:42:48.449233469 +0000 UTC m=+0.089036863 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:42:49 compute-0 nova_compute[186544]: 2025-11-22 07:42:49.289 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:49 compute-0 nova_compute[186544]: 2025-11-22 07:42:49.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:54 compute-0 nova_compute[186544]: 2025-11-22 07:42:54.292 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:54 compute-0 podman[214471]: 2025-11-22 07:42:54.398527691 +0000 UTC m=+0.047045869 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 07:42:54 compute-0 nova_compute[186544]: 2025-11-22 07:42:54.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:57 compute-0 podman[214495]: 2025-11-22 07:42:57.393494418 +0000 UTC m=+0.048384462 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 07:42:59 compute-0 nova_compute[186544]: 2025-11-22 07:42:59.295 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:42:59 compute-0 nova_compute[186544]: 2025-11-22 07:42:59.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:00 compute-0 podman[214514]: 2025-11-22 07:43:00.408436635 +0000 UTC m=+0.057535907 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:43:04 compute-0 nova_compute[186544]: 2025-11-22 07:43:04.329 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:04 compute-0 nova_compute[186544]: 2025-11-22 07:43:04.938 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.105 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.105 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.138 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.254 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.254 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.270 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.271 186548 INFO nova.compute.claims [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:43:05 compute-0 podman[214534]: 2025-11-22 07:43:05.448084829 +0000 UTC m=+0.089457383 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.494 186548 DEBUG nova.compute.provider_tree [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.519 186548 DEBUG nova.scheduler.client.report [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.550 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.551 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.620 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.621 186548 DEBUG nova.network.neutron [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.661 186548 INFO nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.704 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.859 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.861 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.861 186548 INFO nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating image(s)
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.862 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.862 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.863 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.876 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.941 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.942 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.943 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:05 compute-0 nova_compute[186544]: 2025-11-22 07:43:05.955 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.009 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.010 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.054 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.055 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.056 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.112 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.113 186548 DEBUG nova.virt.disk.api [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Checking if we can resize image /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.114 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.168 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.169 186548 DEBUG nova.virt.disk.api [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Cannot resize image /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.169 186548 DEBUG nova.objects.instance [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.183 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.184 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Ensure instance console log exists: /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.184 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.184 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.185 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:06 compute-0 nova_compute[186544]: 2025-11-22 07:43:06.627 186548 DEBUG nova.policy [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:43:07 compute-0 nova_compute[186544]: 2025-11-22 07:43:07.633 186548 DEBUG nova.network.neutron [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Successfully created port: cc51d9a0-e170-42eb-b8db-2910ea320cb4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:43:09 compute-0 nova_compute[186544]: 2025-11-22 07:43:09.331 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:09 compute-0 nova_compute[186544]: 2025-11-22 07:43:09.940 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:10 compute-0 podman[214571]: 2025-11-22 07:43:10.402921145 +0000 UTC m=+0.053114378 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal)
Nov 22 07:43:10 compute-0 nova_compute[186544]: 2025-11-22 07:43:10.420 186548 DEBUG nova.network.neutron [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Successfully updated port: cc51d9a0-e170-42eb-b8db-2910ea320cb4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:43:10 compute-0 nova_compute[186544]: 2025-11-22 07:43:10.449 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:43:10 compute-0 nova_compute[186544]: 2025-11-22 07:43:10.449 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquired lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:43:10 compute-0 nova_compute[186544]: 2025-11-22 07:43:10.450 186548 DEBUG nova.network.neutron [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:43:10 compute-0 nova_compute[186544]: 2025-11-22 07:43:10.601 186548 DEBUG nova.compute.manager [req-6a8bc638-8857-4981-afc7-47aaac7db9d3 req-ef23a906-ee08-4c1f-802f-c628f0cb1424 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-changed-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:43:10 compute-0 nova_compute[186544]: 2025-11-22 07:43:10.601 186548 DEBUG nova.compute.manager [req-6a8bc638-8857-4981-afc7-47aaac7db9d3 req-ef23a906-ee08-4c1f-802f-c628f0cb1424 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Refreshing instance network info cache due to event network-changed-cc51d9a0-e170-42eb-b8db-2910ea320cb4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:43:10 compute-0 nova_compute[186544]: 2025-11-22 07:43:10.602 186548 DEBUG oslo_concurrency.lockutils [req-6a8bc638-8857-4981-afc7-47aaac7db9d3 req-ef23a906-ee08-4c1f-802f-c628f0cb1424 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:43:10 compute-0 nova_compute[186544]: 2025-11-22 07:43:10.782 186548 DEBUG nova.network.neutron [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.507 186548 DEBUG nova.network.neutron [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Updating instance_info_cache with network_info: [{"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.539 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Releasing lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.539 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance network_info: |[{"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.540 186548 DEBUG oslo_concurrency.lockutils [req-6a8bc638-8857-4981-afc7-47aaac7db9d3 req-ef23a906-ee08-4c1f-802f-c628f0cb1424 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.540 186548 DEBUG nova.network.neutron [req-6a8bc638-8857-4981-afc7-47aaac7db9d3 req-ef23a906-ee08-4c1f-802f-c628f0cb1424 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Refreshing network info cache for port cc51d9a0-e170-42eb-b8db-2910ea320cb4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.543 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Start _get_guest_xml network_info=[{"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.547 186548 WARNING nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.551 186548 DEBUG nova.virt.libvirt.host [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.551 186548 DEBUG nova.virt.libvirt.host [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.554 186548 DEBUG nova.virt.libvirt.host [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.554 186548 DEBUG nova.virt.libvirt.host [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.555 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.556 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.556 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.556 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.556 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.557 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.557 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.557 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.557 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.558 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.558 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.558 186548 DEBUG nova.virt.hardware [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.561 186548 DEBUG nova.virt.libvirt.vif [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-18431198
68-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:05Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.561 186548 DEBUG nova.network.os_vif_util [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.562 186548 DEBUG nova.network.os_vif_util [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.563 186548 DEBUG nova.objects.instance [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.591 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <uuid>6d263548-4cc6-463b-b26b-cb43b0d069cd</uuid>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <name>instance-0000000d</name>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersAdminTestJSON-server-174353263</nova:name>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:43:13</nova:creationTime>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:43:13 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:43:13 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:43:13 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:43:13 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:43:13 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:43:13 compute-0 nova_compute[186544]:         <nova:user uuid="7c0fb56fc41e44dfa23a0d45149e78e3">tempest-ServersAdminTestJSON-1843119868-project-member</nova:user>
Nov 22 07:43:13 compute-0 nova_compute[186544]:         <nova:project uuid="9b004cb06df74de2903dae19345fd9c7">tempest-ServersAdminTestJSON-1843119868</nova:project>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:43:13 compute-0 nova_compute[186544]:         <nova:port uuid="cc51d9a0-e170-42eb-b8db-2910ea320cb4">
Nov 22 07:43:13 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <system>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <entry name="serial">6d263548-4cc6-463b-b26b-cb43b0d069cd</entry>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <entry name="uuid">6d263548-4cc6-463b-b26b-cb43b0d069cd</entry>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </system>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <os>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   </os>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <features>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   </features>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:88:5c:3d"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <target dev="tapcc51d9a0-e1"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/console.log" append="off"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <video>
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </video>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:43:13 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:43:13 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:43:13 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:43:13 compute-0 nova_compute[186544]: </domain>
Nov 22 07:43:13 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.593 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Preparing to wait for external event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.593 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.593 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.593 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.594 186548 DEBUG nova.virt.libvirt.vif [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSO
N-1843119868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:05Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.594 186548 DEBUG nova.network.os_vif_util [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.595 186548 DEBUG nova.network.os_vif_util [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.595 186548 DEBUG os_vif [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.596 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.596 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.596 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.599 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.599 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc51d9a0-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.599 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc51d9a0-e1, col_values=(('external_ids', {'iface-id': 'cc51d9a0-e170-42eb-b8db-2910ea320cb4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:5c:3d', 'vm-uuid': '6d263548-4cc6-463b-b26b-cb43b0d069cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.600 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:13 compute-0 NetworkManager[55036]: <info>  [1763797393.6017] manager: (tapcc51d9a0-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.603 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.608 186548 INFO os_vif [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1')
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.675 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.676 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.676 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No VIF found with MAC fa:16:3e:88:5c:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:43:13 compute-0 nova_compute[186544]: 2025-11-22 07:43:13.677 186548 INFO nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Using config drive
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.516 186548 INFO nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating config drive at /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.521 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk3ce2ptj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.643 186548 DEBUG oslo_concurrency.processutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk3ce2ptj" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:14 compute-0 kernel: tapcc51d9a0-e1: entered promiscuous mode
Nov 22 07:43:14 compute-0 NetworkManager[55036]: <info>  [1763797394.7039] manager: (tapcc51d9a0-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.704 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:14 compute-0 ovn_controller[94843]: 2025-11-22T07:43:14Z|00037|binding|INFO|Claiming lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 for this chassis.
Nov 22 07:43:14 compute-0 ovn_controller[94843]: 2025-11-22T07:43:14Z|00038|binding|INFO|cc51d9a0-e170-42eb-b8db-2910ea320cb4: Claiming fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.709 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.713 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.725 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:5c:3d 10.100.0.13'], port_security=['fa:16:3e:88:5c:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc51d9a0-e170-42eb-b8db-2910ea320cb4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.727 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc51d9a0-e170-42eb-b8db-2910ea320cb4 in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 bound to our chassis
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.728 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:43:14 compute-0 systemd-machined[152872]: New machine qemu-5-instance-0000000d.
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.738 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf03a7e-528b-4c55-a31f-d8f340a75df0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.740 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7ba1c27-61 in ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.741 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7ba1c27-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.742 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[84a7de71-2032-4c3b-8e94-e6f62e48675c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.742 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[40267fe3-cb1c-43a2-9206-34bf18ee5de7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.753 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[7b381e57-101d-489b-89cd-f451b7e9931f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.764 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:14 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-0000000d.
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.766 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4efdb3-8c90-4929-9077-30222da28fb7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_controller[94843]: 2025-11-22T07:43:14Z|00039|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 ovn-installed in OVS
Nov 22 07:43:14 compute-0 ovn_controller[94843]: 2025-11-22T07:43:14Z|00040|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 up in Southbound
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.769 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:14 compute-0 systemd-udevd[214616]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:43:14 compute-0 NetworkManager[55036]: <info>  [1763797394.7920] device (tapcc51d9a0-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:43:14 compute-0 NetworkManager[55036]: <info>  [1763797394.7931] device (tapcc51d9a0-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.801 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[93a02552-61e8-4655-b83b-2cc0a8d261b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.806 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[aac02760-f998-4cc5-9906-55c385edfdcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 NetworkManager[55036]: <info>  [1763797394.8077] manager: (tapd7ba1c27-60): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.832 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1a208b73-e6f2-414d-ae91-105b95919a04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.836 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[28de3372-eb82-4978-bcc9-a0e16b023286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 NetworkManager[55036]: <info>  [1763797394.8570] device (tapd7ba1c27-60): carrier: link connected
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.860 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4297091b-76d6-4f9a-bfe7-5100527f22bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.877 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d44776c6-a65a-44c1-b756-7b83fe502bd2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413549, 'reachable_time': 24347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214645, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.892 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c463414-b39c-4e30-b665-3ed60323a74c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:37eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413549, 'tstamp': 413549}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214646, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.907 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f751de5b-ea11-4b11-86c4-8324af92aaef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413549, 'reachable_time': 24347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214647, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.941 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.942 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2e976aff-2984-42aa-9beb-abba57067071]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.994 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9672c087-cb45-4899-86fe-0657c39c1912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.996 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.996 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:43:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:14.996 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:14 compute-0 nova_compute[186544]: 2025-11-22 07:43:14.998 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:14 compute-0 kernel: tapd7ba1c27-60: entered promiscuous mode
Nov 22 07:43:14 compute-0 NetworkManager[55036]: <info>  [1763797394.9992] manager: (tapd7ba1c27-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.001 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:15.003 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:15 compute-0 ovn_controller[94843]: 2025-11-22T07:43:15Z|00041|binding|INFO|Releasing lport 3c20001c-28e2-4cdd-9a7c-497ed470b31c from this chassis (sb_readonly=0)
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.006 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:15.007 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:15.007 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[22e592d4-56d2-46c2-bfb9-35948026f8ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:15.008 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/d7ba1c27-6255-4c71-8e98-23a1c59b5723.pid.haproxy
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:43:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:15.009 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'env', 'PROCESS_TAG=haproxy-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7ba1c27-6255-4c71-8e98-23a1c59b5723.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.017 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.031 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797395.028472, 6d263548-4cc6-463b-b26b-cb43b0d069cd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.032 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] VM Started (Lifecycle Event)
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.056 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.060 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797395.0288885, 6d263548-4cc6-463b-b26b-cb43b0d069cd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.061 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] VM Paused (Lifecycle Event)
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.079 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.082 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.115 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.198 186548 DEBUG nova.compute.manager [req-d420693c-5292-446b-909e-4f59e9b77ded req-0486a84f-c900-4e62-be5e-d0ed03b1bf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.198 186548 DEBUG oslo_concurrency.lockutils [req-d420693c-5292-446b-909e-4f59e9b77ded req-0486a84f-c900-4e62-be5e-d0ed03b1bf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.199 186548 DEBUG oslo_concurrency.lockutils [req-d420693c-5292-446b-909e-4f59e9b77ded req-0486a84f-c900-4e62-be5e-d0ed03b1bf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.199 186548 DEBUG oslo_concurrency.lockutils [req-d420693c-5292-446b-909e-4f59e9b77ded req-0486a84f-c900-4e62-be5e-d0ed03b1bf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.199 186548 DEBUG nova.compute.manager [req-d420693c-5292-446b-909e-4f59e9b77ded req-0486a84f-c900-4e62-be5e-d0ed03b1bf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Processing event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.201 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.204 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.205 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797395.2044418, 6d263548-4cc6-463b-b26b-cb43b0d069cd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.205 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] VM Resumed (Lifecycle Event)
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.211 186548 INFO nova.virt.libvirt.driver [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance spawned successfully.
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.212 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.236 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.242 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.249 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.250 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.250 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.251 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.252 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.252 186548 DEBUG nova.virt.libvirt.driver [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.283 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.355 186548 INFO nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Took 9.50 seconds to spawn the instance on the hypervisor.
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.356 186548 DEBUG nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:15 compute-0 podman[214686]: 2025-11-22 07:43:15.399757764 +0000 UTC m=+0.063160545 container create 91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 07:43:15 compute-0 systemd[1]: Started libpod-conmon-91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e.scope.
Nov 22 07:43:15 compute-0 podman[214686]: 2025-11-22 07:43:15.359007491 +0000 UTC m=+0.022410292 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:43:15 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:43:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d07bed9407075b12baa35fa2a97afa90891c73260439011ba6296986437f08/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.493 186548 INFO nova.compute.manager [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Took 10.30 seconds to build instance.
Nov 22 07:43:15 compute-0 podman[214686]: 2025-11-22 07:43:15.49422268 +0000 UTC m=+0.157625491 container init 91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:43:15 compute-0 podman[214686]: 2025-11-22 07:43:15.500550055 +0000 UTC m=+0.163952846 container start 91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 07:43:15 compute-0 nova_compute[186544]: 2025-11-22 07:43:15.517 186548 DEBUG oslo_concurrency.lockutils [None req-b551398d-f6c2-4e2a-855c-1afa19899b15 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:15 compute-0 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[214700]: [NOTICE]   (214704) : New worker (214706) forked
Nov 22 07:43:15 compute-0 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[214700]: [NOTICE]   (214704) : Loading success.
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.028 186548 DEBUG nova.network.neutron [req-6a8bc638-8857-4981-afc7-47aaac7db9d3 req-ef23a906-ee08-4c1f-802f-c628f0cb1424 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Updated VIF entry in instance network info cache for port cc51d9a0-e170-42eb-b8db-2910ea320cb4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.029 186548 DEBUG nova.network.neutron [req-6a8bc638-8857-4981-afc7-47aaac7db9d3 req-ef23a906-ee08-4c1f-802f-c628f0cb1424 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Updating instance_info_cache with network_info: [{"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.049 186548 DEBUG oslo_concurrency.lockutils [req-6a8bc638-8857-4981-afc7-47aaac7db9d3 req-ef23a906-ee08-4c1f-802f-c628f0cb1424 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.539 186548 DEBUG nova.compute.manager [req-92de87c3-fdc3-406b-8df9-553c1d71d784 req-788b020c-39e1-46aa-9c56-cc5b71db540c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.540 186548 DEBUG oslo_concurrency.lockutils [req-92de87c3-fdc3-406b-8df9-553c1d71d784 req-788b020c-39e1-46aa-9c56-cc5b71db540c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.541 186548 DEBUG oslo_concurrency.lockutils [req-92de87c3-fdc3-406b-8df9-553c1d71d784 req-788b020c-39e1-46aa-9c56-cc5b71db540c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.541 186548 DEBUG oslo_concurrency.lockutils [req-92de87c3-fdc3-406b-8df9-553c1d71d784 req-788b020c-39e1-46aa-9c56-cc5b71db540c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.541 186548 DEBUG nova.compute.manager [req-92de87c3-fdc3-406b-8df9-553c1d71d784 req-788b020c-39e1-46aa-9c56-cc5b71db540c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:43:17 compute-0 nova_compute[186544]: 2025-11-22 07:43:17.542 186548 WARNING nova.compute.manager [req-92de87c3-fdc3-406b-8df9-553c1d71d784 req-788b020c-39e1-46aa-9c56-cc5b71db540c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state active and task_state None.
Nov 22 07:43:18 compute-0 nova_compute[186544]: 2025-11-22 07:43:18.601 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:19 compute-0 podman[214715]: 2025-11-22 07:43:19.427712347 +0000 UTC m=+0.074490064 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 07:43:19 compute-0 podman[214716]: 2025-11-22 07:43:19.480071756 +0000 UTC m=+0.122954127 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 07:43:19 compute-0 nova_compute[186544]: 2025-11-22 07:43:19.942 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:23 compute-0 nova_compute[186544]: 2025-11-22 07:43:23.604 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:24 compute-0 nova_compute[186544]: 2025-11-22 07:43:24.944 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:25 compute-0 podman[214762]: 2025-11-22 07:43:25.400010807 +0000 UTC m=+0.045918121 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:43:26 compute-0 nova_compute[186544]: 2025-11-22 07:43:26.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:26 compute-0 nova_compute[186544]: 2025-11-22 07:43:26.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 07:43:26 compute-0 nova_compute[186544]: 2025-11-22 07:43:26.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.181 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.218 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.219 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.219 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.219 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:43:28 compute-0 podman[214798]: 2025-11-22 07:43:28.288197636 +0000 UTC m=+0.075414737 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.295 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.358 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.359 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.422 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.563 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.565 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5588MB free_disk=73.43603897094727GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.565 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.566 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.874 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 6d263548-4cc6-463b-b26b-cb43b0d069cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.895 186548 INFO nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 36d5f234-9baf-48b6-a565-430378fe4068 has allocations against this compute host but is not found in the database.
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.895 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:43:28 compute-0 nova_compute[186544]: 2025-11-22 07:43:28.895 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.162 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.362 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.418 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.418 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.584 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.585 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.619 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.735 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.735 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.746 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.746 186548 INFO nova.compute.claims [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.883 186548 DEBUG nova.compute.provider_tree [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.895 186548 DEBUG nova.scheduler.client.report [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.946 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.950 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:29 compute-0 nova_compute[186544]: 2025-11-22 07:43:29.950 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.030 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.030 186548 DEBUG nova.network.neutron [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.068 186548 INFO nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.154 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.246 186548 DEBUG nova.policy [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:43:30 compute-0 ovn_controller[94843]: 2025-11-22T07:43:30Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:43:30 compute-0 ovn_controller[94843]: 2025-11-22T07:43:30Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.336 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.338 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.338 186548 INFO nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Creating image(s)
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.339 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "/var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.339 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.340 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.351 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.403 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.404 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.404 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.416 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.467 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.468 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.631 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk 1073741824" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.633 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.633 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.686 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.687 186548 DEBUG nova.virt.disk.api [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Checking if we can resize image /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.687 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.740 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.741 186548 DEBUG nova.virt.disk.api [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Cannot resize image /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.742 186548 DEBUG nova.objects.instance [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'migration_context' on Instance uuid 36d5f234-9baf-48b6-a565-430378fe4068 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.758 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.759 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Ensure instance console log exists: /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.759 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.759 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:30 compute-0 nova_compute[186544]: 2025-11-22 07:43:30.760 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:31 compute-0 podman[214839]: 2025-11-22 07:43:31.417303575 +0000 UTC m=+0.067273307 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 07:43:31 compute-0 nova_compute[186544]: 2025-11-22 07:43:31.841 186548 DEBUG nova.network.neutron [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Successfully created port: 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:43:32 compute-0 nova_compute[186544]: 2025-11-22 07:43:32.401 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:32 compute-0 nova_compute[186544]: 2025-11-22 07:43:32.401 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:32 compute-0 nova_compute[186544]: 2025-11-22 07:43:32.865 186548 DEBUG nova.network.neutron [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Successfully updated port: 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:43:32 compute-0 nova_compute[186544]: 2025-11-22 07:43:32.880 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "refresh_cache-36d5f234-9baf-48b6-a565-430378fe4068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:43:32 compute-0 nova_compute[186544]: 2025-11-22 07:43:32.881 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquired lock "refresh_cache-36d5f234-9baf-48b6-a565-430378fe4068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:43:32 compute-0 nova_compute[186544]: 2025-11-22 07:43:32.881 186548 DEBUG nova.network.neutron [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:43:33 compute-0 nova_compute[186544]: 2025-11-22 07:43:33.123 186548 DEBUG nova.compute.manager [req-60aff229-2b99-4c1e-aee6-4af9d7559362 req-7f85e867-fe94-41a9-a200-4e820901ea57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received event network-changed-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:43:33 compute-0 nova_compute[186544]: 2025-11-22 07:43:33.123 186548 DEBUG nova.compute.manager [req-60aff229-2b99-4c1e-aee6-4af9d7559362 req-7f85e867-fe94-41a9-a200-4e820901ea57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Refreshing instance network info cache due to event network-changed-471dbcab-6d2d-430b-9c1d-41d0b1f895a0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:43:33 compute-0 nova_compute[186544]: 2025-11-22 07:43:33.123 186548 DEBUG oslo_concurrency.lockutils [req-60aff229-2b99-4c1e-aee6-4af9d7559362 req-7f85e867-fe94-41a9-a200-4e820901ea57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-36d5f234-9baf-48b6-a565-430378fe4068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:43:33 compute-0 nova_compute[186544]: 2025-11-22 07:43:33.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:33 compute-0 nova_compute[186544]: 2025-11-22 07:43:33.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:43:33 compute-0 nova_compute[186544]: 2025-11-22 07:43:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:33 compute-0 nova_compute[186544]: 2025-11-22 07:43:33.243 186548 DEBUG nova.network.neutron [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:43:33 compute-0 nova_compute[186544]: 2025-11-22 07:43:33.608 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.176 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.176 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.196 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.734 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.735 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.735 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.735 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.948 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:34 compute-0 nova_compute[186544]: 2025-11-22 07:43:34.978 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:34.977 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:43:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:34.979 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.344 186548 DEBUG nova.network.neutron [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Updating instance_info_cache with network_info: [{"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:43:36 compute-0 podman[214860]: 2025-11-22 07:43:36.391011436 +0000 UTC m=+0.041696277 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.393 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Releasing lock "refresh_cache-36d5f234-9baf-48b6-a565-430378fe4068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.393 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Instance network_info: |[{"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.393 186548 DEBUG oslo_concurrency.lockutils [req-60aff229-2b99-4c1e-aee6-4af9d7559362 req-7f85e867-fe94-41a9-a200-4e820901ea57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-36d5f234-9baf-48b6-a565-430378fe4068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.394 186548 DEBUG nova.network.neutron [req-60aff229-2b99-4c1e-aee6-4af9d7559362 req-7f85e867-fe94-41a9-a200-4e820901ea57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Refreshing network info cache for port 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.396 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Start _get_guest_xml network_info=[{"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.400 186548 WARNING nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.406 186548 DEBUG nova.virt.libvirt.host [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.406 186548 DEBUG nova.virt.libvirt.host [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.412 186548 DEBUG nova.virt.libvirt.host [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.412 186548 DEBUG nova.virt.libvirt.host [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.413 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.413 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.414 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.414 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.414 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.414 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.414 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.415 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.415 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.415 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.415 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.415 186548 DEBUG nova.virt.hardware [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.418 186548 DEBUG nova.virt.libvirt.vif [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-626797560',display_name='tempest-ServersAdminTestJSON-server-626797560',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-626797560',id=15,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-vz6jvqc8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:30Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=36d5f234-9baf-48b6-a565-430378fe4068,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.418 186548 DEBUG nova.network.os_vif_util [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.419 186548 DEBUG nova.network.os_vif_util [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:ea:95,bridge_name='br-int',has_traffic_filtering=True,id=471dbcab-6d2d-430b-9c1d-41d0b1f895a0,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471dbcab-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.420 186548 DEBUG nova.objects.instance [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 36d5f234-9baf-48b6-a565-430378fe4068 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.434 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <uuid>36d5f234-9baf-48b6-a565-430378fe4068</uuid>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <name>instance-0000000f</name>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersAdminTestJSON-server-626797560</nova:name>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:43:36</nova:creationTime>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:43:36 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:43:36 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:43:36 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:43:36 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:43:36 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:43:36 compute-0 nova_compute[186544]:         <nova:user uuid="7c0fb56fc41e44dfa23a0d45149e78e3">tempest-ServersAdminTestJSON-1843119868-project-member</nova:user>
Nov 22 07:43:36 compute-0 nova_compute[186544]:         <nova:project uuid="9b004cb06df74de2903dae19345fd9c7">tempest-ServersAdminTestJSON-1843119868</nova:project>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:43:36 compute-0 nova_compute[186544]:         <nova:port uuid="471dbcab-6d2d-430b-9c1d-41d0b1f895a0">
Nov 22 07:43:36 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <system>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <entry name="serial">36d5f234-9baf-48b6-a565-430378fe4068</entry>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <entry name="uuid">36d5f234-9baf-48b6-a565-430378fe4068</entry>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </system>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <os>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   </os>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <features>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   </features>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk.config"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:61:ea:95"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <target dev="tap471dbcab-6d"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/console.log" append="off"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <video>
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </video>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:43:36 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:43:36 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:43:36 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:43:36 compute-0 nova_compute[186544]: </domain>
Nov 22 07:43:36 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.435 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Preparing to wait for external event network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.435 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.436 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.436 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.437 186548 DEBUG nova.virt.libvirt.vif [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:43:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-626797560',display_name='tempest-ServersAdminTestJSON-server-626797560',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-626797560',id=15,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-vz6jvqc8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:30Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=36d5f234-9baf-48b6-a565-430378fe4068,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.437 186548 DEBUG nova.network.os_vif_util [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.437 186548 DEBUG nova.network.os_vif_util [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:ea:95,bridge_name='br-int',has_traffic_filtering=True,id=471dbcab-6d2d-430b-9c1d-41d0b1f895a0,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471dbcab-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.438 186548 DEBUG os_vif [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:ea:95,bridge_name='br-int',has_traffic_filtering=True,id=471dbcab-6d2d-430b-9c1d-41d0b1f895a0,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471dbcab-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.438 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.438 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.439 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.441 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.441 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap471dbcab-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.442 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap471dbcab-6d, col_values=(('external_ids', {'iface-id': '471dbcab-6d2d-430b-9c1d-41d0b1f895a0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:ea:95', 'vm-uuid': '36d5f234-9baf-48b6-a565-430378fe4068'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:36 compute-0 NetworkManager[55036]: <info>  [1763797416.4807] manager: (tap471dbcab-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.480 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.482 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.486 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.487 186548 INFO os_vif [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:ea:95,bridge_name='br-int',has_traffic_filtering=True,id=471dbcab-6d2d-430b-9c1d-41d0b1f895a0,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471dbcab-6d')
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.535 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.535 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.535 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No VIF found with MAC fa:16:3e:61:ea:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:43:36 compute-0 nova_compute[186544]: 2025-11-22 07:43:36.536 186548 INFO nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Using config drive
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.017 186548 INFO nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Creating config drive at /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk.config
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.022 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnb1dklct execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.143 186548 DEBUG oslo_concurrency.processutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnb1dklct" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:37 compute-0 kernel: tap471dbcab-6d: entered promiscuous mode
Nov 22 07:43:37 compute-0 NetworkManager[55036]: <info>  [1763797417.1921] manager: (tap471dbcab-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.193 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:37 compute-0 ovn_controller[94843]: 2025-11-22T07:43:37Z|00042|binding|INFO|Claiming lport 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 for this chassis.
Nov 22 07:43:37 compute-0 ovn_controller[94843]: 2025-11-22T07:43:37Z|00043|binding|INFO|471dbcab-6d2d-430b-9c1d-41d0b1f895a0: Claiming fa:16:3e:61:ea:95 10.100.0.8
Nov 22 07:43:37 compute-0 ovn_controller[94843]: 2025-11-22T07:43:37Z|00044|binding|INFO|Setting lport 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 ovn-installed in OVS
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.209 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:37 compute-0 systemd-udevd[214905]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:43:37 compute-0 ovn_controller[94843]: 2025-11-22T07:43:37Z|00045|binding|INFO|Setting lport 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 up in Southbound
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.228 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:ea:95 10.100.0.8'], port_security=['fa:16:3e:61:ea:95 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '36d5f234-9baf-48b6-a565-430378fe4068', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=471dbcab-6d2d-430b-9c1d-41d0b1f895a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.229 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 bound to our chassis
Nov 22 07:43:37 compute-0 systemd-machined[152872]: New machine qemu-6-instance-0000000f.
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.231 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:43:37 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000f.
Nov 22 07:43:37 compute-0 NetworkManager[55036]: <info>  [1763797417.2428] device (tap471dbcab-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:43:37 compute-0 NetworkManager[55036]: <info>  [1763797417.2436] device (tap471dbcab-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.245 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bec1c9a7-4f0a-4ee2-a559-09542b3bc15f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.269 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9e0eb4-0242-429a-afd3-b77f4e786a8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.272 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[53b54f60-6631-4684-9718-6357a3a64f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.297 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[93834a27-e87a-483e-ae42-c042f30db253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.311 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.312 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.313 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.312 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[15597b9e-a664-4afc-9239-b089e97de7f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413549, 'reachable_time': 24347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214918, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.329 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[203c86ef-ebdc-4d0f-aba8-91333b0914dc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413560, 'tstamp': 413560}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214920, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413562, 'tstamp': 413562}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214920, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.331 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.334 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.334 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.335 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.335 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:37.335 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.698 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797417.69772, 36d5f234-9baf-48b6-a565-430378fe4068 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.699 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] VM Started (Lifecycle Event)
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.734 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.738 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797417.6986952, 36d5f234-9baf-48b6-a565-430378fe4068 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.738 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] VM Paused (Lifecycle Event)
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.757 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.760 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:43:37 compute-0 nova_compute[186544]: 2025-11-22 07:43:37.779 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.175 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Updating instance_info_cache with network_info: [{"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.177 186548 DEBUG nova.network.neutron [req-60aff229-2b99-4c1e-aee6-4af9d7559362 req-7f85e867-fe94-41a9-a200-4e820901ea57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Updated VIF entry in instance network info cache for port 471dbcab-6d2d-430b-9c1d-41d0b1f895a0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.178 186548 DEBUG nova.network.neutron [req-60aff229-2b99-4c1e-aee6-4af9d7559362 req-7f85e867-fe94-41a9-a200-4e820901ea57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Updating instance_info_cache with network_info: [{"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.204 186548 DEBUG oslo_concurrency.lockutils [req-60aff229-2b99-4c1e-aee6-4af9d7559362 req-7f85e867-fe94-41a9-a200-4e820901ea57 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-36d5f234-9baf-48b6-a565-430378fe4068" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.209 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-6d263548-4cc6-463b-b26b-cb43b0d069cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.209 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.210 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.210 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.210 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.668 186548 DEBUG nova.compute.manager [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received event network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.669 186548 DEBUG oslo_concurrency.lockutils [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.669 186548 DEBUG oslo_concurrency.lockutils [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.669 186548 DEBUG oslo_concurrency.lockutils [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.669 186548 DEBUG nova.compute.manager [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Processing event network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.670 186548 DEBUG nova.compute.manager [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received event network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.670 186548 DEBUG oslo_concurrency.lockutils [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.670 186548 DEBUG oslo_concurrency.lockutils [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.670 186548 DEBUG oslo_concurrency.lockutils [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.670 186548 DEBUG nova.compute.manager [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] No waiting events found dispatching network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.671 186548 WARNING nova.compute.manager [req-fdd6ae51-553c-4e0e-afbf-ae4ef8919b4a req-728f038c-a737-4b25-aa01-e34b4b7dd4b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received unexpected event network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 for instance with vm_state building and task_state spawning.
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.671 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.674 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797419.6743255, 36d5f234-9baf-48b6-a565-430378fe4068 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.675 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] VM Resumed (Lifecycle Event)
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.676 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.679 186548 INFO nova.virt.libvirt.driver [-] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Instance spawned successfully.
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.679 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.691 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.697 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.699 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.700 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.700 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.701 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.701 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.702 186548 DEBUG nova.virt.libvirt.driver [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.721 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.757 186548 INFO nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Took 9.42 seconds to spawn the instance on the hypervisor.
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.757 186548 DEBUG nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.821 186548 INFO nova.compute.manager [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Took 10.11 seconds to build instance.
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.836 186548 DEBUG oslo_concurrency.lockutils [None req-fa69c1b1-0244-4df3-8a30-148444381d28 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:39 compute-0 nova_compute[186544]: 2025-11-22 07:43:39.950 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:40 compute-0 nova_compute[186544]: 2025-11-22 07:43:40.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:40 compute-0 nova_compute[186544]: 2025-11-22 07:43:40.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 07:43:41 compute-0 podman[214928]: 2025-11-22 07:43:41.414203715 +0000 UTC m=+0.056545053 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 07:43:41 compute-0 nova_compute[186544]: 2025-11-22 07:43:41.480 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:42 compute-0 nova_compute[186544]: 2025-11-22 07:43:42.170 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:43:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:42.981 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:44 compute-0 nova_compute[186544]: 2025-11-22 07:43:44.952 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:46 compute-0 nova_compute[186544]: 2025-11-22 07:43:46.483 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:46 compute-0 nova_compute[186544]: 2025-11-22 07:43:46.986 186548 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating tmpfile /var/lib/nova/instances/tmpjjc7nk5w to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 22 07:43:47 compute-0 nova_compute[186544]: 2025-11-22 07:43:47.130 186548 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjjc7nk5w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 22 07:43:47 compute-0 nova_compute[186544]: 2025-11-22 07:43:47.156 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:43:47 compute-0 nova_compute[186544]: 2025-11-22 07:43:47.156 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:43:47 compute-0 nova_compute[186544]: 2025-11-22 07:43:47.164 186548 INFO nova.compute.rpcapi [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Nov 22 07:43:47 compute-0 nova_compute[186544]: 2025-11-22 07:43:47.165 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.557 186548 DEBUG nova.compute.manager [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.740 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.741 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.788 186548 DEBUG nova.objects.instance [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'pci_requests' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.810 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.810 186548 INFO nova.compute.claims [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.810 186548 DEBUG nova.objects.instance [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'resources' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.831 186548 DEBUG nova.objects.instance [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'numa_topology' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.843 186548 DEBUG nova.objects.instance [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'pci_devices' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.899 186548 INFO nova.compute.resource_tracker [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Updating resource usage from migration d8d5546f-2f6f-4662-836a-f33c5797ec0e
Nov 22 07:43:48 compute-0 nova_compute[186544]: 2025-11-22 07:43:48.900 186548 DEBUG nova.compute.resource_tracker [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Starting to track incoming migration d8d5546f-2f6f-4662-836a-f33c5797ec0e with flavor 31612188-3cd6-428b-9166-9568f0affd4a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.028 186548 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjjc7nk5w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.032 186548 DEBUG nova.compute.provider_tree [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.046 186548 DEBUG nova.scheduler.client.report [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.051 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.051 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquired lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.051 186548 DEBUG nova.network.neutron [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.077 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.078 186548 INFO nova.compute.manager [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Migrating
Nov 22 07:43:49 compute-0 nova_compute[186544]: 2025-11-22 07:43:49.955 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:50 compute-0 podman[214949]: 2025-11-22 07:43:50.403901454 +0000 UTC m=+0.055561049 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 07:43:50 compute-0 podman[214950]: 2025-11-22 07:43:50.428022007 +0000 UTC m=+0.076794411 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118)
Nov 22 07:43:51 compute-0 nova_compute[186544]: 2025-11-22 07:43:51.488 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:51 compute-0 sshd-session[214994]: Accepted publickey for nova from 192.168.122.102 port 36098 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:43:51 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 07:43:51 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 07:43:51 compute-0 systemd-logind[821]: New session 26 of user nova.
Nov 22 07:43:51 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 07:43:51 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 07:43:51 compute-0 systemd[214998]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:43:51 compute-0 systemd[214998]: Queued start job for default target Main User Target.
Nov 22 07:43:51 compute-0 systemd[214998]: Created slice User Application Slice.
Nov 22 07:43:51 compute-0 systemd[214998]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:43:51 compute-0 systemd[214998]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 07:43:51 compute-0 systemd[214998]: Reached target Paths.
Nov 22 07:43:51 compute-0 systemd[214998]: Reached target Timers.
Nov 22 07:43:51 compute-0 systemd[214998]: Starting D-Bus User Message Bus Socket...
Nov 22 07:43:51 compute-0 systemd[214998]: Starting Create User's Volatile Files and Directories...
Nov 22 07:43:52 compute-0 systemd[214998]: Listening on D-Bus User Message Bus Socket.
Nov 22 07:43:52 compute-0 systemd[214998]: Finished Create User's Volatile Files and Directories.
Nov 22 07:43:52 compute-0 systemd[214998]: Reached target Sockets.
Nov 22 07:43:52 compute-0 systemd[214998]: Reached target Basic System.
Nov 22 07:43:52 compute-0 systemd[214998]: Reached target Main User Target.
Nov 22 07:43:52 compute-0 systemd[214998]: Startup finished in 143ms.
Nov 22 07:43:52 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 07:43:52 compute-0 systemd[1]: Started Session 26 of User nova.
Nov 22 07:43:52 compute-0 sshd-session[214994]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:43:52 compute-0 sshd-session[215013]: Received disconnect from 192.168.122.102 port 36098:11: disconnected by user
Nov 22 07:43:52 compute-0 sshd-session[215013]: Disconnected from user nova 192.168.122.102 port 36098
Nov 22 07:43:52 compute-0 sshd-session[214994]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:43:52 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Nov 22 07:43:52 compute-0 systemd-logind[821]: Session 26 logged out. Waiting for processes to exit.
Nov 22 07:43:52 compute-0 systemd-logind[821]: Removed session 26.
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.119 186548 DEBUG nova.network.neutron [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.170 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Releasing lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.178 186548 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjjc7nk5w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.179 186548 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating instance directory: /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.180 186548 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Creating disk.info with the contents: {'/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk': 'qcow2', '/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.180 186548 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.181 186548 DEBUG nova.objects.instance [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:52 compute-0 sshd-session[215017]: Accepted publickey for nova from 192.168.122.102 port 38900 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:43:52 compute-0 systemd-logind[821]: New session 28 of user nova.
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.213 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:52 compute-0 systemd[1]: Started Session 28 of User nova.
Nov 22 07:43:52 compute-0 sshd-session[215017]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.273 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.274 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.274 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:43:52 compute-0 sshd-session[215026]: Received disconnect from 192.168.122.102 port 38900:11: disconnected by user
Nov 22 07:43:52 compute-0 sshd-session[215026]: Disconnected from user nova 192.168.122.102 port 38900
Nov 22 07:43:52 compute-0 sshd-session[215017]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:43:52 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Nov 22 07:43:52 compute-0 systemd-logind[821]: Session 28 logged out. Waiting for processes to exit.
Nov 22 07:43:52 compute-0 systemd-logind[821]: Removed session 28.
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.288 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.342 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.343 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.383 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.384 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.384 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.437 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.438 186548 DEBUG nova.virt.disk.api [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Checking if we can resize image /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.438 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.491 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.492 186548 DEBUG nova.virt.disk.api [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Cannot resize image /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.492 186548 DEBUG nova.objects.instance [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'migration_context' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.503 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.524 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config 485376" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.526 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config to /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.527 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.990 186548 DEBUG oslo_concurrency.processutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk.config /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.990 186548 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.992 186548 DEBUG nova.virt.libvirt.vif [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:41Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.992 186548 DEBUG nova.network.os_vif_util [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.993 186548 DEBUG nova.network.os_vif_util [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.994 186548 DEBUG os_vif [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.995 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.995 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.998 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66ab05b0-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:52 compute-0 nova_compute[186544]: 2025-11-22 07:43:52.998 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66ab05b0-44, col_values=(('external_ids', {'iface-id': '66ab05b0-442e-4420-82b9-0fc90a3df63b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:30:6c', 'vm-uuid': '144e6cca-5b79-4b25-9456-a59f6895075b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:53 compute-0 nova_compute[186544]: 2025-11-22 07:43:53.000 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:53 compute-0 NetworkManager[55036]: <info>  [1763797433.0007] manager: (tap66ab05b0-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Nov 22 07:43:53 compute-0 nova_compute[186544]: 2025-11-22 07:43:53.003 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:43:53 compute-0 nova_compute[186544]: 2025-11-22 07:43:53.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:53 compute-0 nova_compute[186544]: 2025-11-22 07:43:53.007 186548 INFO os_vif [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44')
Nov 22 07:43:53 compute-0 nova_compute[186544]: 2025-11-22 07:43:53.007 186548 DEBUG nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 22 07:43:53 compute-0 nova_compute[186544]: 2025-11-22 07:43:53.008 186548 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjjc7nk5w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 22 07:43:53 compute-0 ovn_controller[94843]: 2025-11-22T07:43:53Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:ea:95 10.100.0.8
Nov 22 07:43:53 compute-0 ovn_controller[94843]: 2025-11-22T07:43:53Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:ea:95 10.100.0.8
Nov 22 07:43:54 compute-0 nova_compute[186544]: 2025-11-22 07:43:54.776 186548 DEBUG nova.network.neutron [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Port 66ab05b0-442e-4420-82b9-0fc90a3df63b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 22 07:43:54 compute-0 nova_compute[186544]: 2025-11-22 07:43:54.783 186548 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpjjc7nk5w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 22 07:43:54 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 22 07:43:54 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 22 07:43:54 compute-0 nova_compute[186544]: 2025-11-22 07:43:54.956 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:55 compute-0 kernel: tap66ab05b0-44: entered promiscuous mode
Nov 22 07:43:55 compute-0 NetworkManager[55036]: <info>  [1763797435.0878] manager: (tap66ab05b0-44): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Nov 22 07:43:55 compute-0 ovn_controller[94843]: 2025-11-22T07:43:55Z|00046|binding|INFO|Claiming lport 66ab05b0-442e-4420-82b9-0fc90a3df63b for this additional chassis.
Nov 22 07:43:55 compute-0 nova_compute[186544]: 2025-11-22 07:43:55.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:55 compute-0 ovn_controller[94843]: 2025-11-22T07:43:55Z|00047|binding|INFO|66ab05b0-442e-4420-82b9-0fc90a3df63b: Claiming fa:16:3e:4f:30:6c 10.100.0.8
Nov 22 07:43:55 compute-0 nova_compute[186544]: 2025-11-22 07:43:55.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:55 compute-0 nova_compute[186544]: 2025-11-22 07:43:55.094 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:55 compute-0 systemd-machined[152872]: New machine qemu-7-instance-00000010.
Nov 22 07:43:55 compute-0 systemd-udevd[215098]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:43:55 compute-0 NetworkManager[55036]: <info>  [1763797435.1371] device (tap66ab05b0-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:43:55 compute-0 NetworkManager[55036]: <info>  [1763797435.1377] device (tap66ab05b0-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:43:55 compute-0 ovn_controller[94843]: 2025-11-22T07:43:55Z|00048|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b ovn-installed in OVS
Nov 22 07:43:55 compute-0 nova_compute[186544]: 2025-11-22 07:43:55.149 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:55 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000010.
Nov 22 07:43:56 compute-0 podman[215113]: 2025-11-22 07:43:56.40619064 +0000 UTC m=+0.055979999 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 07:43:56 compute-0 nova_compute[186544]: 2025-11-22 07:43:56.445 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797436.4450324, 144e6cca-5b79-4b25-9456-a59f6895075b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:43:56 compute-0 nova_compute[186544]: 2025-11-22 07:43:56.445 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Started (Lifecycle Event)
Nov 22 07:43:56 compute-0 nova_compute[186544]: 2025-11-22 07:43:56.462 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:57 compute-0 nova_compute[186544]: 2025-11-22 07:43:57.472 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797437.4722302, 144e6cca-5b79-4b25-9456-a59f6895075b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:43:57 compute-0 nova_compute[186544]: 2025-11-22 07:43:57.474 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Resumed (Lifecycle Event)
Nov 22 07:43:57 compute-0 nova_compute[186544]: 2025-11-22 07:43:57.491 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:57 compute-0 nova_compute[186544]: 2025-11-22 07:43:57.494 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:43:57 compute-0 nova_compute[186544]: 2025-11-22 07:43:57.513 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 22 07:43:58 compute-0 nova_compute[186544]: 2025-11-22 07:43:58.000 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:58 compute-0 podman[215152]: 2025-11-22 07:43:58.443736771 +0000 UTC m=+0.080124313 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:43:58 compute-0 ovn_controller[94843]: 2025-11-22T07:43:58Z|00049|binding|INFO|Claiming lport 66ab05b0-442e-4420-82b9-0fc90a3df63b for this chassis.
Nov 22 07:43:58 compute-0 ovn_controller[94843]: 2025-11-22T07:43:58Z|00050|binding|INFO|66ab05b0-442e-4420-82b9-0fc90a3df63b: Claiming fa:16:3e:4f:30:6c 10.100.0.8
Nov 22 07:43:58 compute-0 ovn_controller[94843]: 2025-11-22T07:43:58Z|00051|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b up in Southbound
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.919 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:30:6c 10.100.0.8'], port_security=['fa:16:3e:4f:30:6c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '11', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=66ab05b0-442e-4420-82b9-0fc90a3df63b) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.920 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 66ab05b0-442e-4420-82b9-0fc90a3df63b in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 bound to our chassis
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.921 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.933 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[acce1cc1-6543-4e6d-b326-384dcf9ecccd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.934 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd5fa4f6-01 in ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.936 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd5fa4f6-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.936 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[894756ae-634b-4c10-a331-e1bfd0ae53bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.936 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9bfe462b-35b9-43ed-b680-80de05ebbb05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.950 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[067f00f9-4995-4e35-acf0-b7913f66bea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.964 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a14b8de6-0e64-4b26-9933-022188c09b12]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.990 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1ae086-9d10-40da-b425-6d32b566d14a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:58.994 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ac6c8b20-e8fd-43df-8c10-75e8abfee936]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:58 compute-0 NetworkManager[55036]: <info>  [1763797438.9954] manager: (tapcd5fa4f6-00): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Nov 22 07:43:59 compute-0 systemd-udevd[215178]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.018 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[86f161cc-6fc6-48a2-8bbe-5f7131c0b7d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.022 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d012d0-200f-4926-b3b2-f2613733a62b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 NetworkManager[55036]: <info>  [1763797439.0481] device (tapcd5fa4f6-00): carrier: link connected
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.051 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[25108d75-6f1c-4148-90a6-035e550c43d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.064 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3513ef-45fe-49be-aae4-960ea5fa3582]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417968, 'reachable_time': 29969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215197, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.077 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3446757c-8f11-47e0-b191-dcdfbf583930]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:db2b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417968, 'tstamp': 417968}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215198, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.092 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f4021d-ca9a-4f39-a220-deb1c0253e3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417968, 'reachable_time': 29969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215199, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.101 186548 INFO nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Post operation of migration started
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.120 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2db70376-77d0-435c-b0cb-6ac653fe4cbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.172 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e1dd7c7e-59e5-4e74-828a-2ecb87281da8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.174 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.174 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.175 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd5fa4f6-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.177 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:59 compute-0 NetworkManager[55036]: <info>  [1763797439.1784] manager: (tapcd5fa4f6-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Nov 22 07:43:59 compute-0 kernel: tapcd5fa4f6-00: entered promiscuous mode
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.180 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.181 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd5fa4f6-00, col_values=(('external_ids', {'iface-id': 'f400467f-3f35-4435-bb4a-0b3da05366fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:43:59 compute-0 ovn_controller[94843]: 2025-11-22T07:43:59Z|00052|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.181 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.193 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.195 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.196 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5699e377-e310-4574-a725-d845bb0bbbfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.196 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:43:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:43:59.198 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'env', 'PROCESS_TAG=haproxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.452 186548 INFO nova.compute.manager [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Rebuilding instance
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.530 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.530 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquired lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.531 186548 DEBUG nova.network.neutron [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:43:59 compute-0 podman[215232]: 2025-11-22 07:43:59.55972203 +0000 UTC m=+0.056927343 container create 331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 07:43:59 compute-0 systemd[1]: Started libpod-conmon-331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec.scope.
Nov 22 07:43:59 compute-0 podman[215232]: 2025-11-22 07:43:59.523700433 +0000 UTC m=+0.020905766 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:43:59 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:43:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47172935216ac24d1bac506836770cc30351f0eed51908ac6958a9337a6c1118/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:43:59 compute-0 podman[215232]: 2025-11-22 07:43:59.649156861 +0000 UTC m=+0.146362194 container init 331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:43:59 compute-0 podman[215232]: 2025-11-22 07:43:59.65440451 +0000 UTC m=+0.151609823 container start 331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 07:43:59 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215247]: [NOTICE]   (215251) : New worker (215253) forked
Nov 22 07:43:59 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215247]: [NOTICE]   (215251) : Loading success.
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.731 186548 DEBUG nova.compute.manager [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.786 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.794 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.802 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'resources' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.811 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.821 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.824 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:43:59 compute-0 nova_compute[186544]: 2025-11-22 07:43:59.959 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:01 compute-0 nova_compute[186544]: 2025-11-22 07:44:01.017 186548 DEBUG nova.network.neutron [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:44:01 compute-0 nova_compute[186544]: 2025-11-22 07:44:01.043 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Releasing lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:44:01 compute-0 nova_compute[186544]: 2025-11-22 07:44:01.065 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:01 compute-0 nova_compute[186544]: 2025-11-22 07:44:01.066 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:01 compute-0 nova_compute[186544]: 2025-11-22 07:44:01.066 186548 DEBUG oslo_concurrency.lockutils [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:01 compute-0 nova_compute[186544]: 2025-11-22 07:44:01.070 186548 INFO nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 22 07:44:01 compute-0 virtqemud[186092]: Domain id=7 name='instance-00000010' uuid=144e6cca-5b79-4b25-9456-a59f6895075b is tainted: custom-monitor
Nov 22 07:44:02 compute-0 kernel: tapcc51d9a0-e1 (unregistering): left promiscuous mode
Nov 22 07:44:02 compute-0 NetworkManager[55036]: <info>  [1763797442.0189] device (tapcc51d9a0-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:44:02 compute-0 ovn_controller[94843]: 2025-11-22T07:44:02Z|00053|binding|INFO|Releasing lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 from this chassis (sb_readonly=0)
Nov 22 07:44:02 compute-0 ovn_controller[94843]: 2025-11-22T07:44:02Z|00054|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 down in Southbound
Nov 22 07:44:02 compute-0 ovn_controller[94843]: 2025-11-22T07:44:02Z|00055|binding|INFO|Removing iface tapcc51d9a0-e1 ovn-installed in OVS
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.027 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.029 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.035 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:5c:3d 10.100.0.13'], port_security=['fa:16:3e:88:5c:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc51d9a0-e170-42eb-b8db-2910ea320cb4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.037 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc51d9a0-e170-42eb-b8db-2910ea320cb4 in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 unbound from our chassis
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.038 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.040 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.053 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9011a7f6-2f6c-4435-9d58-2d0939a39074]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.079 186548 INFO nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.080 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[abb4b21d-69f3-4208-9993-d343128beae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:02 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 22 07:44:02 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000d.scope: Consumed 15.475s CPU time.
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.084 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d94abab3-636c-45d0-9755-dae074e671c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:02 compute-0 systemd-machined[152872]: Machine qemu-5-instance-0000000d terminated.
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.111 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[23e083b1-a4ce-4c01-b85c-2048e0fe510f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:02 compute-0 podman[215262]: 2025-11-22 07:44:02.115449546 +0000 UTC m=+0.064235891 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.126 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6e0c6ceb-a400-4082-bd30-d68f87e54096]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413549, 'reachable_time': 24347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215294, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.138 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0df4e0-2d16-4453-862a-6aea627d0566]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413560, 'tstamp': 413560}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215295, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413562, 'tstamp': 413562}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215295, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.140 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.141 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.147 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.148 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.148 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.149 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:02.149 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.257 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.261 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.279 186548 DEBUG nova.compute.manager [req-d42b6f8a-defb-4e54-b255-d816c5888a2f req-6ce37cd8-f2b7-4f9f-b095-688038d5dc70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.280 186548 DEBUG oslo_concurrency.lockutils [req-d42b6f8a-defb-4e54-b255-d816c5888a2f req-6ce37cd8-f2b7-4f9f-b095-688038d5dc70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.280 186548 DEBUG oslo_concurrency.lockutils [req-d42b6f8a-defb-4e54-b255-d816c5888a2f req-6ce37cd8-f2b7-4f9f-b095-688038d5dc70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.281 186548 DEBUG oslo_concurrency.lockutils [req-d42b6f8a-defb-4e54-b255-d816c5888a2f req-6ce37cd8-f2b7-4f9f-b095-688038d5dc70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.281 186548 DEBUG nova.compute.manager [req-d42b6f8a-defb-4e54-b255-d816c5888a2f req-6ce37cd8-f2b7-4f9f-b095-688038d5dc70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.281 186548 WARNING nova.compute.manager [req-d42b6f8a-defb-4e54-b255-d816c5888a2f req-6ce37cd8-f2b7-4f9f-b095-688038d5dc70 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state error and task_state rebuilding.
Nov 22 07:44:02 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 07:44:02 compute-0 systemd[214998]: Activating special unit Exit the Session...
Nov 22 07:44:02 compute-0 systemd[214998]: Stopped target Main User Target.
Nov 22 07:44:02 compute-0 systemd[214998]: Stopped target Basic System.
Nov 22 07:44:02 compute-0 systemd[214998]: Stopped target Paths.
Nov 22 07:44:02 compute-0 systemd[214998]: Stopped target Sockets.
Nov 22 07:44:02 compute-0 systemd[214998]: Stopped target Timers.
Nov 22 07:44:02 compute-0 systemd[214998]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:44:02 compute-0 systemd[214998]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 07:44:02 compute-0 systemd[214998]: Closed D-Bus User Message Bus Socket.
Nov 22 07:44:02 compute-0 systemd[214998]: Stopped Create User's Volatile Files and Directories.
Nov 22 07:44:02 compute-0 systemd[214998]: Removed slice User Application Slice.
Nov 22 07:44:02 compute-0 systemd[214998]: Reached target Shutdown.
Nov 22 07:44:02 compute-0 systemd[214998]: Finished Exit the Session.
Nov 22 07:44:02 compute-0 systemd[214998]: Reached target Exit the Session.
Nov 22 07:44:02 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 07:44:02 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 07:44:02 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 07:44:02 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 07:44:02 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 07:44:02 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 07:44:02 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.839 186548 INFO nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance shutdown successfully after 3 seconds.
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.845 186548 INFO nova.virt.libvirt.driver [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance destroyed successfully.
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.852 186548 INFO nova.virt.libvirt.driver [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance destroyed successfully.
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.853 186548 DEBUG nova.virt.libvirt.vif [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:43:58Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.853 186548 DEBUG nova.network.os_vif_util [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.855 186548 DEBUG nova.network.os_vif_util [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.855 186548 DEBUG os_vif [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.858 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.858 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc51d9a0-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.860 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.863 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.865 186548 INFO os_vif [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1')
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.865 186548 INFO nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Deleting instance files /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd_del
Nov 22 07:44:02 compute-0 nova_compute[186544]: 2025-11-22 07:44:02.866 186548 INFO nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Deletion of /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd_del complete
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.054 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.055 186548 INFO nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating image(s)
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.055 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.056 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.056 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.056 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.057 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.083 186548 INFO nova.virt.libvirt.driver [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.088 186548 DEBUG nova.compute.manager [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.135 186548 DEBUG nova.objects.instance [None req-ef3e9e77-7348-49b3-938a-9a345d5608e5 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.143 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.175 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid 36d5f234-9baf-48b6-a565-430378fe4068 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.176 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.176 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.176 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "36d5f234-9baf-48b6-a565-430378fe4068" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.176 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.177 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.177 186548 INFO nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.177 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:03 compute-0 nova_compute[186544]: 2025-11-22 07:44:03.214 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "36d5f234-9baf-48b6-a565-430378fe4068" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.374 186548 DEBUG nova.compute.manager [req-143498a8-ce4a-4d9b-871d-58795a370b66 req-b2b103d9-6ac4-41d6-9e21-3c969634eda0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.375 186548 DEBUG oslo_concurrency.lockutils [req-143498a8-ce4a-4d9b-871d-58795a370b66 req-b2b103d9-6ac4-41d6-9e21-3c969634eda0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.375 186548 DEBUG oslo_concurrency.lockutils [req-143498a8-ce4a-4d9b-871d-58795a370b66 req-b2b103d9-6ac4-41d6-9e21-3c969634eda0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.375 186548 DEBUG oslo_concurrency.lockutils [req-143498a8-ce4a-4d9b-871d-58795a370b66 req-b2b103d9-6ac4-41d6-9e21-3c969634eda0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.376 186548 DEBUG nova.compute.manager [req-143498a8-ce4a-4d9b-871d-58795a370b66 req-b2b103d9-6ac4-41d6-9e21-3c969634eda0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.376 186548 WARNING nova.compute.manager [req-143498a8-ce4a-4d9b-871d-58795a370b66 req-b2b103d9-6ac4-41d6-9e21-3c969634eda0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state error and task_state rebuild_spawning.
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.475 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.529 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.530 186548 DEBUG nova.virt.images [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] 360f90ca-2ddb-4e60-a48e-364e3b48bd96 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.531 186548 DEBUG nova.privsep.utils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.531 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.692 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.part /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.697 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.752 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42.converted --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.753 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.767 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.823 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.824 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.825 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.836 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.890 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.891 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.924 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.925 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.926 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.962 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.984 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.984 186548 DEBUG nova.virt.disk.api [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Checking if we can resize image /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:44:04 compute-0 nova_compute[186544]: 2025-11-22 07:44:04.984 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.040 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.041 186548 DEBUG nova.virt.disk.api [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Cannot resize image /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.042 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.042 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Ensure instance console log exists: /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.043 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.043 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.043 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.045 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Start _get_guest_xml network_info=[{"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.049 186548 WARNING nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.054 186548 DEBUG nova.virt.libvirt.host [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.054 186548 DEBUG nova.virt.libvirt.host [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.057 186548 DEBUG nova.virt.libvirt.host [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.058 186548 DEBUG nova.virt.libvirt.host [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.059 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.059 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.059 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.060 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.060 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.060 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.060 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.060 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.060 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.060 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.061 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.061 186548 DEBUG nova.virt.hardware [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.061 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.076 186548 DEBUG nova.virt.libvirt.vif [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:02Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.076 186548 DEBUG nova.network.os_vif_util [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.077 186548 DEBUG nova.network.os_vif_util [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.079 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <uuid>6d263548-4cc6-463b-b26b-cb43b0d069cd</uuid>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <name>instance-0000000d</name>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersAdminTestJSON-server-174353263</nova:name>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:44:05</nova:creationTime>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:44:05 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:44:05 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:44:05 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:44:05 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:44:05 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:44:05 compute-0 nova_compute[186544]:         <nova:user uuid="7c0fb56fc41e44dfa23a0d45149e78e3">tempest-ServersAdminTestJSON-1843119868-project-member</nova:user>
Nov 22 07:44:05 compute-0 nova_compute[186544]:         <nova:project uuid="9b004cb06df74de2903dae19345fd9c7">tempest-ServersAdminTestJSON-1843119868</nova:project>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:44:05 compute-0 nova_compute[186544]:         <nova:port uuid="cc51d9a0-e170-42eb-b8db-2910ea320cb4">
Nov 22 07:44:05 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <system>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <entry name="serial">6d263548-4cc6-463b-b26b-cb43b0d069cd</entry>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <entry name="uuid">6d263548-4cc6-463b-b26b-cb43b0d069cd</entry>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </system>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <os>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   </os>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <features>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   </features>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:88:5c:3d"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <target dev="tapcc51d9a0-e1"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/console.log" append="off"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <video>
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </video>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:44:05 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:44:05 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:44:05 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:44:05 compute-0 nova_compute[186544]: </domain>
Nov 22 07:44:05 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.079 186548 DEBUG nova.virt.libvirt.vif [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:02Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.080 186548 DEBUG nova.network.os_vif_util [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.080 186548 DEBUG nova.network.os_vif_util [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.080 186548 DEBUG os_vif [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.081 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.081 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.082 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.085 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.086 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc51d9a0-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.087 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc51d9a0-e1, col_values=(('external_ids', {'iface-id': 'cc51d9a0-e170-42eb-b8db-2910ea320cb4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:5c:3d', 'vm-uuid': '6d263548-4cc6-463b-b26b-cb43b0d069cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:05 compute-0 NetworkManager[55036]: <info>  [1763797445.0891] manager: (tapcc51d9a0-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.090 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.094 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.095 186548 INFO os_vif [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1')
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.148 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.148 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.149 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No VIF found with MAC fa:16:3e:88:5c:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.149 186548 INFO nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Using config drive
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.162 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.229 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'keypairs' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.518 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Check if temp file /var/lib/nova/instances/tmppimoasud exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.519 186548 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppimoasud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.602 186548 INFO nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating config drive at /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.607 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpducvflcu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:05 compute-0 sshd-session[215349]: Accepted publickey for nova from 192.168.122.102 port 56622 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:44:05 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 07:44:05 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 07:44:05 compute-0 systemd-logind[821]: New session 29 of user nova.
Nov 22 07:44:05 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 07:44:05 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 07:44:05 compute-0 systemd[215356]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.732 186548 DEBUG oslo_concurrency.processutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpducvflcu" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:05 compute-0 kernel: tapcc51d9a0-e1: entered promiscuous mode
Nov 22 07:44:05 compute-0 NetworkManager[55036]: <info>  [1763797445.8037] manager: (tapcc51d9a0-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Nov 22 07:44:05 compute-0 systemd[215356]: Queued start job for default target Main User Target.
Nov 22 07:44:05 compute-0 systemd-udevd[215380]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:44:05 compute-0 ovn_controller[94843]: 2025-11-22T07:44:05Z|00056|binding|INFO|Claiming lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 for this chassis.
Nov 22 07:44:05 compute-0 ovn_controller[94843]: 2025-11-22T07:44:05Z|00057|binding|INFO|cc51d9a0-e170-42eb-b8db-2910ea320cb4: Claiming fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.845 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 systemd[215356]: Created slice User Application Slice.
Nov 22 07:44:05 compute-0 systemd[215356]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:44:05 compute-0 systemd[215356]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 07:44:05 compute-0 systemd[215356]: Reached target Paths.
Nov 22 07:44:05 compute-0 systemd[215356]: Reached target Timers.
Nov 22 07:44:05 compute-0 systemd[215356]: Starting D-Bus User Message Bus Socket...
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.854 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:5c:3d 10.100.0.13'], port_security=['fa:16:3e:88:5c:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc51d9a0-e170-42eb-b8db-2910ea320cb4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.855 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc51d9a0-e170-42eb-b8db-2910ea320cb4 in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 bound to our chassis
Nov 22 07:44:05 compute-0 systemd[215356]: Starting Create User's Volatile Files and Directories...
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.857 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:44:05 compute-0 ovn_controller[94843]: 2025-11-22T07:44:05Z|00058|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 ovn-installed in OVS
Nov 22 07:44:05 compute-0 ovn_controller[94843]: 2025-11-22T07:44:05Z|00059|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 up in Southbound
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.864 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 systemd[215356]: Finished Create User's Volatile Files and Directories.
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.870 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 NetworkManager[55036]: <info>  [1763797445.8721] device (tapcc51d9a0-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:44:05 compute-0 NetworkManager[55036]: <info>  [1763797445.8735] device (tapcc51d9a0-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:44:05 compute-0 systemd[215356]: Listening on D-Bus User Message Bus Socket.
Nov 22 07:44:05 compute-0 systemd[215356]: Reached target Sockets.
Nov 22 07:44:05 compute-0 systemd[215356]: Reached target Basic System.
Nov 22 07:44:05 compute-0 systemd[215356]: Reached target Main User Target.
Nov 22 07:44:05 compute-0 systemd[215356]: Startup finished in 183ms.
Nov 22 07:44:05 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.875 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f13b00-a10d-41db-8674-4b850ea4c695]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:05 compute-0 systemd[1]: Started Session 29 of User nova.
Nov 22 07:44:05 compute-0 systemd-machined[152872]: New machine qemu-8-instance-0000000d.
Nov 22 07:44:05 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000d.
Nov 22 07:44:05 compute-0 sshd-session[215349]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.902 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[77b92fbf-c284-4123-8e06-93ec6d66e6e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.905 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[99244eb8-bff9-4f72-83df-52626a2619e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.929 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[86c9b24b-a21b-4293-b343-6866afb02331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.950 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e410ccc0-56d8-4f68-85ce-28616b9a6e16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413549, 'reachable_time': 24347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215399, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.966 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b69b2f0d-95f3-46ea-8e25-cd750f0c9195]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413560, 'tstamp': 413560}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215402, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413562, 'tstamp': 413562}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215402, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.968 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.969 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 nova_compute[186544]: 2025-11-22 07:44:05.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.971 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.971 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.972 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:05.972 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:06 compute-0 sshd-session[215391]: Received disconnect from 192.168.122.102 port 56622:11: disconnected by user
Nov 22 07:44:06 compute-0 sshd-session[215391]: Disconnected from user nova 192.168.122.102 port 56622
Nov 22 07:44:06 compute-0 sshd-session[215349]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:44:06 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Nov 22 07:44:06 compute-0 systemd-logind[821]: Session 29 logged out. Waiting for processes to exit.
Nov 22 07:44:06 compute-0 systemd-logind[821]: Removed session 29.
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.432 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 6d263548-4cc6-463b-b26b-cb43b0d069cd due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.433 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797446.4323146, 6d263548-4cc6-463b-b26b-cb43b0d069cd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.433 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] VM Resumed (Lifecycle Event)
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.435 186548 DEBUG nova.compute.manager [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.435 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.441 186548 INFO nova.virt.libvirt.driver [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance spawned successfully.
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.441 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.444 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.463 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.467 186548 DEBUG nova.compute.manager [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.468 186548 DEBUG oslo_concurrency.lockutils [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.468 186548 DEBUG oslo_concurrency.lockutils [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.468 186548 DEBUG oslo_concurrency.lockutils [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.468 186548 DEBUG nova.compute.manager [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.468 186548 WARNING nova.compute.manager [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state error and task_state rebuild_spawning.
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.469 186548 DEBUG nova.compute.manager [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.469 186548 DEBUG oslo_concurrency.lockutils [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.469 186548 DEBUG oslo_concurrency.lockutils [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.469 186548 DEBUG oslo_concurrency.lockutils [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.469 186548 DEBUG nova.compute.manager [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.469 186548 WARNING nova.compute.manager [req-dd12014a-0126-4799-b595-e01e7ff84637 req-a61a6f64-0591-48b7-9248-40a8ab2a522e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state error and task_state rebuild_spawning.
Nov 22 07:44:06 compute-0 sshd-session[215409]: Accepted publickey for nova from 192.168.122.102 port 56626 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.472 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:06 compute-0 systemd-logind[821]: New session 31 of user nova.
Nov 22 07:44:06 compute-0 systemd[1]: Started Session 31 of User nova.
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.484 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.485 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.485 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.486 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.486 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.486 186548 DEBUG nova.virt.libvirt.driver [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:06 compute-0 sshd-session[215409]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.492 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.492 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797446.4323924, 6d263548-4cc6-463b-b26b-cb43b0d069cd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.493 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] VM Started (Lifecycle Event)
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.502 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.503 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.537 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.542 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:06 compute-0 sshd-session[215420]: Received disconnect from 192.168.122.102 port 56626:11: disconnected by user
Nov 22 07:44:06 compute-0 sshd-session[215420]: Disconnected from user nova 192.168.122.102 port 56626
Nov 22 07:44:06 compute-0 sshd-session[215409]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:44:06 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Nov 22 07:44:06 compute-0 systemd-logind[821]: Session 31 logged out. Waiting for processes to exit.
Nov 22 07:44:06 compute-0 systemd-logind[821]: Removed session 31.
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.572 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.577 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:06 compute-0 podman[215413]: 2025-11-22 07:44:06.583522443 +0000 UTC m=+0.100116796 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.585 186548 DEBUG nova.compute.manager [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:06 compute-0 sshd-session[215445]: Accepted publickey for nova from 192.168.122.102 port 56642 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.680 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.681 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.681 186548 DEBUG nova.objects.instance [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:44:06 compute-0 systemd-logind[821]: New session 32 of user nova.
Nov 22 07:44:06 compute-0 systemd[1]: Started Session 32 of User nova.
Nov 22 07:44:06 compute-0 sshd-session[215445]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:44:06 compute-0 nova_compute[186544]: 2025-11-22 07:44:06.751 186548 DEBUG oslo_concurrency.lockutils [None req-747ab9dd-e1eb-46f8-a5d8-b719a24d61e5 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:06 compute-0 sshd-session[215448]: Received disconnect from 192.168.122.102 port 56642:11: disconnected by user
Nov 22 07:44:06 compute-0 sshd-session[215448]: Disconnected from user nova 192.168.122.102 port 56642
Nov 22 07:44:06 compute-0 sshd-session[215445]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:44:06 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Nov 22 07:44:06 compute-0 systemd-logind[821]: Session 32 logged out. Waiting for processes to exit.
Nov 22 07:44:06 compute-0 systemd-logind[821]: Removed session 32.
Nov 22 07:44:08 compute-0 nova_compute[186544]: 2025-11-22 07:44:08.087 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:44:08 compute-0 nova_compute[186544]: 2025-11-22 07:44:08.087 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquired lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:44:08 compute-0 nova_compute[186544]: 2025-11-22 07:44:08.087 186548 DEBUG nova.network.neutron [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:44:09 compute-0 nova_compute[186544]: 2025-11-22 07:44:09.086 186548 DEBUG nova.network.neutron [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:44:09 compute-0 nova_compute[186544]: 2025-11-22 07:44:09.966 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.218 186548 DEBUG nova.network.neutron [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.233 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Releasing lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.327 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.329 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.329 186548 INFO nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Creating image(s)
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.330 186548 DEBUG nova.objects.instance [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.352 186548 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.411 186548 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.412 186548 DEBUG nova.virt.disk.api [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Checking if we can resize image /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.412 186548 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.470 186548 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.471 186548 DEBUG nova.virt.disk.api [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Cannot resize image /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.493 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.493 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Ensure instance console log exists: /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.494 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.494 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.495 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.496 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.501 186548 WARNING nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.506 186548 DEBUG nova.virt.libvirt.host [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.507 186548 DEBUG nova.virt.libvirt.host [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.510 186548 DEBUG nova.virt.libvirt.host [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.510 186548 DEBUG nova.virt.libvirt.host [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.511 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.512 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.512 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.512 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.513 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.513 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.513 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.513 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.514 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.514 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.514 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.514 186548 DEBUG nova.virt.hardware [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.515 186548 DEBUG nova.objects.instance [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.533 186548 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.587 186548 DEBUG oslo_concurrency.processutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.588 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Acquiring lock "/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.588 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.589 186548 DEBUG oslo_concurrency.lockutils [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] Lock "/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.591 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <uuid>5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67</uuid>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <name>instance-00000011</name>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <nova:name>tempest-MigrationsAdminTest-server-370989325</nova:name>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:44:10</nova:creationTime>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:44:10 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:44:10 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:44:10 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:44:10 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:44:10 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:44:10 compute-0 nova_compute[186544]:         <nova:user uuid="5ea417ea62e2404d8cb5b9e767e8c5c4">tempest-MigrationsAdminTest-573005991-project-member</nova:user>
Nov 22 07:44:10 compute-0 nova_compute[186544]:         <nova:project uuid="070aaece3c3c4232877d26c34023c56d">tempest-MigrationsAdminTest-573005991</nova:project>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <system>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <entry name="serial">5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67</entry>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <entry name="uuid">5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67</entry>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     </system>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <os>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   </os>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <features>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   </features>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.config"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/console.log" append="off"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <video>
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     </video>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:44:10 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:44:10 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:44:10 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:44:10 compute-0 nova_compute[186544]: </domain>
Nov 22 07:44:10 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.639 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.640 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:44:10 compute-0 nova_compute[186544]: 2025-11-22 07:44:10.640 186548 INFO nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Using config drive
Nov 22 07:44:10 compute-0 systemd-machined[152872]: New machine qemu-9-instance-00000011.
Nov 22 07:44:10 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000011.
Nov 22 07:44:11 compute-0 sshd-session[215477]: Accepted publickey for nova from 192.168.122.101 port 58608 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:44:11 compute-0 systemd-logind[821]: New session 33 of user nova.
Nov 22 07:44:11 compute-0 systemd[1]: Started Session 33 of User nova.
Nov 22 07:44:11 compute-0 sshd-session[215477]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:44:11 compute-0 sshd-session[215480]: Received disconnect from 192.168.122.101 port 58608:11: disconnected by user
Nov 22 07:44:11 compute-0 sshd-session[215480]: Disconnected from user nova 192.168.122.101 port 58608
Nov 22 07:44:11 compute-0 sshd-session[215477]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:44:11 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Nov 22 07:44:11 compute-0 systemd-logind[821]: Session 33 logged out. Waiting for processes to exit.
Nov 22 07:44:11 compute-0 systemd-logind[821]: Removed session 33.
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.624 186548 INFO nova.compute.manager [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Rebuilding instance
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.723 186548 DEBUG nova.compute.manager [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.724 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797451.7241943, 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.724 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] VM Resumed (Lifecycle Event)
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.730 186548 INFO nova.virt.libvirt.driver [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance running successfully.
Nov 22 07:44:11 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.734 186548 DEBUG nova.virt.libvirt.guest [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.734 186548 DEBUG nova.virt.libvirt.driver [None req-497dd65d-d468-4708-bfb8-594341c8ebb1 835591a1fec64877916027bfadff73ca 0e5e5597929b40e390b5295be902c8fb - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.739 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.742 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:11 compute-0 podman[215484]: 2025-11-22 07:44:11.750634223 +0000 UTC m=+0.095487891 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.776 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.777 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797451.7242959, 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.777 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] VM Started (Lifecycle Event)
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.802 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.805 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.864 186548 DEBUG nova.compute.manager [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.934 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.944 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.953 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'resources' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.963 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.976 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:44:11 compute-0 nova_compute[186544]: 2025-11-22 07:44:11.979 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:44:13 compute-0 nova_compute[186544]: 2025-11-22 07:44:13.571 186548 DEBUG nova.compute.manager [req-7616150a-21fa-416f-9a55-bc68b1ceb702 req-5c12d425-4012-4dc2-93b5-c1ba20d7bbeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:13 compute-0 nova_compute[186544]: 2025-11-22 07:44:13.571 186548 DEBUG oslo_concurrency.lockutils [req-7616150a-21fa-416f-9a55-bc68b1ceb702 req-5c12d425-4012-4dc2-93b5-c1ba20d7bbeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:13 compute-0 nova_compute[186544]: 2025-11-22 07:44:13.572 186548 DEBUG oslo_concurrency.lockutils [req-7616150a-21fa-416f-9a55-bc68b1ceb702 req-5c12d425-4012-4dc2-93b5-c1ba20d7bbeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:13 compute-0 nova_compute[186544]: 2025-11-22 07:44:13.572 186548 DEBUG oslo_concurrency.lockutils [req-7616150a-21fa-416f-9a55-bc68b1ceb702 req-5c12d425-4012-4dc2-93b5-c1ba20d7bbeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:13 compute-0 nova_compute[186544]: 2025-11-22 07:44:13.572 186548 DEBUG nova.compute.manager [req-7616150a-21fa-416f-9a55-bc68b1ceb702 req-5c12d425-4012-4dc2-93b5-c1ba20d7bbeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:13 compute-0 nova_compute[186544]: 2025-11-22 07:44:13.572 186548 DEBUG nova.compute.manager [req-7616150a-21fa-416f-9a55-bc68b1ceb702 req-5c12d425-4012-4dc2-93b5-c1ba20d7bbeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.315 186548 INFO nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Took 7.74 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.315 186548 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.337 186548 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppimoasud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='144e6cca-5b79-4b25-9456-a59f6895075b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(fd9d8912-b73b-402e-8b65-444a6c007a96),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.354 186548 DEBUG nova.objects.instance [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'migration_context' on Instance uuid 144e6cca-5b79-4b25-9456-a59f6895075b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.355 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.356 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.356 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.367 186548 DEBUG nova.virt.libvirt.vif [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:44:03Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.367 186548 DEBUG nova.network.os_vif_util [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.368 186548 DEBUG nova.network.os_vif_util [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.369 186548 DEBUG nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating guest XML with vif config: <interface type="ethernet">
Nov 22 07:44:14 compute-0 nova_compute[186544]:   <mac address="fa:16:3e:4f:30:6c"/>
Nov 22 07:44:14 compute-0 nova_compute[186544]:   <model type="virtio"/>
Nov 22 07:44:14 compute-0 nova_compute[186544]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:44:14 compute-0 nova_compute[186544]:   <mtu size="1442"/>
Nov 22 07:44:14 compute-0 nova_compute[186544]:   <target dev="tap66ab05b0-44"/>
Nov 22 07:44:14 compute-0 nova_compute[186544]: </interface>
Nov 22 07:44:14 compute-0 nova_compute[186544]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.370 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.859 186548 DEBUG nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.859 186548 INFO nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.946 186548 INFO nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 22 07:44:14 compute-0 nova_compute[186544]: 2025-11-22 07:44:14.965 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.089 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.449 186548 DEBUG nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.449 186548 DEBUG nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.700 186548 DEBUG nova.compute.manager [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.700 186548 DEBUG oslo_concurrency.lockutils [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.701 186548 DEBUG oslo_concurrency.lockutils [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.701 186548 DEBUG oslo_concurrency.lockutils [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.701 186548 DEBUG nova.compute.manager [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.702 186548 WARNING nova.compute.manager [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.702 186548 DEBUG nova.compute.manager [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-changed-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.702 186548 DEBUG nova.compute.manager [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Refreshing instance network info cache due to event network-changed-66ab05b0-442e-4420-82b9-0fc90a3df63b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.703 186548 DEBUG oslo_concurrency.lockutils [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.703 186548 DEBUG oslo_concurrency.lockutils [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.703 186548 DEBUG nova.network.neutron [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Refreshing network info cache for port 66ab05b0-442e-4420-82b9-0fc90a3df63b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.997 186548 DEBUG nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:44:15 compute-0 nova_compute[186544]: 2025-11-22 07:44:15.998 186548 DEBUG nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.502 186548 DEBUG nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.502 186548 DEBUG nova.virt.libvirt.migration [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.646 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797456.6463025, 144e6cca-5b79-4b25-9456-a59f6895075b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.647 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Paused (Lifecycle Event)
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.678 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.682 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.709 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 22 07:44:16 compute-0 kernel: tap66ab05b0-44 (unregistering): left promiscuous mode
Nov 22 07:44:16 compute-0 NetworkManager[55036]: <info>  [1763797456.8169] device (tap66ab05b0-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.827 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:16 compute-0 ovn_controller[94843]: 2025-11-22T07:44:16Z|00060|binding|INFO|Releasing lport 66ab05b0-442e-4420-82b9-0fc90a3df63b from this chassis (sb_readonly=0)
Nov 22 07:44:16 compute-0 ovn_controller[94843]: 2025-11-22T07:44:16Z|00061|binding|INFO|Setting lport 66ab05b0-442e-4420-82b9-0fc90a3df63b down in Southbound
Nov 22 07:44:16 compute-0 ovn_controller[94843]: 2025-11-22T07:44:16Z|00062|binding|INFO|Removing iface tap66ab05b0-44 ovn-installed in OVS
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.842 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:16 compute-0 nova_compute[186544]: 2025-11-22 07:44:16.843 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:16.857 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:30:6c 10.100.0.8'], port_security=['fa:16:3e:4f:30:6c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '73ab1342-b2af-4236-8199-7d435ebce194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '144e6cca-5b79-4b25-9456-a59f6895075b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '18', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=66ab05b0-442e-4420-82b9-0fc90a3df63b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:16.859 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 66ab05b0-442e-4420-82b9-0fc90a3df63b in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 unbound from our chassis
Nov 22 07:44:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:16.860 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:44:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:16.861 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a94fc9a4-59cd-4ee8-be1e-0c844bee5748]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:16.861 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace which is not needed anymore
Nov 22 07:44:16 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 22 07:44:16 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Consumed 3.227s CPU time.
Nov 22 07:44:16 compute-0 systemd-machined[152872]: Machine qemu-7-instance-00000010 terminated.
Nov 22 07:44:16 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215247]: [NOTICE]   (215251) : haproxy version is 2.8.14-c23fe91
Nov 22 07:44:16 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215247]: [NOTICE]   (215251) : path to executable is /usr/sbin/haproxy
Nov 22 07:44:16 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215247]: [WARNING]  (215251) : Exiting Master process...
Nov 22 07:44:16 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215247]: [ALERT]    (215251) : Current worker (215253) exited with code 143 (Terminated)
Nov 22 07:44:16 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[215247]: [WARNING]  (215251) : All workers exited. Exiting... (0)
Nov 22 07:44:16 compute-0 systemd[1]: libpod-331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec.scope: Deactivated successfully.
Nov 22 07:44:16 compute-0 podman[215539]: 2025-11-22 07:44:16.99476041 +0000 UTC m=+0.047409057 container died 331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.011 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.021 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec-userdata-shm.mount: Deactivated successfully.
Nov 22 07:44:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-47172935216ac24d1bac506836770cc30351f0eed51908ac6958a9337a6c1118-merged.mount: Deactivated successfully.
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.044 186548 DEBUG nova.virt.libvirt.guest [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.045 186548 INFO nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration operation has completed
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.046 186548 INFO nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] _post_live_migration() is started..
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.048 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.049 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.049 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 22 07:44:17 compute-0 podman[215539]: 2025-11-22 07:44:17.058444238 +0000 UTC m=+0.111092845 container cleanup 331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:44:17 compute-0 systemd[1]: libpod-conmon-331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec.scope: Deactivated successfully.
Nov 22 07:44:17 compute-0 podman[215581]: 2025-11-22 07:44:17.140407475 +0000 UTC m=+0.057160577 container remove 331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.146 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d84a86-6075-45d4-977b-a37e929e48ce]: (4, ('Sat Nov 22 07:44:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec)\n331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec\nSat Nov 22 07:44:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec)\n331474afb06f5f97660753b897ed5557539a0dca1611a740a96ebd00a8d463ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.148 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[93c0ba38-b8ca-4c3f-8fd3-6c5c2f3138ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.148 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:17 compute-0 kernel: tapcd5fa4f6-00: left promiscuous mode
Nov 22 07:44:17 compute-0 nova_compute[186544]: 2025-11-22 07:44:17.171 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.173 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[48e76545-f879-4941-a18a-466c98d71ab5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.187 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[86927d8e-d968-44a3-a0c9-03540fc4ce78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.188 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ca77f436-b1a2-48f7-bc31-b87cb4a2d345]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.202 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d6d8ba-1e3f-4c85-b12f-cbe7e755c23c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417962, 'reachable_time': 40524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215598, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:17 compute-0 systemd[1]: run-netns-ovnmeta\x2dcd5fa4f6\x2d0f1b\x2d41f2\x2d9643\x2d3c1a36620dc9.mount: Deactivated successfully.
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.207 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:44:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:17.207 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[50eb4f25-63f3-4fc2-b46c-edfe13d621a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.193 186548 DEBUG nova.network.neutron [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updated VIF entry in instance network info cache for port 66ab05b0-442e-4420-82b9-0fc90a3df63b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.194 186548 DEBUG nova.network.neutron [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Updating instance_info_cache with network_info: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.221 186548 DEBUG oslo_concurrency.lockutils [req-d440a589-514e-4c51-8421-faf99ee7afb6 req-679aa81c-e69d-4f01-ac21-ec1aa53ebe59 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-144e6cca-5b79-4b25-9456-a59f6895075b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.314 186548 DEBUG nova.compute.manager [req-03223a67-c79e-4d47-9446-9932d29ae619 req-d65815db-9e08-4dd3-8fa1-6d5573501947 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.314 186548 DEBUG oslo_concurrency.lockutils [req-03223a67-c79e-4d47-9446-9932d29ae619 req-d65815db-9e08-4dd3-8fa1-6d5573501947 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.314 186548 DEBUG oslo_concurrency.lockutils [req-03223a67-c79e-4d47-9446-9932d29ae619 req-d65815db-9e08-4dd3-8fa1-6d5573501947 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.315 186548 DEBUG oslo_concurrency.lockutils [req-03223a67-c79e-4d47-9446-9932d29ae619 req-d65815db-9e08-4dd3-8fa1-6d5573501947 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.315 186548 DEBUG nova.compute.manager [req-03223a67-c79e-4d47-9446-9932d29ae619 req-d65815db-9e08-4dd3-8fa1-6d5573501947 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.315 186548 DEBUG nova.compute.manager [req-03223a67-c79e-4d47-9446-9932d29ae619 req-d65815db-9e08-4dd3-8fa1-6d5573501947 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-unplugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.564 186548 DEBUG nova.network.neutron [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Activated binding for port 66ab05b0-442e-4420-82b9-0fc90a3df63b and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.564 186548 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.565 186548 DEBUG nova.virt.libvirt.vif [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1027576693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1027576693',id=16,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-u8vxgo1r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:44:04Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=144e6cca-5b79-4b25-9456-a59f6895075b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.566 186548 DEBUG nova.network.os_vif_util [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "address": "fa:16:3e:4f:30:6c", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66ab05b0-44", "ovs_interfaceid": "66ab05b0-442e-4420-82b9-0fc90a3df63b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.566 186548 DEBUG nova.network.os_vif_util [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.567 186548 DEBUG os_vif [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.568 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.569 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66ab05b0-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.574 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.576 186548 INFO os_vif [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:30:6c,bridge_name='br-int',has_traffic_filtering=True,id=66ab05b0-442e-4420-82b9-0fc90a3df63b,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66ab05b0-44')
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.577 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.577 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.577 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.577 186548 DEBUG nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.578 186548 INFO nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Deleting instance files /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b_del
Nov 22 07:44:18 compute-0 nova_compute[186544]: 2025-11-22 07:44:18.578 186548 INFO nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Deletion of /var/lib/nova/instances/144e6cca-5b79-4b25-9456-a59f6895075b_del complete
Nov 22 07:44:18 compute-0 ovn_controller[94843]: 2025-11-22T07:44:18Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:44:18 compute-0 ovn_controller[94843]: 2025-11-22T07:44:18Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:44:19 compute-0 nova_compute[186544]: 2025-11-22 07:44:19.968 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.411 186548 DEBUG nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.411 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.412 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.412 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.412 186548 DEBUG nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.412 186548 WARNING nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.413 186548 DEBUG nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.413 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.413 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.413 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.413 186548 DEBUG nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.414 186548 WARNING nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.414 186548 DEBUG nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.414 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.414 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.414 186548 DEBUG oslo_concurrency.lockutils [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.415 186548 DEBUG nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] No waiting events found dispatching network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:20 compute-0 nova_compute[186544]: 2025-11-22 07:44:20.415 186548 WARNING nova.compute.manager [req-ab361cb8-2ab8-4ef1-841b-10a78606c342 req-cd258bbc-95c1-4cc2-b196-86f846e8811f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Received unexpected event network-vif-plugged-66ab05b0-442e-4420-82b9-0fc90a3df63b for instance with vm_state active and task_state migrating.
Nov 22 07:44:21 compute-0 podman[215618]: 2025-11-22 07:44:21.413232075 +0000 UTC m=+0.060731546 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 22 07:44:21 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 07:44:21 compute-0 systemd[215356]: Activating special unit Exit the Session...
Nov 22 07:44:21 compute-0 systemd[215356]: Stopped target Main User Target.
Nov 22 07:44:21 compute-0 systemd[215356]: Stopped target Basic System.
Nov 22 07:44:21 compute-0 systemd[215356]: Stopped target Paths.
Nov 22 07:44:21 compute-0 systemd[215356]: Stopped target Sockets.
Nov 22 07:44:21 compute-0 systemd[215356]: Stopped target Timers.
Nov 22 07:44:21 compute-0 systemd[215356]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:44:21 compute-0 systemd[215356]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 07:44:21 compute-0 systemd[215356]: Closed D-Bus User Message Bus Socket.
Nov 22 07:44:21 compute-0 systemd[215356]: Stopped Create User's Volatile Files and Directories.
Nov 22 07:44:21 compute-0 systemd[215356]: Removed slice User Application Slice.
Nov 22 07:44:21 compute-0 systemd[215356]: Reached target Shutdown.
Nov 22 07:44:21 compute-0 systemd[215356]: Finished Exit the Session.
Nov 22 07:44:21 compute-0 systemd[215356]: Reached target Exit the Session.
Nov 22 07:44:21 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 07:44:21 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 07:44:21 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 07:44:21 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 07:44:21 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 07:44:21 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 07:44:21 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 07:44:21 compute-0 podman[215619]: 2025-11-22 07:44:21.480106351 +0000 UTC m=+0.122338922 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:44:22 compute-0 nova_compute[186544]: 2025-11-22 07:44:22.030 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 07:44:23 compute-0 nova_compute[186544]: 2025-11-22 07:44:23.570 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:24 compute-0 kernel: tapcc51d9a0-e1 (unregistering): left promiscuous mode
Nov 22 07:44:24 compute-0 NetworkManager[55036]: <info>  [1763797464.2112] device (tapcc51d9a0-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.214 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.214 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.214 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "144e6cca-5b79-4b25-9456-a59f6895075b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:24 compute-0 ovn_controller[94843]: 2025-11-22T07:44:24Z|00063|binding|INFO|Releasing lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 from this chassis (sb_readonly=0)
Nov 22 07:44:24 compute-0 ovn_controller[94843]: 2025-11-22T07:44:24Z|00064|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 down in Southbound
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.219 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:24 compute-0 ovn_controller[94843]: 2025-11-22T07:44:24Z|00065|binding|INFO|Removing iface tapcc51d9a0-e1 ovn-installed in OVS
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.221 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.228 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:5c:3d 10.100.0.13'], port_security=['fa:16:3e:88:5c:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc51d9a0-e170-42eb-b8db-2910ea320cb4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.229 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc51d9a0-e170-42eb-b8db-2910ea320cb4 in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 unbound from our chassis
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.231 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.233 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.245 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[94ccf5c2-2d55-43a2-a45d-02c95fcce03b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.250 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.250 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.251 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.251 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:44:24 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 22 07:44:24 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Consumed 14.050s CPU time.
Nov 22 07:44:24 compute-0 systemd-machined[152872]: Machine qemu-8-instance-0000000d terminated.
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.270 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e5fd98-dc12-4153-8f2d-a86bd14cc40e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.273 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[359c0700-085b-4462-9971-05b2a57c2138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.297 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9dbc2623-9f15-43d7-a957-b5cb9f09a126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.312 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[60c6b4de-8d13-4327-bf7a-edf46503a724]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413549, 'reachable_time': 24347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215682, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.328 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a6bf534c-dcbb-4a75-acd6-7f0e5805fcd5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413560, 'tstamp': 413560}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215683, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413562, 'tstamp': 413562}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215683, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.329 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.331 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.335 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.335 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.336 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.336 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:24.337 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.488 186548 DEBUG nova.compute.manager [req-f1fa909c-0525-4d45-8750-be00bf5318e7 req-e4f9a40f-fb68-443c-9bd1-144f0bc15a46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.489 186548 DEBUG oslo_concurrency.lockutils [req-f1fa909c-0525-4d45-8750-be00bf5318e7 req-e4f9a40f-fb68-443c-9bd1-144f0bc15a46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.489 186548 DEBUG oslo_concurrency.lockutils [req-f1fa909c-0525-4d45-8750-be00bf5318e7 req-e4f9a40f-fb68-443c-9bd1-144f0bc15a46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.489 186548 DEBUG oslo_concurrency.lockutils [req-f1fa909c-0525-4d45-8750-be00bf5318e7 req-e4f9a40f-fb68-443c-9bd1-144f0bc15a46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.489 186548 DEBUG nova.compute.manager [req-f1fa909c-0525-4d45-8750-be00bf5318e7 req-e4f9a40f-fb68-443c-9bd1-144f0bc15a46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.490 186548 WARNING nova.compute.manager [req-f1fa909c-0525-4d45-8750-be00bf5318e7 req-e4f9a40f-fb68-443c-9bd1-144f0bc15a46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state active and task_state rebuilding.
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.562 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.627 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.629 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.683 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.689 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.754 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.756 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.812 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.817 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.871 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.872 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.928 186548 DEBUG oslo_concurrency.processutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:24 compute-0 nova_compute[186544]: 2025-11-22 07:44:24.969 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.043 186548 INFO nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance shutdown successfully after 13 seconds.
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.048 186548 INFO nova.virt.libvirt.driver [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance destroyed successfully.
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.053 186548 INFO nova.virt.libvirt.driver [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance destroyed successfully.
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.054 186548 DEBUG nova.virt.libvirt.vif [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:44:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member
'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:11Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.054 186548 DEBUG nova.network.os_vif_util [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.055 186548 DEBUG nova.network.os_vif_util [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.056 186548 DEBUG os_vif [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.058 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.058 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc51d9a0-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.059 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.061 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.063 186548 INFO os_vif [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1')
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.064 186548 INFO nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Deleting instance files /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd_del
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.065 186548 INFO nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Deletion of /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd_del complete
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.120 186548 WARNING nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.121 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5346MB free_disk=73.33867645263672GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.121 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.122 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.191 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Migration for instance 144e6cca-5b79-4b25-9456-a59f6895075b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.211 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.247 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Instance 6d263548-4cc6-463b-b26b-cb43b0d069cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.248 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Instance 36d5f234-9baf-48b6-a565-430378fe4068 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.248 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Instance 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.248 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Migration fd9d8912-b73b-402e-8b65-444a6c007a96 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.249 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.249 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.303 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.304 186548 INFO nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating image(s)
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.304 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.305 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.305 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.320 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.343 186548 DEBUG nova.compute.provider_tree [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.358 186548 DEBUG nova.scheduler.client.report [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.379 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.380 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.381 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.392 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.409 186548 DEBUG nova.compute.resource_tracker [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.410 186548 DEBUG oslo_concurrency.lockutils [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.422 186548 INFO nova.compute.manager [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.445 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.446 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.478 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.479 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.480 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.531 186548 INFO nova.scheduler.client.report [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Deleted allocation for migration fd9d8912-b73b-402e-8b65-444a6c007a96
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.532 186548 DEBUG nova.virt.libvirt.driver [None req-c0d6f7ce-324e-4aa9-befa-b322767c0eb2 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.537 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.538 186548 DEBUG nova.virt.disk.api [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Checking if we can resize image /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.539 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.600 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.602 186548 DEBUG nova.virt.disk.api [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Cannot resize image /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.602 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.603 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Ensure instance console log exists: /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.604 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.604 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.604 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.607 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Start _get_guest_xml network_info=[{"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.613 186548 WARNING nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.625 186548 DEBUG nova.virt.libvirt.host [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.626 186548 DEBUG nova.virt.libvirt.host [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.630 186548 DEBUG nova.virt.libvirt.host [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.630 186548 DEBUG nova.virt.libvirt.host [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.632 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.632 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.632 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.632 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.633 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.633 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.633 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.633 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.633 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.634 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.634 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.634 186548 DEBUG nova.virt.hardware [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.634 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.663 186548 DEBUG nova.virt.libvirt.vif [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:44:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:25Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.663 186548 DEBUG nova.network.os_vif_util [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.664 186548 DEBUG nova.network.os_vif_util [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.666 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <uuid>6d263548-4cc6-463b-b26b-cb43b0d069cd</uuid>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <name>instance-0000000d</name>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersAdminTestJSON-server-174353263</nova:name>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:44:25</nova:creationTime>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:44:25 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:44:25 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:44:25 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:44:25 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:44:25 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:44:25 compute-0 nova_compute[186544]:         <nova:user uuid="7c0fb56fc41e44dfa23a0d45149e78e3">tempest-ServersAdminTestJSON-1843119868-project-member</nova:user>
Nov 22 07:44:25 compute-0 nova_compute[186544]:         <nova:project uuid="9b004cb06df74de2903dae19345fd9c7">tempest-ServersAdminTestJSON-1843119868</nova:project>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:44:25 compute-0 nova_compute[186544]:         <nova:port uuid="cc51d9a0-e170-42eb-b8db-2910ea320cb4">
Nov 22 07:44:25 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <system>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <entry name="serial">6d263548-4cc6-463b-b26b-cb43b0d069cd</entry>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <entry name="uuid">6d263548-4cc6-463b-b26b-cb43b0d069cd</entry>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </system>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <os>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   </os>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <features>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   </features>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:88:5c:3d"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <target dev="tapcc51d9a0-e1"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/console.log" append="off"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <video>
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </video>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:44:25 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:44:25 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:44:25 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:44:25 compute-0 nova_compute[186544]: </domain>
Nov 22 07:44:25 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.668 186548 DEBUG nova.virt.libvirt.vif [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:44:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:25Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.668 186548 DEBUG nova.network.os_vif_util [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.669 186548 DEBUG nova.network.os_vif_util [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.670 186548 DEBUG os_vif [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.670 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.671 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.671 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.676 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.676 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc51d9a0-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.677 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc51d9a0-e1, col_values=(('external_ids', {'iface-id': 'cc51d9a0-e170-42eb-b8db-2910ea320cb4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:5c:3d', 'vm-uuid': '6d263548-4cc6-463b-b26b-cb43b0d069cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:25 compute-0 NetworkManager[55036]: <info>  [1763797465.6803] manager: (tapcc51d9a0-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.682 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.684 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.686 186548 INFO os_vif [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1')
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.728 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.729 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.729 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] No VIF found with MAC fa:16:3e:88:5c:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.729 186548 INFO nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Using config drive
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.742 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:25 compute-0 nova_compute[186544]: 2025-11-22 07:44:25.773 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'keypairs' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.284 186548 INFO nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Creating config drive at /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.290 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaqbzru9s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.415 186548 DEBUG oslo_concurrency.processutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaqbzru9s" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:26 compute-0 kernel: tapcc51d9a0-e1: entered promiscuous mode
Nov 22 07:44:26 compute-0 NetworkManager[55036]: <info>  [1763797466.4812] manager: (tapcc51d9a0-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Nov 22 07:44:26 compute-0 systemd-udevd[215675]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:44:26 compute-0 ovn_controller[94843]: 2025-11-22T07:44:26Z|00066|binding|INFO|Claiming lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 for this chassis.
Nov 22 07:44:26 compute-0 ovn_controller[94843]: 2025-11-22T07:44:26Z|00067|binding|INFO|cc51d9a0-e170-42eb-b8db-2910ea320cb4: Claiming fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.482 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:26 compute-0 NetworkManager[55036]: <info>  [1763797466.4946] device (tapcc51d9a0-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:44:26 compute-0 NetworkManager[55036]: <info>  [1763797466.4954] device (tapcc51d9a0-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:44:26 compute-0 ovn_controller[94843]: 2025-11-22T07:44:26Z|00068|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 ovn-installed in OVS
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.497 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.500 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:26 compute-0 ovn_controller[94843]: 2025-11-22T07:44:26Z|00069|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 up in Southbound
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.505 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:5c:3d 10.100.0.13'], port_security=['fa:16:3e:88:5c:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc51d9a0-e170-42eb-b8db-2910ea320cb4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.506 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc51d9a0-e170-42eb-b8db-2910ea320cb4 in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 bound to our chassis
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.508 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:44:26 compute-0 systemd-machined[152872]: New machine qemu-10-instance-0000000d.
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.524 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7e7ab326-bc1e-4c66-9448-767f11e8c8cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:26 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000d.
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.553 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[3f891f66-f1d1-4ced-b6fa-9e0eb066706b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.556 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e9076c-f3a7-4286-a605-b29d85d4a013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:26 compute-0 podman[215747]: 2025-11-22 07:44:26.575218281 +0000 UTC m=+0.095340638 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.581 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[babbcf51-8040-48df-a15b-ba907d5b2dbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.585 186548 DEBUG nova.compute.manager [req-eda8abb0-f29d-414f-a9ec-afea07248a7b req-2fafa22b-7336-4434-90ed-d0a7876bc617 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.585 186548 DEBUG oslo_concurrency.lockutils [req-eda8abb0-f29d-414f-a9ec-afea07248a7b req-2fafa22b-7336-4434-90ed-d0a7876bc617 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.585 186548 DEBUG oslo_concurrency.lockutils [req-eda8abb0-f29d-414f-a9ec-afea07248a7b req-2fafa22b-7336-4434-90ed-d0a7876bc617 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.586 186548 DEBUG oslo_concurrency.lockutils [req-eda8abb0-f29d-414f-a9ec-afea07248a7b req-2fafa22b-7336-4434-90ed-d0a7876bc617 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.586 186548 DEBUG nova.compute.manager [req-eda8abb0-f29d-414f-a9ec-afea07248a7b req-2fafa22b-7336-4434-90ed-d0a7876bc617 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.586 186548 WARNING nova.compute.manager [req-eda8abb0-f29d-414f-a9ec-afea07248a7b req-2fafa22b-7336-4434-90ed-d0a7876bc617 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state active and task_state rebuild_spawning.
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.600 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0acfe9-3c36-4a8d-abc3-d18bb19e4414]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413549, 'reachable_time': 24347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215789, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.615 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e6afbbe2-ce98-4e8e-ae92-7a19f231d594]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413560, 'tstamp': 413560}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215791, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413562, 'tstamp': 413562}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215791, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.616 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.618 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:26 compute-0 nova_compute[186544]: 2025-11-22 07:44:26.620 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.620 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.620 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.621 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:26.621 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.022 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 6d263548-4cc6-463b-b26b-cb43b0d069cd due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.022 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797467.0216024, 6d263548-4cc6-463b-b26b-cb43b0d069cd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.022 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] VM Resumed (Lifecycle Event)
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.025 186548 DEBUG nova.compute.manager [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.026 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.031 186548 INFO nova.virt.libvirt.driver [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance spawned successfully.
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.031 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.045 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.051 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.056 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.056 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.057 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.057 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.057 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.058 186548 DEBUG nova.virt.libvirt.driver [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.065 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.065 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797467.0241475, 6d263548-4cc6-463b-b26b-cb43b0d069cd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.065 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] VM Started (Lifecycle Event)
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.084 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.089 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.114 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.138 186548 DEBUG nova.compute.manager [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.219 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.220 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.220 186548 DEBUG nova.objects.instance [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:44:27 compute-0 nova_compute[186544]: 2025-11-22 07:44:27.299 186548 DEBUG oslo_concurrency.lockutils [None req-5f2d4d5b-b781-4feb-90a4-1aa6bc87ee5c 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.662 186548 DEBUG nova.compute.manager [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.662 186548 DEBUG oslo_concurrency.lockutils [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.663 186548 DEBUG oslo_concurrency.lockutils [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.663 186548 DEBUG oslo_concurrency.lockutils [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.663 186548 DEBUG nova.compute.manager [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.663 186548 WARNING nova.compute.manager [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state active and task_state None.
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.663 186548 DEBUG nova.compute.manager [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.663 186548 DEBUG oslo_concurrency.lockutils [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.664 186548 DEBUG oslo_concurrency.lockutils [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.664 186548 DEBUG oslo_concurrency.lockutils [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.664 186548 DEBUG nova.compute.manager [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:28 compute-0 nova_compute[186544]: 2025-11-22 07:44:28.664 186548 WARNING nova.compute.manager [req-a2dbd40b-518a-4ed9-82fc-903709d3e315 req-ae803f8f-6762-42f8-99dd-5acb4d342da5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state active and task_state None.
Nov 22 07:44:29 compute-0 podman[215799]: 2025-11-22 07:44:29.423461156 +0000 UTC m=+0.064294523 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 07:44:29 compute-0 nova_compute[186544]: 2025-11-22 07:44:29.972 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.180 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.180 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.181 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.181 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.255 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.315 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.317 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.371 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.376 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.427 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.428 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.483 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.489 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.544 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.547 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.605 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.803 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.805 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5273MB free_disk=73.36669158935547GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.805 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.805 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.886 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 6d263548-4cc6-463b-b26b-cb43b0d069cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.886 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 36d5f234-9baf-48b6-a565-430378fe4068 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.886 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.886 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.887 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.980 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:44:30 compute-0 nova_compute[186544]: 2025-11-22 07:44:30.999 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:44:31 compute-0 nova_compute[186544]: 2025-11-22 07:44:31.088 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:44:31 compute-0 nova_compute[186544]: 2025-11-22 07:44:31.089 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:32 compute-0 nova_compute[186544]: 2025-11-22 07:44:32.043 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797457.0422406, 144e6cca-5b79-4b25-9456-a59f6895075b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:32 compute-0 nova_compute[186544]: 2025-11-22 07:44:32.044 186548 INFO nova.compute.manager [-] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] VM Stopped (Lifecycle Event)
Nov 22 07:44:32 compute-0 nova_compute[186544]: 2025-11-22 07:44:32.067 186548 DEBUG nova.compute.manager [None req-7e427bb8-a4d6-443b-99ea-82e97055cf73 - - - - - -] [instance: 144e6cca-5b79-4b25-9456-a59f6895075b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:32 compute-0 podman[215838]: 2025-11-22 07:44:32.42398025 +0000 UTC m=+0.060914740 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.030 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.031 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.031 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.032 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.032 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.040 186548 INFO nova.compute.manager [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Terminating instance
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.047 186548 DEBUG nova.compute.manager [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:44:33 compute-0 kernel: tap471dbcab-6d (unregistering): left promiscuous mode
Nov 22 07:44:33 compute-0 NetworkManager[55036]: <info>  [1763797473.0708] device (tap471dbcab-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:44:33 compute-0 ovn_controller[94843]: 2025-11-22T07:44:33Z|00070|binding|INFO|Releasing lport 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 from this chassis (sb_readonly=0)
Nov 22 07:44:33 compute-0 ovn_controller[94843]: 2025-11-22T07:44:33Z|00071|binding|INFO|Setting lport 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 down in Southbound
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.080 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:33 compute-0 ovn_controller[94843]: 2025-11-22T07:44:33Z|00072|binding|INFO|Removing iface tap471dbcab-6d ovn-installed in OVS
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.082 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.089 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.089 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:ea:95 10.100.0.8'], port_security=['fa:16:3e:61:ea:95 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '36d5f234-9baf-48b6-a565-430378fe4068', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=471dbcab-6d2d-430b-9c1d-41d0b1f895a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.090 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 471dbcab-6d2d-430b-9c1d-41d0b1f895a0 in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 unbound from our chassis
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.092 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7ba1c27-6255-4c71-8e98-23a1c59b5723
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.094 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.109 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[76388771-2991-4f5d-a844-d02879d7c90a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:33 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Nov 22 07:44:33 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000f.scope: Consumed 15.473s CPU time.
Nov 22 07:44:33 compute-0 systemd-machined[152872]: Machine qemu-6-instance-0000000f terminated.
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.134 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[278ba1ae-586c-4de2-8749-d12f4f909a06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.137 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5fcd1b89-b5a0-48c3-900b-fd40c19090da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.161 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[63ee5371-7efa-4b65-b577-00fe0fc5d8a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.175 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a8046e19-66bf-4a25-97d7-65bd44042a1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7ba1c27-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:37:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413549, 'reachable_time': 24347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215870, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.188 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2f2e01a3-9456-4a13-a7d2-44d6b2a4d7fc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413560, 'tstamp': 413560}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215871, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7ba1c27-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413562, 'tstamp': 413562}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215871, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.189 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.190 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.194 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.195 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7ba1c27-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.195 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.196 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7ba1c27-60, col_values=(('external_ids', {'iface-id': '3c20001c-28e2-4cdd-9a7c-497ed470b31c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:33.196 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.306 186548 INFO nova.virt.libvirt.driver [-] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Instance destroyed successfully.
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.306 186548 DEBUG nova.objects.instance [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'resources' on Instance uuid 36d5f234-9baf-48b6-a565-430378fe4068 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.320 186548 DEBUG nova.virt.libvirt.vif [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:43:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-626797560',display_name='tempest-ServersAdminTestJSON-server-626797560',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-626797560',id=15,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:43:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-vz6jvqc8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:43:39Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=36d5f234-9baf-48b6-a565-430378fe4068,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.320 186548 DEBUG nova.network.os_vif_util [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "address": "fa:16:3e:61:ea:95", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471dbcab-6d", "ovs_interfaceid": "471dbcab-6d2d-430b-9c1d-41d0b1f895a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.321 186548 DEBUG nova.network.os_vif_util [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:ea:95,bridge_name='br-int',has_traffic_filtering=True,id=471dbcab-6d2d-430b-9c1d-41d0b1f895a0,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471dbcab-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.321 186548 DEBUG os_vif [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:ea:95,bridge_name='br-int',has_traffic_filtering=True,id=471dbcab-6d2d-430b-9c1d-41d0b1f895a0,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471dbcab-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.323 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.324 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap471dbcab-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.325 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.327 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.329 186548 INFO os_vif [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:ea:95,bridge_name='br-int',has_traffic_filtering=True,id=471dbcab-6d2d-430b-9c1d-41d0b1f895a0,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471dbcab-6d')
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.330 186548 INFO nova.virt.libvirt.driver [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Deleting instance files /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068_del
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.331 186548 INFO nova.virt.libvirt.driver [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Deletion of /var/lib/nova/instances/36d5f234-9baf-48b6-a565-430378fe4068_del complete
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.403 186548 DEBUG nova.compute.manager [req-d51b5805-dd65-42e2-ade5-8a23be2973df req-0613aa66-0523-4963-8107-59cc0ec7a5d3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received event network-vif-unplugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.403 186548 DEBUG oslo_concurrency.lockutils [req-d51b5805-dd65-42e2-ade5-8a23be2973df req-0613aa66-0523-4963-8107-59cc0ec7a5d3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.403 186548 DEBUG oslo_concurrency.lockutils [req-d51b5805-dd65-42e2-ade5-8a23be2973df req-0613aa66-0523-4963-8107-59cc0ec7a5d3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.404 186548 DEBUG oslo_concurrency.lockutils [req-d51b5805-dd65-42e2-ade5-8a23be2973df req-0613aa66-0523-4963-8107-59cc0ec7a5d3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.404 186548 DEBUG nova.compute.manager [req-d51b5805-dd65-42e2-ade5-8a23be2973df req-0613aa66-0523-4963-8107-59cc0ec7a5d3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] No waiting events found dispatching network-vif-unplugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.404 186548 DEBUG nova.compute.manager [req-d51b5805-dd65-42e2-ade5-8a23be2973df req-0613aa66-0523-4963-8107-59cc0ec7a5d3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received event network-vif-unplugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.472 186548 INFO nova.compute.manager [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.472 186548 DEBUG oslo.service.loopingcall [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.473 186548 DEBUG nova.compute.manager [-] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:44:33 compute-0 nova_compute[186544]: 2025-11-22 07:44:33.473 186548 DEBUG nova.network.neutron [-] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:44:34 compute-0 nova_compute[186544]: 2025-11-22 07:44:34.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:34 compute-0 nova_compute[186544]: 2025-11-22 07:44:34.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:44:34 compute-0 nova_compute[186544]: 2025-11-22 07:44:34.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:44:34 compute-0 nova_compute[186544]: 2025-11-22 07:44:34.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Nov 22 07:44:34 compute-0 nova_compute[186544]: 2025-11-22 07:44:34.974 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.154 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.155 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.155 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.155 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:35.406 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:35.407 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.496 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.688 186548 DEBUG nova.compute.manager [req-ba713a22-47cc-4475-8e09-6c9603cc97e2 req-7691a21d-026b-435c-85dc-06e6d5908323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received event network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.688 186548 DEBUG oslo_concurrency.lockutils [req-ba713a22-47cc-4475-8e09-6c9603cc97e2 req-7691a21d-026b-435c-85dc-06e6d5908323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36d5f234-9baf-48b6-a565-430378fe4068-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.688 186548 DEBUG oslo_concurrency.lockutils [req-ba713a22-47cc-4475-8e09-6c9603cc97e2 req-7691a21d-026b-435c-85dc-06e6d5908323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.688 186548 DEBUG oslo_concurrency.lockutils [req-ba713a22-47cc-4475-8e09-6c9603cc97e2 req-7691a21d-026b-435c-85dc-06e6d5908323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.689 186548 DEBUG nova.compute.manager [req-ba713a22-47cc-4475-8e09-6c9603cc97e2 req-7691a21d-026b-435c-85dc-06e6d5908323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] No waiting events found dispatching network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.689 186548 WARNING nova.compute.manager [req-ba713a22-47cc-4475-8e09-6c9603cc97e2 req-7691a21d-026b-435c-85dc-06e6d5908323 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received unexpected event network-vif-plugged-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 for instance with vm_state active and task_state deleting.
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.731 186548 DEBUG nova.network.neutron [-] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.766 186548 INFO nova.compute.manager [-] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Took 2.29 seconds to deallocate network for instance.
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.887 186548 DEBUG nova.compute.manager [req-135267db-192f-44f9-a55c-2a48fdcf5b69 req-e126faf4-1ee3-4656-9cfd-3997cdec64c2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Received event network-vif-deleted-471dbcab-6d2d-430b-9c1d-41d0b1f895a0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.888 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:35 compute-0 nova_compute[186544]: 2025-11-22 07:44:35.889 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.011 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.026 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.026 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.029 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.029 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.030 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.050 186548 DEBUG nova.compute.provider_tree [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.061 186548 DEBUG nova.scheduler.client.report [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.083 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.110 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.111 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.134 186548 INFO nova.scheduler.client.report [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Deleted allocations for instance 36d5f234-9baf-48b6-a565-430378fe4068
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.143 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.243 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.244 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.244 186548 DEBUG oslo_concurrency.lockutils [None req-9e6925aa-4212-4c61-a73b-939964c21586 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "36d5f234-9baf-48b6-a565-430378fe4068" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.252 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.252 186548 INFO nova.compute.claims [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.460 186548 DEBUG nova.compute.provider_tree [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.472 186548 DEBUG nova.scheduler.client.report [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.496 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.496 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.563 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.564 186548 DEBUG nova.network.neutron [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.590 186548 INFO nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.592 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'name': 'tempest-MigrationsAdminTest-server-370989325', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000011', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '070aaece3c3c4232877d26c34023c56d', 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'hostId': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.594 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'name': 'tempest-ServersAdminTestJSON-server-174353263', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9b004cb06df74de2903dae19345fd9c7', 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'hostId': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.595 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.598 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6d263548-4cc6-463b-b26b-cb43b0d069cd / tapcc51d9a0-e1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.598 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53d4d923-d8ca-468a-af2c-f86267606d62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.595242', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '17fd4178-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': '11e03937da87c9f86b6e00292984d65b7c1808a0511f79375c06d2860fe40979'}]}, 'timestamp': '2025-11-22 07:44:36.599608', '_unique_id': '4f2b26b9e8ae471e87a7ae0ff4dc9a7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.606 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.627 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.write.requests volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.627 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.659 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.659 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df70f3d3-c3be-437f-a96e-08434dedac1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 28, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.601366', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '18018e7c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': 'c175b548ec73ffd58c73dfedf4a62124d6421575b3b7baf95241e30d78347d7e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.601366', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18019cc8-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': '6ffbcf34378b230caeca18bc7bda992ed1365cb99128a0aa875eb1b0caeb0451'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.601366', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1806754a-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '3d4c3750248e1b5f94393ce87c3ee50dff59ed7103d4623a45ae1bf592e7e94b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.601366', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18067f68-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '79bed41c3b507ff32d0ea69e1112af0898861beca559dc65f0b0b0b90b602254'}]}, 'timestamp': '2025-11-22 07:44:36.660113', '_unique_id': '917f4395a2374a32b007e2a7532f92c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.661 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f12ea70-0786-4c18-b92f-f95e5f0845ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.661837', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '1806cc5c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': 'aa6bab25b542d7d00b0938b52855b63cd48eba470e3744b15ab135cbfefd166f'}]}, 'timestamp': '2025-11-22 07:44:36.662088', '_unique_id': '0d146df79d094abdaba6a6d5fba38c41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.662 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f982d1a-8dab-4420-816b-531c39377c69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.663163', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '18070050-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': '8048ebc98a46658ce943e70eafc184c783ea68ab7cedd56d1a7e365d0211baa4'}]}, 'timestamp': '2025-11-22 07:44:36.663416', '_unique_id': '7863d47b300346ce86656f5ae12fe8d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.664 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.664 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '270c9464-48b4-4a29-bc5e-ebe8808f718e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.664470', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '180732aa-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': '7623b1f44a631131faba5ec5a86aadda642a3fa3b6aa42e878ecd691d4715332'}]}, 'timestamp': '2025-11-22 07:44:36.664703', '_unique_id': 'cdcd7d9d9f274d9e95272335debf5f9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.665 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5424575-4bd3-4ac2-bf8c-5435148f33dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.665787', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '18076612-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': 'e14c536bde282d154e718d9fd6e641bfa94fada02ef7c8df96d636ba8f2aacda'}]}, 'timestamp': '2025-11-22 07:44:36.666020', '_unique_id': '34da00f59e5841e980fce6860fd9a2aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.667 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.667 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.read.latency volume: 839805759 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.667 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.read.latency volume: 42707663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.667 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.read.latency volume: 564599828 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.667 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.read.latency volume: 2865611 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f4e6f25-493d-4c86-b3a0-9555d716a95a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 839805759, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.667081', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '18079916-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': '14fcc897fac2708e69af69e55fc3921ca18ef24970b70b9045fa5585e9398d6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 42707663, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.667081', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1807a21c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': '0ae9616c3381d52fca517b8ccf8f5ef5239c9510d7484367ef25206bb14eb5f2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 564599828, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.667081', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1807a9c4-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': 'ffdf5b3895ac9d2f19d22e40d94053dacb0a2a1be98f8ac0ee2317c22e5b8ec7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2865611, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.667081', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1807b130-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '1e4cc88f8f9ca8e4cbaa030844655f5041bc91dbf40188a9eb781cd64845c5f1'}]}, 'timestamp': '2025-11-22 07:44:36.667927', '_unique_id': '5c62e7882a3b4a0fb52b36c31659e2c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.668 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.669 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.read.bytes volume: 32020480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.669 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.669 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.669 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c0a694e-a250-45e5-99df-84db43b321e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32020480, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.669042', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1807e51a-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': 'b7b04e5df608a19402d9c4d8cfd61796f13eefda6f6fe795ea96755e954ddf79'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 
'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.669042', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1807ede4-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': 'be5a0c83118e0a8eb08c09fde321b8b9ad004d89a4de8cf54180a21b4bc02132'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.669042', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1807f5b4-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': 'c6dc36b0d70174e98205f90f5911c55c8a7c0b998705e433d42038cd7e501d2e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.669042', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1807fdca-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '90f39b2be4bf8c415fe39ce4863aa532b16f112ad37b9526a4cbb54b6f7cc10d'}]}, 'timestamp': '2025-11-22 07:44:36.669886', '_unique_id': 'db27848812fa477e9b3e7754d1bd97a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.671 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.read.requests volume: 1206 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.671 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.671 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.671 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '127f3641-8b2c-4c2a-aedc-e1a1f5267608', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1206, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.671041', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1808336c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': '982b6fa89a94177d263db577e1a390cee01a787cafc825be82228eda26989567'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': 
None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.671041', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18083e34-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': 'af552c9a2a11bfde20c31c28037b170a96c50dca7314622e44dd0734bd1dcedc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.671041', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1808478a-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': 'f8bda297fe3b7c1902ba9bdd58a888b5d5cf073350ff703a9575f022822c2475'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.671041', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18084eec-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '6221fe439bceb144159cbe72a2e0a86d4df609ce0ba3273fd2ea289f2b7fbf88'}]}, 'timestamp': '2025-11-22 07:44:36.671962', '_unique_id': '7358e059710a49cfbe3e7e8eae0e86df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.673 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.689 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/cpu volume: 11550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.698 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.699 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.700 186548 INFO nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Creating image(s)
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.700 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.700 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.701 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.709 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/cpu volume: 9290000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3287e735-94c0-4b8c-b4dc-fca58da3b65a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11550000000, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'timestamp': '2025-11-22T07:44:36.673178', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '180b04f2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.380713584, 'message_signature': '785db29d5337c7a1854f0435be20d28957ba266745dd3cdb1f3ea7b415f3e669'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9290000000, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 
'6d263548-4cc6-463b-b26b-cb43b0d069cd', 'timestamp': '2025-11-22T07:44:36.673178', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '180e1a16-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.400857507, 'message_signature': '7770fa5bb5b7b2b79dea2a99379ca7ff25806c46ee9054538674abefb5fd7f4f'}]}, 'timestamp': '2025-11-22 07:44:36.710071', '_unique_id': '1ac51b0e3a854b9e871bd9274a39fefc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.713 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.713 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/memory.usage volume: 40.6328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.713 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.713 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.714 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 6d263548-4cc6-463b-b26b-cb43b0d069cd: ceilometer.compute.pollsters.NoVolumeException
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98b92c12-ab19-4de3-895c-1ecfeb18cfcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.6328125, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'timestamp': '2025-11-22T07:44:36.713154', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '180eb0de-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.380713584, 'message_signature': 'f358947a3f086bd1c928700be315dc512e46b4f95a6be8570dc61256742bf16d'}]}, 'timestamp': '2025-11-22 07:44:36.714202', '_unique_id': 'caad365ca5174542ac5abb5a9a4113fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.716 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.716 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8999fd80-af00-4b5c-a605-77724a19584e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.716503', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '180f3360-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': '27f0ee85e5511e7f685bcbf8e7a69613587fc5cfc9f6d4f37d5eb5fe3952fe1c'}]}, 'timestamp': '2025-11-22 07:44:36.717220', '_unique_id': '53fc85972900413ca46d0ed33a750201'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.718 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.719 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.719 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-370989325>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-174353263>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-370989325>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-174353263>]
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.719 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3496790c-324c-4b45-91bf-f835f17d90be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.719536', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '180f9c92-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': '2d6e079df3fb109b439d317a4dd533252e9a050548191b5060732b4dcf235237'}]}, 'timestamp': '2025-11-22 07:44:36.719950', '_unique_id': 'f1827e841fc1499b90ca1ea6726fd23d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.721 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.722 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.722 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.write.bytes volume: 249856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.722 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.722 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.723 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f29cdca-dd4e-4516-a442-3a09be5133c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 249856, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.722150', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1810031c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': '3d924db8d6edc1a4cbdd29d32e491f03ed64c75ab99f981e932a173ee7562990'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': 
'5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.722150', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18100f24-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': 'c14da236f46fb9c901015105598e0ac0da57f73153cf4f27462d2a0a2e38b78e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.722150', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181019e2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '5e6c6a3095608ef0686f5abbd2183900adf9cef60a4f8b65bbcfcc71305e455f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.722150', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18102450-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '2b0aebf756d4241cb0dedd53a4f0c53db7d062e6214e51d7261d5d38c261c922'}]}, 'timestamp': '2025-11-22 07:44:36.723370', '_unique_id': 'f0ebd1dbb08c41d0820e5260b5789380'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.724 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.write.latency volume: 31733026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.725 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.725 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.725 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47c3d09c-5ceb-4936-bb03-e53babbfb0ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31733026, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.724813', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '18106956-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': '9532b43794169ee2182ecc1b438a2c35b0d641f8d117e5670abdb8ae81512ab6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 
'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.724813', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18107496-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.293162445, 'message_signature': 'a38edb3d118ada0637aae947a54fe32a89126ba0b209ec4b5522a620f03a4630'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.724813', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '18107fc2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '6e1a0489a3c37654b29ef8fe805b552cba40fb9f1d6ece5353e190aeaa2c6a60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.724813', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18108be8-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.319925959, 'message_signature': '2bfbc7a2094f523f633abb50f134a039c53c576c6445ecab91a5518c4f4417a3'}]}, 'timestamp': '2025-11-22 07:44:36.725981', '_unique_id': 'b7e0bb3b4b314002bf5a5771a6561327'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.727 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.739 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.740 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.752 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.753 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '280c9bf4-c20a-410f-922a-f3b8d86c396c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.727709', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1812b8e6-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.419552083, 'message_signature': '687c0395058615b5394917688d724afa1198e5f7633c13afd0f7c1d532da2b09'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.727709', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1812c656-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.419552083, 'message_signature': '5b6b2d5b21bb97823509407199e27d6be00da37cefd513314342e517281d2e51'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.727709', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1814b218-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.432386327, 'message_signature': '26af983d9d4cc56a4a73d5e780ee63c588c281bb05df97581a1ca3814aa62a22'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.727709', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1814c050-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.432386327, 'message_signature': 'b5406f33e1defcd767de3afdf4c0339269ce8f24d0c4450a495fc1f7ea20a157'}]}, 'timestamp': '2025-11-22 07:44:36.753533', '_unique_id': 'f88ec32cbae542348c66d9ad05834fa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.755 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.755 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-370989325>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-174353263>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-370989325>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-174353263>]
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.755 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89685999-5c16-4d56-b375-8449201a9699', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.755887', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '181527a2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': 'ea66a6d9c5788122421f458b7c2729b5ed4f0d4761b4e5fdc153a4b4b210fa3d'}]}, 'timestamp': '2025-11-22 07:44:36.756203', '_unique_id': '1817fd72b72c45bbb9d07570cec15a88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.756 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.757 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.757 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.758 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.758 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dae5b93-70a9-49de-aed4-1cbdfd947b0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.757588', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '181569e2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.419552083, 'message_signature': '23f18639e94e91472cc74ed4e0cecb89a4fdd3159234db3065a1b0ca9ef6fbcb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.757588', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '181574d2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.419552083, 'message_signature': '07af104308ee6f242f70c84d32321b7805431231bc6ec3849c6942753a1d2f92'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.757588', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '18158396-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.432386327, 'message_signature': '42fe0cadb6318f68b338ec6d6391fa447d549f03c3947f8f80039abda179f310'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.757588', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '18158e04-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.432386327, 'message_signature': 'adeaffa048c8934797710ee575a328df18d04808cd86e1417e6cf95201501ab2'}]}, 'timestamp': '2025-11-22 07:44:36.758809', '_unique_id': '75c6ea51ae2d4d8e8ddeaa139eef2afa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.759 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.760 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.760 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.760 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-370989325>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-174353263>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-370989325>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-174353263>]
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.760 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.760 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5939e3b-9629-47dc-9561-c6f27cf468c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.760773', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '1815e49e-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': 'd43085a0a1e8aa82cd5ecb4366ab29b041711f891bf1a288b22191ea44829451'}]}, 'timestamp': '2025-11-22 07:44:36.761010', '_unique_id': '7e5412d9743f41638969960062cedb39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.761 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.772 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.773 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.allocation volume: 30085120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.773 12 DEBUG ceilometer.compute.pollsters [-] 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.773 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.773 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fd54f43-a7a2-4122-89b1-0df48ab0b4ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30085120, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-vda', 'timestamp': '2025-11-22T07:44:36.772989', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1817c476-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.419552083, 'message_signature': '15b9ed7ddc2a22c9959f527555415e9ee057689deffb1a4635af984730a50c03'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5ea417ea62e2404d8cb5b9e767e8c5c4', 'user_name': None, 'project_id': '070aaece3c3c4232877d26c34023c56d', 'project_name': None, 'resource_id': 
'5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-sda', 'timestamp': '2025-11-22T07:44:36.772989', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-370989325', 'name': 'instance-00000011', 'instance_id': '5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67', 'instance_type': 'm1.nano', 'host': '9e6578c6ae35730b89f940f3c6085d7218ecaf8b959a880eeedd6624', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1817cffc-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.419552083, 'message_signature': 'eb3465a8ee089d4f1cd4fb4f5b5df6dc444dfe26edfae7d5a62c3b7d33abff63'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-vda', 'timestamp': '2025-11-22T07:44:36.772989', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1817db82-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.432386327, 'message_signature': '4d885d01cf183e3e0bd7c41a96af66bfba9d18aab8d2406afccc0d887f678641'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd-sda', 'timestamp': '2025-11-22T07:44:36.772989', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'instance-0000000d', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1817e578-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.432386327, 'message_signature': '5320fbee31aefe1f425e8b0bce3c70bffb908017fb144b907b7be51bb6caaa03'}]}, 'timestamp': '2025-11-22 07:44:36.774132', '_unique_id': '0bcf97c649a348f3b64c04e0b1e92e4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.775 12 DEBUG ceilometer.compute.pollsters [-] 6d263548-4cc6-463b-b26b-cb43b0d069cd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e34be19-725d-4790-9838-123e126a45c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7c0fb56fc41e44dfa23a0d45149e78e3', 'user_name': None, 'project_id': '9b004cb06df74de2903dae19345fd9c7', 'project_name': None, 'resource_id': 'instance-0000000d-6d263548-4cc6-463b-b26b-cb43b0d069cd-tapcc51d9a0-e1', 'timestamp': '2025-11-22T07:44:36.775660', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-174353263', 'name': 'tapcc51d9a0-e1', 'instance_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'instance_type': 'm1.nano', 'host': '58822ef3e0ed542a50f4c285ab49d39c541f09f50dc197ccbb5d5ca0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:88:5c:3d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcc51d9a0-e1'}, 'message_id': '18182a06-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4217.288457019, 'message_signature': 'ca763eb15c0e291005b14057f1245206c215bc09b1934e637e18a3eb3ab7b9c2'}]}, 'timestamp': '2025-11-22 07:44:36.775892', '_unique_id': '82ce50ab9bb74b83babbff94f8cff0ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.776 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.777 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.777 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:44:36.777 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-370989325>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-174353263>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-370989325>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-174353263>]
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.804 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.805 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.806 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.816 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.877 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.878 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.919 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.921 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.922 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.984 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.986 186548 DEBUG nova.virt.disk.api [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Checking if we can resize image /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:44:36 compute-0 nova_compute[186544]: 2025-11-22 07:44:36.986 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.049 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.051 186548 DEBUG nova.virt.disk.api [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Cannot resize image /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.051 186548 DEBUG nova.objects.instance [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lazy-loading 'migration_context' on Instance uuid feb5ca5f-df67-4f29-9c21-71ba30b5af9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.063 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.064 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Ensure instance console log exists: /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.065 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.065 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.065 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.153 186548 DEBUG nova.policy [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ca2e31d955040598948fa3da5d84888', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '74651b744925468db6c6e47d1397cc04', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:44:37 compute-0 nova_compute[186544]: 2025-11-22 07:44:37.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:37.313 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:37.314 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:37.314 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:37 compute-0 podman[215905]: 2025-11-22 07:44:37.435071506 +0000 UTC m=+0.085970732 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:44:38 compute-0 nova_compute[186544]: 2025-11-22 07:44:38.326 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:39 compute-0 nova_compute[186544]: 2025-11-22 07:44:39.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:44:39 compute-0 nova_compute[186544]: 2025-11-22 07:44:39.317 186548 DEBUG nova.network.neutron [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Successfully updated port: 4ab0012c-e73f-4cd6-b146-527583d046f3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:44:39 compute-0 nova_compute[186544]: 2025-11-22 07:44:39.341 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:44:39 compute-0 nova_compute[186544]: 2025-11-22 07:44:39.341 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquired lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:44:39 compute-0 nova_compute[186544]: 2025-11-22 07:44:39.341 186548 DEBUG nova.network.neutron [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:44:39 compute-0 nova_compute[186544]: 2025-11-22 07:44:39.809 186548 DEBUG nova.network.neutron [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:44:39 compute-0 nova_compute[186544]: 2025-11-22 07:44:39.976 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:40 compute-0 nova_compute[186544]: 2025-11-22 07:44:40.143 186548 DEBUG nova.compute.manager [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-changed-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:40 compute-0 nova_compute[186544]: 2025-11-22 07:44:40.143 186548 DEBUG nova.compute.manager [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Refreshing instance network info cache due to event network-changed-4ab0012c-e73f-4cd6-b146-527583d046f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:44:40 compute-0 nova_compute[186544]: 2025-11-22 07:44:40.143 186548 DEBUG oslo_concurrency.lockutils [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:44:41 compute-0 ovn_controller[94843]: 2025-11-22T07:44:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:44:41 compute-0 ovn_controller[94843]: 2025-11-22T07:44:41Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:5c:3d 10.100.0.13
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.552 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.553 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.553 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.553 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.554 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.561 186548 INFO nova.compute.manager [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Terminating instance
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.566 186548 DEBUG nova.compute.manager [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:44:41 compute-0 kernel: tapcc51d9a0-e1 (unregistering): left promiscuous mode
Nov 22 07:44:41 compute-0 NetworkManager[55036]: <info>  [1763797481.5868] device (tapcc51d9a0-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:44:41 compute-0 ovn_controller[94843]: 2025-11-22T07:44:41Z|00073|binding|INFO|Releasing lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 from this chassis (sb_readonly=0)
Nov 22 07:44:41 compute-0 ovn_controller[94843]: 2025-11-22T07:44:41Z|00074|binding|INFO|Setting lport cc51d9a0-e170-42eb-b8db-2910ea320cb4 down in Southbound
Nov 22 07:44:41 compute-0 ovn_controller[94843]: 2025-11-22T07:44:41Z|00075|binding|INFO|Removing iface tapcc51d9a0-e1 ovn-installed in OVS
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.593 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.609 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.621 186548 DEBUG nova.network.neutron [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updating instance_info_cache with network_info: [{"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.643 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:5c:3d 10.100.0.13'], port_security=['fa:16:3e:88:5c:3d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6d263548-4cc6-463b-b26b-cb43b0d069cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9b004cb06df74de2903dae19345fd9c7', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f8b9c274-fa57-419c-9d40-54201db84f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce3be460-df7c-41a5-9ff2-c82c8fc728ec, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc51d9a0-e170-42eb-b8db-2910ea320cb4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.644 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc51d9a0-e170-42eb-b8db-2910ea320cb4 in datapath d7ba1c27-6255-4c71-8e98-23a1c59b5723 unbound from our chassis
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.645 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7ba1c27-6255-4c71-8e98-23a1c59b5723, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.646 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c89b522b-445d-4363-9aff-46099fa461d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.647 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 namespace which is not needed anymore
Nov 22 07:44:41 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 22 07:44:41 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Consumed 13.432s CPU time.
Nov 22 07:44:41 compute-0 systemd-machined[152872]: Machine qemu-10-instance-0000000d terminated.
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.728 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Releasing lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.729 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Instance network_info: |[{"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.729 186548 DEBUG oslo_concurrency.lockutils [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.729 186548 DEBUG nova.network.neutron [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Refreshing network info cache for port 4ab0012c-e73f-4cd6-b146-527583d046f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.733 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Start _get_guest_xml network_info=[{"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.736 186548 WARNING nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.741 186548 DEBUG nova.virt.libvirt.host [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.742 186548 DEBUG nova.virt.libvirt.host [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.747 186548 DEBUG nova.virt.libvirt.host [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.748 186548 DEBUG nova.virt.libvirt.host [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.749 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.749 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.749 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.749 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.750 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.750 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.750 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.750 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.750 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.751 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.751 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.751 186548 DEBUG nova.virt.hardware [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.755 186548 DEBUG nova.virt.libvirt.vif [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:44:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1661145969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1661145969',id=20,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-lpdner4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-150570158
8-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:36Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=feb5ca5f-df67-4f29-9c21-71ba30b5af9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.755 186548 DEBUG nova.network.os_vif_util [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converting VIF {"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.756 186548 DEBUG nova.network.os_vif_util [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.756 186548 DEBUG nova.objects.instance [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lazy-loading 'pci_devices' on Instance uuid feb5ca5f-df67-4f29-9c21-71ba30b5af9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:41 compute-0 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[214700]: [NOTICE]   (214704) : haproxy version is 2.8.14-c23fe91
Nov 22 07:44:41 compute-0 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[214700]: [NOTICE]   (214704) : path to executable is /usr/sbin/haproxy
Nov 22 07:44:41 compute-0 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[214700]: [WARNING]  (214704) : Exiting Master process...
Nov 22 07:44:41 compute-0 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[214700]: [ALERT]    (214704) : Current worker (214706) exited with code 143 (Terminated)
Nov 22 07:44:41 compute-0 neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723[214700]: [WARNING]  (214704) : All workers exited. Exiting... (0)
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.769 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <uuid>feb5ca5f-df67-4f29-9c21-71ba30b5af9c</uuid>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <name>instance-00000014</name>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1661145969</nova:name>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:44:41</nova:creationTime>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:44:41 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:44:41 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:44:41 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:44:41 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:44:41 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:44:41 compute-0 nova_compute[186544]:         <nova:user uuid="4ca2e31d955040598948fa3da5d84888">tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member</nova:user>
Nov 22 07:44:41 compute-0 nova_compute[186544]:         <nova:project uuid="74651b744925468db6c6e47d1397cc04">tempest-LiveAutoBlockMigrationV225Test-1505701588</nova:project>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:44:41 compute-0 nova_compute[186544]:         <nova:port uuid="4ab0012c-e73f-4cd6-b146-527583d046f3">
Nov 22 07:44:41 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <system>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <entry name="serial">feb5ca5f-df67-4f29-9c21-71ba30b5af9c</entry>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <entry name="uuid">feb5ca5f-df67-4f29-9c21-71ba30b5af9c</entry>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </system>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <os>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   </os>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <features>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   </features>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:c2:22:c9"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <target dev="tap4ab0012c-e7"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/console.log" append="off"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <video>
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </video>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:44:41 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:44:41 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:44:41 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:44:41 compute-0 nova_compute[186544]: </domain>
Nov 22 07:44:41 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.770 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Preparing to wait for external event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:44:41 compute-0 systemd[1]: libpod-91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e.scope: Deactivated successfully.
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.771 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.771 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.772 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.772 186548 DEBUG nova.virt.libvirt.vif [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:44:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1661145969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1661145969',id=20,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-lpdner4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:44:36Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=feb5ca5f-df67-4f29-9c21-71ba30b5af9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.773 186548 DEBUG nova.network.os_vif_util [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converting VIF {"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.774 186548 DEBUG nova.network.os_vif_util [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.774 186548 DEBUG os_vif [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.775 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.775 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.775 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.777 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 podman[215977]: 2025-11-22 07:44:41.777875729 +0000 UTC m=+0.048663581 container died 91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.778 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ab0012c-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.778 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ab0012c-e7, col_values=(('external_ids', {'iface-id': '4ab0012c-e73f-4cd6-b146-527583d046f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:22:c9', 'vm-uuid': 'feb5ca5f-df67-4f29-9c21-71ba30b5af9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.779 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 NetworkManager[55036]: <info>  [1763797481.7804] manager: (tap4ab0012c-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.781 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.787 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.789 186548 INFO os_vif [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7')
Nov 22 07:44:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e-userdata-shm.mount: Deactivated successfully.
Nov 22 07:44:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-34d07bed9407075b12baa35fa2a97afa90891c73260439011ba6296986437f08-merged.mount: Deactivated successfully.
Nov 22 07:44:41 compute-0 podman[215977]: 2025-11-22 07:44:41.825565804 +0000 UTC m=+0.096353656 container cleanup 91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:44:41 compute-0 systemd[1]: libpod-conmon-91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e.scope: Deactivated successfully.
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.836 186548 INFO nova.virt.libvirt.driver [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Instance destroyed successfully.
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.836 186548 DEBUG nova.objects.instance [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lazy-loading 'resources' on Instance uuid 6d263548-4cc6-463b-b26b-cb43b0d069cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.851 186548 DEBUG nova.virt.libvirt.vif [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:43:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-174353263',display_name='tempest-ServersAdminTestJSON-server-174353263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-174353263',id=13,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:44:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9b004cb06df74de2903dae19345fd9c7',ramdisk_id='',reservation_id='r-wcksy40q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1843119868',owner_user_name='tempest-ServersAdminTestJSON-1843119868-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:44:30Z,user_data=None,user_id='7c0fb56fc41e44dfa23a0d45149e78e3',uuid=6d263548-4cc6-463b-b26b-cb43b0d069cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.851 186548 DEBUG nova.network.os_vif_util [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converting VIF {"id": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "address": "fa:16:3e:88:5c:3d", "network": {"id": "d7ba1c27-6255-4c71-8e98-23a1c59b5723", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1812148536-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9b004cb06df74de2903dae19345fd9c7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc51d9a0-e1", "ovs_interfaceid": "cc51d9a0-e170-42eb-b8db-2910ea320cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.852 186548 DEBUG nova.network.os_vif_util [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.852 186548 DEBUG os_vif [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.853 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.854 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc51d9a0-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.855 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.858 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.861 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.864 186548 INFO os_vif [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:5c:3d,bridge_name='br-int',has_traffic_filtering=True,id=cc51d9a0-e170-42eb-b8db-2910ea320cb4,network=Network(d7ba1c27-6255-4c71-8e98-23a1c59b5723),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc51d9a0-e1')
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.865 186548 INFO nova.virt.libvirt.driver [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Deleting instance files /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd_del
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.865 186548 INFO nova.virt.libvirt.driver [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Deletion of /var/lib/nova/instances/6d263548-4cc6-463b-b26b-cb43b0d069cd_del complete
Nov 22 07:44:41 compute-0 podman[215993]: 2025-11-22 07:44:41.869428896 +0000 UTC m=+0.072149394 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41)
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.869 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.869 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.870 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] No VIF found with MAC fa:16:3e:c2:22:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.870 186548 INFO nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Using config drive
Nov 22 07:44:41 compute-0 podman[216038]: 2025-11-22 07:44:41.893869604 +0000 UTC m=+0.044337205 container remove 91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.898 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b4fc5b76-4630-4b67-93b9-b4ca5a39f7b0]: (4, ('Sat Nov 22 07:44:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 (91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e)\n91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e\nSat Nov 22 07:44:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 (91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e)\n91d7ce8c15213c6da50c6c46a876432b887311dc491e3a4154a56bc2a492e97e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.899 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[76cd7007-2e8b-4d61-ae6e-51a1d6cd8dd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.900 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7ba1c27-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.903 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 kernel: tapd7ba1c27-60: left promiscuous mode
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.913 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.918 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.920 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[67a2b74e-ed87-40ad-b294-b7729ad0a98d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.931 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b4ae00-4837-49c2-8e10-1d52dc70fb52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.932 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3be23703-c3f0-4687-8427-8de410783cdc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.948 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b5bb9de3-bbb4-41ba-8615-a19115c865af]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413543, 'reachable_time': 29682, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216064, 'error': None, 'target': 'ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.950 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7ba1c27-6255-4c71-8e98-23a1c59b5723 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:44:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:41.951 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ea745e6d-1810-4775-940b-93a3db69e1e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:41 compute-0 systemd[1]: run-netns-ovnmeta\x2dd7ba1c27\x2d6255\x2d4c71\x2d8e98\x2d23a1c59b5723.mount: Deactivated successfully.
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.975 186548 INFO nova.compute.manager [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.976 186548 DEBUG oslo.service.loopingcall [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.976 186548 DEBUG nova.compute.manager [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:44:41 compute-0 nova_compute[186544]: 2025-11-22 07:44:41.976 186548 DEBUG nova.network.neutron [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.095 186548 DEBUG nova.compute.manager [req-a455b09e-a474-4a03-8859-308befb57d45 req-b46a0b6c-7c6e-49e1-ab49-9ee601382ad7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.096 186548 DEBUG oslo_concurrency.lockutils [req-a455b09e-a474-4a03-8859-308befb57d45 req-b46a0b6c-7c6e-49e1-ab49-9ee601382ad7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.096 186548 DEBUG oslo_concurrency.lockutils [req-a455b09e-a474-4a03-8859-308befb57d45 req-b46a0b6c-7c6e-49e1-ab49-9ee601382ad7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.096 186548 DEBUG oslo_concurrency.lockutils [req-a455b09e-a474-4a03-8859-308befb57d45 req-b46a0b6c-7c6e-49e1-ab49-9ee601382ad7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.097 186548 DEBUG nova.compute.manager [req-a455b09e-a474-4a03-8859-308befb57d45 req-b46a0b6c-7c6e-49e1-ab49-9ee601382ad7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.097 186548 DEBUG nova.compute.manager [req-a455b09e-a474-4a03-8859-308befb57d45 req-b46a0b6c-7c6e-49e1-ab49-9ee601382ad7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-unplugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.409 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.443 186548 INFO nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Creating config drive at /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.449 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp23owx5na execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.575 186548 DEBUG oslo_concurrency.processutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp23owx5na" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:42 compute-0 kernel: tap4ab0012c-e7: entered promiscuous mode
Nov 22 07:44:42 compute-0 NetworkManager[55036]: <info>  [1763797482.6246] manager: (tap4ab0012c-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Nov 22 07:44:42 compute-0 systemd-udevd[215956]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:44:42 compute-0 ovn_controller[94843]: 2025-11-22T07:44:42Z|00076|binding|INFO|Claiming lport 4ab0012c-e73f-4cd6-b146-527583d046f3 for this chassis.
Nov 22 07:44:42 compute-0 ovn_controller[94843]: 2025-11-22T07:44:42Z|00077|binding|INFO|4ab0012c-e73f-4cd6-b146-527583d046f3: Claiming fa:16:3e:c2:22:c9 10.100.0.5
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.626 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:42 compute-0 ovn_controller[94843]: 2025-11-22T07:44:42Z|00078|binding|INFO|Claiming lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 for this chassis.
Nov 22 07:44:42 compute-0 ovn_controller[94843]: 2025-11-22T07:44:42Z|00079|binding|INFO|77e99205-9615-4ea6-ab25-d16bf8bb4804: Claiming fa:16:3e:ef:ca:bc 19.80.0.156
Nov 22 07:44:42 compute-0 NetworkManager[55036]: <info>  [1763797482.6395] device (tap4ab0012c-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:44:42 compute-0 NetworkManager[55036]: <info>  [1763797482.6417] device (tap4ab0012c-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.654 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:ca:bc 19.80.0.156'], port_security=['fa:16:3e:ef:ca:bc 19.80.0.156'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['4ab0012c-e73f-4cd6-b146-527583d046f3'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1561653152', 'neutron:cidrs': '19.80.0.156/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2de8212b-d744-4bab-b451-7daef022c1bc', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1561653152', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '2', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=b9f16a9f-d373-4cb7-a13f-5e20d7a18db8, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=77e99205-9615-4ea6-ab25-d16bf8bb4804) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.656 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:22:c9 10.100.0.5'], port_security=['fa:16:3e:c2:22:c9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-971128270', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'feb5ca5f-df67-4f29-9c21-71ba30b5af9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-971128270', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '2', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=4ab0012c-e73f-4cd6-b146-527583d046f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.657 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 77e99205-9615-4ea6-ab25-d16bf8bb4804 in datapath 2de8212b-d744-4bab-b451-7daef022c1bc bound to our chassis
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.658 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2de8212b-d744-4bab-b451-7daef022c1bc
Nov 22 07:44:42 compute-0 systemd-machined[152872]: New machine qemu-11-instance-00000014.
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.671 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.672 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[63c87642-aa67-47b4-8470-f39d37c55e8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.673 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2de8212b-d1 in ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.675 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2de8212b-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.675 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[39dc0629-7221-4120-9cf1-dd45fe225a94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_controller[94843]: 2025-11-22T07:44:42Z|00080|binding|INFO|Setting lport 4ab0012c-e73f-4cd6-b146-527583d046f3 ovn-installed in OVS
Nov 22 07:44:42 compute-0 ovn_controller[94843]: 2025-11-22T07:44:42Z|00081|binding|INFO|Setting lport 4ab0012c-e73f-4cd6-b146-527583d046f3 up in Southbound
Nov 22 07:44:42 compute-0 ovn_controller[94843]: 2025-11-22T07:44:42Z|00082|binding|INFO|Setting lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 up in Southbound
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.676 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cbada4c0-a6d1-439c-a79e-81344a3dff39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.677 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:42 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-00000014.
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.687 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[cc8601d5-23c9-4c86-8907-0a3f3e8755c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.710 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cd2c7bd3-c670-43ca-8461-c8fdcdb5da4a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.737 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[307fe485-7bba-4738-a1fe-25f25674a2cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.742 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[58efd411-d616-4ad5-ba95-5eb2eb47a07b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 NetworkManager[55036]: <info>  [1763797482.7432] manager: (tap2de8212b-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.771 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce9a381-40c2-4075-81ca-b5a61bd4cdbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.774 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[7c907fb1-143a-4f3d-b29e-a9beaad369f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 NetworkManager[55036]: <info>  [1763797482.7955] device (tap2de8212b-d0): carrier: link connected
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.801 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[29b3afe9-f7bb-44aa-b907-168fa5b18918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.817 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ee7ad568-9d48-47ad-a05e-4d97d767ad54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2de8212b-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:70:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422343, 'reachable_time': 37476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216112, 'error': None, 'target': 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.832 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1e80e22a-eb31-41ec-aef3-744590e029ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:70ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422343, 'tstamp': 422343}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216113, 'error': None, 'target': 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.850 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[df18718a-635f-42c5-8c9b-bd5155300e85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2de8212b-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:70:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422343, 'reachable_time': 37476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216116, 'error': None, 'target': 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.882 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e2864c99-f94b-4a7b-9f32-d158f52c061c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.938 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[77b823d9-fbcf-4cae-8939-052b0515ebb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.939 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797482.938761, feb5ca5f-df67-4f29-9c21-71ba30b5af9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.940 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2de8212b-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.940 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.940 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] VM Started (Lifecycle Event)
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.941 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2de8212b-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:42 compute-0 NetworkManager[55036]: <info>  [1763797482.9431] manager: (tap2de8212b-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 22 07:44:42 compute-0 kernel: tap2de8212b-d0: entered promiscuous mode
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.943 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.945 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2de8212b-d0, col_values=(('external_ids', {'iface-id': 'ebed6d9f-62b8-40d5-8d3c-93d6149e3602'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.945 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:42 compute-0 ovn_controller[94843]: 2025-11-22T07:44:42Z|00083|binding|INFO|Releasing lport ebed6d9f-62b8-40d5-8d3c-93d6149e3602 from this chassis (sb_readonly=0)
Nov 22 07:44:42 compute-0 nova_compute[186544]: 2025-11-22 07:44:42.957 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.957 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2de8212b-d744-4bab-b451-7daef022c1bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2de8212b-d744-4bab-b451-7daef022c1bc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.958 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b12ad613-0dcc-4384-93df-36eb69acb648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.958 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-2de8212b-d744-4bab-b451-7daef022c1bc
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/2de8212b-d744-4bab-b451-7daef022c1bc.pid.haproxy
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 2de8212b-d744-4bab-b451-7daef022c1bc
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:44:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:42.959 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'env', 'PROCESS_TAG=haproxy-2de8212b-d744-4bab-b451-7daef022c1bc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2de8212b-d744-4bab-b451-7daef022c1bc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.106 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.110 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797482.938932, feb5ca5f-df67-4f29-9c21-71ba30b5af9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.110 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] VM Paused (Lifecycle Event)
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.139 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.141 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.190 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:44:43 compute-0 podman[216153]: 2025-11-22 07:44:43.348119823 +0000 UTC m=+0.062469117 container create db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 07:44:43 compute-0 systemd[1]: Started libpod-conmon-db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287.scope.
Nov 22 07:44:43 compute-0 podman[216153]: 2025-11-22 07:44:43.315194499 +0000 UTC m=+0.029543843 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:44:43 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:44:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d32c04185b0231056f43dd1ce95593057e4b75cee40f6dc856ae33d3c578c8b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:44:43 compute-0 podman[216153]: 2025-11-22 07:44:43.449547003 +0000 UTC m=+0.163896317 container init db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:44:43 compute-0 podman[216153]: 2025-11-22 07:44:43.460648584 +0000 UTC m=+0.174997878 container start db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 07:44:43 compute-0 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216169]: [NOTICE]   (216173) : New worker (216175) forked
Nov 22 07:44:43 compute-0 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216169]: [NOTICE]   (216173) : Loading success.
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.536 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 4ab0012c-e73f-4cd6-b146-527583d046f3 in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 unbound from our chassis
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.538 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.550 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[be1f4ad7-0b95-4765-bbac-544ff88bcec1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.551 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd5fa4f6-01 in ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.552 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd5fa4f6-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.553 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[72157b97-ad4c-4faa-8c9f-1a03ad5609ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.553 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[616219ee-a6f2-401d-89a8-1c2cb8b84771]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.567 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[38e32c3a-09f6-458b-9b2b-e0d8803e9783]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.581 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[108d2b8b-25ef-443f-b504-cd7ba4c527da]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.608 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d280cb-ec7e-4144-bce1-5dca06da5fc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.614 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[35931393-e619-472f-b12e-18705e5401e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 NetworkManager[55036]: <info>  [1763797483.6160] manager: (tapcd5fa4f6-00): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.644 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8d48d5b0-abd9-4b2f-b9a1-75932fe54f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.647 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[06749259-eeca-448b-ab9b-e77542e4a31e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 NetworkManager[55036]: <info>  [1763797483.6743] device (tapcd5fa4f6-00): carrier: link connected
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.682 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f8505ca2-e9ff-4d3d-8944-d7f88faa6b20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.702 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc23e65-cb9c-4629-a7f6-4b6954245b6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422431, 'reachable_time': 32857, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216197, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.717 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ae23d225-7550-4f8b-8cb5-af577a4cdfd1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:db2b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 422431, 'tstamp': 422431}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216198, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.734 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[271bb944-7edc-456a-a37f-6748a6024bdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd5fa4f6-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:db:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422431, 'reachable_time': 32857, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216199, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.762 186548 DEBUG nova.network.neutron [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.763 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[be9c4323-ec7c-43f6-8e11-872d2efd6e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.803 186548 INFO nova.compute.manager [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Took 1.83 seconds to deallocate network for instance.
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.815 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e35873-2aa9-4332-8c04-46163d863df6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.816 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.817 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.817 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd5fa4f6-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.819 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:43 compute-0 NetworkManager[55036]: <info>  [1763797483.8199] manager: (tapcd5fa4f6-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 22 07:44:43 compute-0 kernel: tapcd5fa4f6-00: entered promiscuous mode
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.822 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.824 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd5fa4f6-00, col_values=(('external_ids', {'iface-id': 'f400467f-3f35-4435-bb4a-0b3da05366fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:44:43 compute-0 ovn_controller[94843]: 2025-11-22T07:44:43Z|00084|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.826 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.826 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f2203d1e-bec2-4631-b1b2-2b712258caa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.827 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.pid.haproxy
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.827 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:44:43.827 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'env', 'PROCESS_TAG=haproxy-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.837 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.878 186548 DEBUG nova.compute.manager [req-b933d0ca-88a7-4103-b8eb-19f53e1c85d7 req-e9aea4b8-a7e3-4dc0-89de-526defca2eb1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-deleted-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.898 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:43 compute-0 nova_compute[186544]: 2025-11-22 07:44:43.899 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:44 compute-0 podman[216231]: 2025-11-22 07:44:44.184179837 +0000 UTC m=+0.047830621 container create 5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:44:44 compute-0 systemd[1]: Started libpod-conmon-5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219.scope.
Nov 22 07:44:44 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:44:44 compute-0 podman[216231]: 2025-11-22 07:44:44.154387448 +0000 UTC m=+0.018038252 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.253 186548 DEBUG nova.compute.provider_tree [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:44:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c269ce05e9b5936262be6b8f5b0f23b3f071df227d282e4f87d74f17373afa3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:44:44 compute-0 podman[216231]: 2025-11-22 07:44:44.266656111 +0000 UTC m=+0.130306915 container init 5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 07:44:44 compute-0 podman[216231]: 2025-11-22 07:44:44.271753046 +0000 UTC m=+0.135403830 container start 5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.288 186548 DEBUG nova.scheduler.client.report [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:44:44 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216246]: [NOTICE]   (216250) : New worker (216252) forked
Nov 22 07:44:44 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216246]: [NOTICE]   (216250) : Loading success.
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.335 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.379 186548 DEBUG nova.compute.manager [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.380 186548 DEBUG oslo_concurrency.lockutils [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.380 186548 DEBUG oslo_concurrency.lockutils [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.381 186548 DEBUG oslo_concurrency.lockutils [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.381 186548 DEBUG nova.compute.manager [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] No waiting events found dispatching network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.381 186548 WARNING nova.compute.manager [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Received unexpected event network-vif-plugged-cc51d9a0-e170-42eb-b8db-2910ea320cb4 for instance with vm_state deleted and task_state None.
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.382 186548 DEBUG nova.compute.manager [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.382 186548 DEBUG oslo_concurrency.lockutils [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.382 186548 DEBUG oslo_concurrency.lockutils [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.382 186548 DEBUG oslo_concurrency.lockutils [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.383 186548 DEBUG nova.compute.manager [req-b2d155b7-98de-4453-8d3b-16f2baafa53a req-dc222782-7790-4376-a7bb-7a38f95d3b3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Processing event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.384 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.388 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797484.3881495, feb5ca5f-df67-4f29-9c21-71ba30b5af9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.389 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] VM Resumed (Lifecycle Event)
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.391 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.395 186548 INFO nova.virt.libvirt.driver [-] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Instance spawned successfully.
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.396 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.424 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.429 186548 INFO nova.scheduler.client.report [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Deleted allocations for instance 6d263548-4cc6-463b-b26b-cb43b0d069cd
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.431 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.432 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.432 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.432 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.433 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.433 186548 DEBUG nova.virt.libvirt.driver [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.437 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.479 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.548 186548 INFO nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Took 7.85 seconds to spawn the instance on the hypervisor.
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.549 186548 DEBUG nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.594 186548 DEBUG oslo_concurrency.lockutils [None req-f22a91de-cd6c-425b-85b7-89eff751d2fe 7c0fb56fc41e44dfa23a0d45149e78e3 9b004cb06df74de2903dae19345fd9c7 - - default default] Lock "6d263548-4cc6-463b-b26b-cb43b0d069cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.723 186548 INFO nova.compute.manager [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Took 8.50 seconds to build instance.
Nov 22 07:44:44 compute-0 nova_compute[186544]: 2025-11-22 07:44:44.790 186548 DEBUG oslo_concurrency.lockutils [None req-953c3a9a-cdcf-4ea4-9a34-40e4141eb0bd 4ca2e31d955040598948fa3da5d84888 74651b744925468db6c6e47d1397cc04 - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:45 compute-0 nova_compute[186544]: 2025-11-22 07:44:45.026 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:45 compute-0 nova_compute[186544]: 2025-11-22 07:44:45.531 186548 DEBUG nova.network.neutron [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updated VIF entry in instance network info cache for port 4ab0012c-e73f-4cd6-b146-527583d046f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:44:45 compute-0 nova_compute[186544]: 2025-11-22 07:44:45.532 186548 DEBUG nova.network.neutron [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updating instance_info_cache with network_info: [{"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:44:45 compute-0 nova_compute[186544]: 2025-11-22 07:44:45.582 186548 DEBUG oslo_concurrency.lockutils [req-a9f52bd0-6b35-4b14-a03e-5ea5d8677ff2 req-a287ba60-c9ab-4c73-b6f8-1f89f5f784fa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:44:46 compute-0 nova_compute[186544]: 2025-11-22 07:44:46.856 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:47 compute-0 nova_compute[186544]: 2025-11-22 07:44:47.061 186548 DEBUG nova.compute.manager [req-1a711dda-a5ad-4703-8cee-e199efc8d3b1 req-76e29be2-0743-4a33-a74f-6cfd72bd9538 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:47 compute-0 nova_compute[186544]: 2025-11-22 07:44:47.062 186548 DEBUG oslo_concurrency.lockutils [req-1a711dda-a5ad-4703-8cee-e199efc8d3b1 req-76e29be2-0743-4a33-a74f-6cfd72bd9538 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:47 compute-0 nova_compute[186544]: 2025-11-22 07:44:47.062 186548 DEBUG oslo_concurrency.lockutils [req-1a711dda-a5ad-4703-8cee-e199efc8d3b1 req-76e29be2-0743-4a33-a74f-6cfd72bd9538 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:47 compute-0 nova_compute[186544]: 2025-11-22 07:44:47.062 186548 DEBUG oslo_concurrency.lockutils [req-1a711dda-a5ad-4703-8cee-e199efc8d3b1 req-76e29be2-0743-4a33-a74f-6cfd72bd9538 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:47 compute-0 nova_compute[186544]: 2025-11-22 07:44:47.062 186548 DEBUG nova.compute.manager [req-1a711dda-a5ad-4703-8cee-e199efc8d3b1 req-76e29be2-0743-4a33-a74f-6cfd72bd9538 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:47 compute-0 nova_compute[186544]: 2025-11-22 07:44:47.063 186548 WARNING nova.compute.manager [req-1a711dda-a5ad-4703-8cee-e199efc8d3b1 req-76e29be2-0743-4a33-a74f-6cfd72bd9538 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received unexpected event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with vm_state active and task_state None.
Nov 22 07:44:48 compute-0 nova_compute[186544]: 2025-11-22 07:44:48.305 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797473.3040404, 36d5f234-9baf-48b6-a565-430378fe4068 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:48 compute-0 nova_compute[186544]: 2025-11-22 07:44:48.306 186548 INFO nova.compute.manager [-] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] VM Stopped (Lifecycle Event)
Nov 22 07:44:48 compute-0 nova_compute[186544]: 2025-11-22 07:44:48.322 186548 DEBUG nova.compute.manager [None req-c3e1c655-c0ff-4f1e-8edc-65e515f4bd3a - - - - - -] [instance: 36d5f234-9baf-48b6-a565-430378fe4068] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:50 compute-0 nova_compute[186544]: 2025-11-22 07:44:50.027 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:51 compute-0 nova_compute[186544]: 2025-11-22 07:44:51.222 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Check if temp file /var/lib/nova/instances/tmpi234bmsv exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 22 07:44:51 compute-0 nova_compute[186544]: 2025-11-22 07:44:51.223 186548 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi234bmsv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='feb5ca5f-df67-4f29-9c21-71ba30b5af9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 22 07:44:51 compute-0 ovn_controller[94843]: 2025-11-22T07:44:51Z|00085|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 07:44:51 compute-0 ovn_controller[94843]: 2025-11-22T07:44:51Z|00086|binding|INFO|Releasing lport ebed6d9f-62b8-40d5-8d3c-93d6149e3602 from this chassis (sb_readonly=0)
Nov 22 07:44:51 compute-0 nova_compute[186544]: 2025-11-22 07:44:51.714 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:51 compute-0 nova_compute[186544]: 2025-11-22 07:44:51.858 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:52 compute-0 podman[216262]: 2025-11-22 07:44:52.426027527 +0000 UTC m=+0.076728016 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible)
Nov 22 07:44:52 compute-0 podman[216263]: 2025-11-22 07:44:52.428219481 +0000 UTC m=+0.078299154 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 07:44:53 compute-0 nova_compute[186544]: 2025-11-22 07:44:53.153 186548 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:53 compute-0 nova_compute[186544]: 2025-11-22 07:44:53.221 186548 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:53 compute-0 nova_compute[186544]: 2025-11-22 07:44:53.223 186548 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:44:53 compute-0 nova_compute[186544]: 2025-11-22 07:44:53.282 186548 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:44:55 compute-0 nova_compute[186544]: 2025-11-22 07:44:55.061 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:56 compute-0 sshd-session[216310]: Accepted publickey for nova from 192.168.122.102 port 43480 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:44:56 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 07:44:56 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 07:44:56 compute-0 systemd-logind[821]: New session 34 of user nova.
Nov 22 07:44:56 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 07:44:56 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 07:44:56 compute-0 systemd[216314]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:44:56 compute-0 systemd[216314]: Queued start job for default target Main User Target.
Nov 22 07:44:56 compute-0 systemd[216314]: Created slice User Application Slice.
Nov 22 07:44:56 compute-0 systemd[216314]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:44:56 compute-0 systemd[216314]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 07:44:56 compute-0 systemd[216314]: Reached target Paths.
Nov 22 07:44:56 compute-0 systemd[216314]: Reached target Timers.
Nov 22 07:44:56 compute-0 systemd[216314]: Starting D-Bus User Message Bus Socket...
Nov 22 07:44:56 compute-0 systemd[216314]: Starting Create User's Volatile Files and Directories...
Nov 22 07:44:56 compute-0 systemd[216314]: Finished Create User's Volatile Files and Directories.
Nov 22 07:44:56 compute-0 systemd[216314]: Listening on D-Bus User Message Bus Socket.
Nov 22 07:44:56 compute-0 systemd[216314]: Reached target Sockets.
Nov 22 07:44:56 compute-0 systemd[216314]: Reached target Basic System.
Nov 22 07:44:56 compute-0 systemd[216314]: Reached target Main User Target.
Nov 22 07:44:56 compute-0 systemd[216314]: Startup finished in 138ms.
Nov 22 07:44:56 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 07:44:56 compute-0 systemd[1]: Started Session 34 of User nova.
Nov 22 07:44:56 compute-0 sshd-session[216310]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:44:56 compute-0 sshd-session[216329]: Received disconnect from 192.168.122.102 port 43480:11: disconnected by user
Nov 22 07:44:56 compute-0 sshd-session[216329]: Disconnected from user nova 192.168.122.102 port 43480
Nov 22 07:44:56 compute-0 sshd-session[216310]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:44:56 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Nov 22 07:44:56 compute-0 systemd-logind[821]: Session 34 logged out. Waiting for processes to exit.
Nov 22 07:44:56 compute-0 systemd-logind[821]: Removed session 34.
Nov 22 07:44:56 compute-0 nova_compute[186544]: 2025-11-22 07:44:56.833 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797481.8328693, 6d263548-4cc6-463b-b26b-cb43b0d069cd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:44:56 compute-0 nova_compute[186544]: 2025-11-22 07:44:56.835 186548 INFO nova.compute.manager [-] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] VM Stopped (Lifecycle Event)
Nov 22 07:44:56 compute-0 nova_compute[186544]: 2025-11-22 07:44:56.859 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:44:56 compute-0 nova_compute[186544]: 2025-11-22 07:44:56.862 186548 DEBUG nova.compute.manager [None req-4a78a5f6-4d30-449c-b93e-8caad043f4c0 - - - - - -] [instance: 6d263548-4cc6-463b-b26b-cb43b0d069cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:44:56 compute-0 podman[216342]: 2025-11-22 07:44:56.945024126 +0000 UTC m=+0.055507307 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:44:57 compute-0 ovn_controller[94843]: 2025-11-22T07:44:57Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:22:c9 10.100.0.5
Nov 22 07:44:57 compute-0 ovn_controller[94843]: 2025-11-22T07:44:57Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:22:c9 10.100.0.5
Nov 22 07:44:57 compute-0 nova_compute[186544]: 2025-11-22 07:44:57.496 186548 DEBUG nova.compute.manager [req-367b33b5-8399-4c62-bdd6-934f682202d6 req-95aba0d8-e2c9-4792-8b4c-18613ba2076a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:44:57 compute-0 nova_compute[186544]: 2025-11-22 07:44:57.496 186548 DEBUG oslo_concurrency.lockutils [req-367b33b5-8399-4c62-bdd6-934f682202d6 req-95aba0d8-e2c9-4792-8b4c-18613ba2076a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:44:57 compute-0 nova_compute[186544]: 2025-11-22 07:44:57.496 186548 DEBUG oslo_concurrency.lockutils [req-367b33b5-8399-4c62-bdd6-934f682202d6 req-95aba0d8-e2c9-4792-8b4c-18613ba2076a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:44:57 compute-0 nova_compute[186544]: 2025-11-22 07:44:57.496 186548 DEBUG oslo_concurrency.lockutils [req-367b33b5-8399-4c62-bdd6-934f682202d6 req-95aba0d8-e2c9-4792-8b4c-18613ba2076a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:44:57 compute-0 nova_compute[186544]: 2025-11-22 07:44:57.496 186548 DEBUG nova.compute.manager [req-367b33b5-8399-4c62-bdd6-934f682202d6 req-95aba0d8-e2c9-4792-8b4c-18613ba2076a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:44:57 compute-0 nova_compute[186544]: 2025-11-22 07:44:57.497 186548 DEBUG nova.compute.manager [req-367b33b5-8399-4c62-bdd6-934f682202d6 req-95aba0d8-e2c9-4792-8b4c-18613ba2076a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.013 186548 DEBUG nova.compute.manager [req-818e7799-f7ff-424f-9c75-2911304bd810 req-eab8d245-533a-462f-b6e0-49fe6e5e4e77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.013 186548 DEBUG oslo_concurrency.lockutils [req-818e7799-f7ff-424f-9c75-2911304bd810 req-eab8d245-533a-462f-b6e0-49fe6e5e4e77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.014 186548 DEBUG oslo_concurrency.lockutils [req-818e7799-f7ff-424f-9c75-2911304bd810 req-eab8d245-533a-462f-b6e0-49fe6e5e4e77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.014 186548 DEBUG oslo_concurrency.lockutils [req-818e7799-f7ff-424f-9c75-2911304bd810 req-eab8d245-533a-462f-b6e0-49fe6e5e4e77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.014 186548 DEBUG nova.compute.manager [req-818e7799-f7ff-424f-9c75-2911304bd810 req-eab8d245-533a-462f-b6e0-49fe6e5e4e77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.014 186548 WARNING nova.compute.manager [req-818e7799-f7ff-424f-9c75-2911304bd810 req-eab8d245-533a-462f-b6e0-49fe6e5e4e77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received unexpected event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with vm_state active and task_state migrating.
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.062 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.311 186548 INFO nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Took 7.03 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.311 186548 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.349 186548 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi234bmsv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='feb5ca5f-df67-4f29-9c21-71ba30b5af9c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0c8c09a3-0dab-45e1-b468-73fb3bc21014),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.376 186548 DEBUG nova.objects.instance [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lazy-loading 'migration_context' on Instance uuid feb5ca5f-df67-4f29-9c21-71ba30b5af9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.377 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.378 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.379 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 22 07:45:00 compute-0 podman[216368]: 2025-11-22 07:45:00.401229902 +0000 UTC m=+0.051854698 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.402 186548 DEBUG nova.virt.libvirt.vif [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:44:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1661145969',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1661145969',id=20,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:44:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-lpdner4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:44:44Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=feb5ca5f-df67-4f29-9c21-71ba30b5af9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.402 186548 DEBUG nova.network.os_vif_util [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.403 186548 DEBUG nova.network.os_vif_util [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.403 186548 DEBUG nova.virt.libvirt.migration [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updating guest XML with vif config: <interface type="ethernet">
Nov 22 07:45:00 compute-0 nova_compute[186544]:   <mac address="fa:16:3e:c2:22:c9"/>
Nov 22 07:45:00 compute-0 nova_compute[186544]:   <model type="virtio"/>
Nov 22 07:45:00 compute-0 nova_compute[186544]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:45:00 compute-0 nova_compute[186544]:   <mtu size="1442"/>
Nov 22 07:45:00 compute-0 nova_compute[186544]:   <target dev="tap4ab0012c-e7"/>
Nov 22 07:45:00 compute-0 nova_compute[186544]: </interface>
Nov 22 07:45:00 compute-0 nova_compute[186544]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.404 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.881 186548 DEBUG nova.virt.libvirt.migration [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:45:00 compute-0 nova_compute[186544]: 2025-11-22 07:45:00.881 186548 INFO nova.virt.libvirt.migration [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 22 07:45:01 compute-0 nova_compute[186544]: 2025-11-22 07:45:01.040 186548 INFO nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 22 07:45:01 compute-0 nova_compute[186544]: 2025-11-22 07:45:01.543 186548 DEBUG nova.virt.libvirt.migration [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:45:01 compute-0 nova_compute[186544]: 2025-11-22 07:45:01.544 186548 DEBUG nova.virt.libvirt.migration [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 07:45:01 compute-0 nova_compute[186544]: 2025-11-22 07:45:01.861 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.050 186548 DEBUG nova.virt.libvirt.migration [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.051 186548 DEBUG nova.virt.libvirt.migration [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.213 186548 DEBUG nova.compute.manager [req-bf3fa1a5-60ab-4406-94ce-35ca28cc5750 req-4c71e777-809e-45f7-a04d-a1045ef435ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-changed-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.213 186548 DEBUG nova.compute.manager [req-bf3fa1a5-60ab-4406-94ce-35ca28cc5750 req-4c71e777-809e-45f7-a04d-a1045ef435ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Refreshing instance network info cache due to event network-changed-4ab0012c-e73f-4cd6-b146-527583d046f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.213 186548 DEBUG oslo_concurrency.lockutils [req-bf3fa1a5-60ab-4406-94ce-35ca28cc5750 req-4c71e777-809e-45f7-a04d-a1045ef435ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.214 186548 DEBUG oslo_concurrency.lockutils [req-bf3fa1a5-60ab-4406-94ce-35ca28cc5750 req-4c71e777-809e-45f7-a04d-a1045ef435ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.214 186548 DEBUG nova.network.neutron [req-bf3fa1a5-60ab-4406-94ce-35ca28cc5750 req-4c71e777-809e-45f7-a04d-a1045ef435ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Refreshing network info cache for port 4ab0012c-e73f-4cd6-b146-527583d046f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.416 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797502.416066, feb5ca5f-df67-4f29-9c21-71ba30b5af9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.416 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] VM Paused (Lifecycle Event)
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.454 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.458 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.491 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 22 07:45:02 compute-0 kernel: tap4ab0012c-e7 (unregistering): left promiscuous mode
Nov 22 07:45:02 compute-0 NetworkManager[55036]: <info>  [1763797502.5780] device (tap4ab0012c-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.586 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:02 compute-0 ovn_controller[94843]: 2025-11-22T07:45:02Z|00087|binding|INFO|Releasing lport 4ab0012c-e73f-4cd6-b146-527583d046f3 from this chassis (sb_readonly=0)
Nov 22 07:45:02 compute-0 ovn_controller[94843]: 2025-11-22T07:45:02Z|00088|binding|INFO|Setting lport 4ab0012c-e73f-4cd6-b146-527583d046f3 down in Southbound
Nov 22 07:45:02 compute-0 ovn_controller[94843]: 2025-11-22T07:45:02Z|00089|binding|INFO|Releasing lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 from this chassis (sb_readonly=0)
Nov 22 07:45:02 compute-0 ovn_controller[94843]: 2025-11-22T07:45:02Z|00090|binding|INFO|Setting lport 77e99205-9615-4ea6-ab25-d16bf8bb4804 down in Southbound
Nov 22 07:45:02 compute-0 ovn_controller[94843]: 2025-11-22T07:45:02Z|00091|binding|INFO|Removing iface tap4ab0012c-e7 ovn-installed in OVS
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.589 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:02 compute-0 ovn_controller[94843]: 2025-11-22T07:45:02Z|00092|binding|INFO|Releasing lport f400467f-3f35-4435-bb4a-0b3da05366fb from this chassis (sb_readonly=0)
Nov 22 07:45:02 compute-0 ovn_controller[94843]: 2025-11-22T07:45:02Z|00093|binding|INFO|Releasing lport ebed6d9f-62b8-40d5-8d3c-93d6149e3602 from this chassis (sb_readonly=0)
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.599 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:ca:bc 19.80.0.156'], port_security=['fa:16:3e:ef:ca:bc 19.80.0.156'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['4ab0012c-e73f-4cd6-b146-527583d046f3'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1561653152', 'neutron:cidrs': '19.80.0.156/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2de8212b-d744-4bab-b451-7daef022c1bc', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1561653152', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '3', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=b9f16a9f-d373-4cb7-a13f-5e20d7a18db8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=77e99205-9615-4ea6-ab25-d16bf8bb4804) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.601 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:22:c9 10.100.0.5'], port_security=['fa:16:3e:c2:22:c9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '4984e16e-8f1c-4426-bfc6-5927f375ce79'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-971128270', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'feb5ca5f-df67-4f29-9c21-71ba30b5af9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-971128270', 'neutron:project_id': '74651b744925468db6c6e47d1397cc04', 'neutron:revision_number': '8', 'neutron:security_group_ids': '91f2be3c-33ea-422b-b9a4-1d9e92a850d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c3e272-b4ef-4625-a876-b23f3cbba9b7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=4ab0012c-e73f-4cd6-b146-527583d046f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.602 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 77e99205-9615-4ea6-ab25-d16bf8bb4804 in datapath 2de8212b-d744-4bab-b451-7daef022c1bc unbound from our chassis
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.603 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2de8212b-d744-4bab-b451-7daef022c1bc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.604 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd04bac-8f45-45c8-97c9-c683318557c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.605 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc namespace which is not needed anymore
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.614 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.684 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:02 compute-0 podman[216393]: 2025-11-22 07:45:02.691959705 +0000 UTC m=+0.094226484 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:45:02 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000014.scope: Deactivated successfully.
Nov 22 07:45:02 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000014.scope: Consumed 14.013s CPU time.
Nov 22 07:45:02 compute-0 systemd-machined[152872]: Machine qemu-11-instance-00000014 terminated.
Nov 22 07:45:02 compute-0 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216169]: [NOTICE]   (216173) : haproxy version is 2.8.14-c23fe91
Nov 22 07:45:02 compute-0 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216169]: [NOTICE]   (216173) : path to executable is /usr/sbin/haproxy
Nov 22 07:45:02 compute-0 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216169]: [WARNING]  (216173) : Exiting Master process...
Nov 22 07:45:02 compute-0 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216169]: [ALERT]    (216173) : Current worker (216175) exited with code 143 (Terminated)
Nov 22 07:45:02 compute-0 neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc[216169]: [WARNING]  (216173) : All workers exited. Exiting... (0)
Nov 22 07:45:02 compute-0 systemd[1]: libpod-db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287.scope: Deactivated successfully.
Nov 22 07:45:02 compute-0 podman[216432]: 2025-11-22 07:45:02.734455033 +0000 UTC m=+0.046990799 container died db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 07:45:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287-userdata-shm.mount: Deactivated successfully.
Nov 22 07:45:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-0d32c04185b0231056f43dd1ce95593057e4b75cee40f6dc856ae33d3c578c8b-merged.mount: Deactivated successfully.
Nov 22 07:45:02 compute-0 podman[216432]: 2025-11-22 07:45:02.771323655 +0000 UTC m=+0.083859421 container cleanup db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.809 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:02 compute-0 systemd[1]: libpod-conmon-db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287.scope: Deactivated successfully.
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.844 186548 DEBUG nova.virt.libvirt.guest [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.844 186548 INFO nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Migration operation has completed
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.845 186548 INFO nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] _post_live_migration() is started..
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.848 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.848 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.848 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 22 07:45:02 compute-0 podman[216470]: 2025-11-22 07:45:02.878752289 +0000 UTC m=+0.053704863 container remove db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.883 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c424ba4a-baa4-4769-a8dd-b90e100da8c0]: (4, ('Sat Nov 22 07:45:02 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc (db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287)\ndb27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287\nSat Nov 22 07:45:02 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc (db27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287)\ndb27d4dd2bcff9faccd083111666073e562b58314aa7c8f6a61ee3b958f5a287\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.885 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f87e3999-b78c-4eab-a2b9-11cb2fe200a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.886 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2de8212b-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.887 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:02 compute-0 kernel: tap2de8212b-d0: left promiscuous mode
Nov 22 07:45:02 compute-0 nova_compute[186544]: 2025-11-22 07:45:02.906 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.910 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cc739f1c-574e-4275-a8ab-87088f6daf5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.928 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fb60e921-992d-435c-b696-8472c34975e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.929 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[44d1323b-4fb5-44a6-9301-3c412d4cf424]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.945 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[636b5aeb-8dfb-4606-89e7-03b0aa03cd56]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422337, 'reachable_time': 26306, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216498, 'error': None, 'target': 'ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.947 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2de8212b-d744-4bab-b451-7daef022c1bc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.947 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[e31bdb31-bcb4-4436-8996-20bc32269b11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.948 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 4ab0012c-e73f-4cd6-b146-527583d046f3 in datapath cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 unbound from our chassis
Nov 22 07:45:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d2de8212b\x2dd744\x2d4bab\x2db451\x2d7daef022c1bc.mount: Deactivated successfully.
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.949 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.949 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4a624d-3809-49a0-a72e-97e3685ad4a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:02.950 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 namespace which is not needed anymore
Nov 22 07:45:03 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216246]: [NOTICE]   (216250) : haproxy version is 2.8.14-c23fe91
Nov 22 07:45:03 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216246]: [NOTICE]   (216250) : path to executable is /usr/sbin/haproxy
Nov 22 07:45:03 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216246]: [WARNING]  (216250) : Exiting Master process...
Nov 22 07:45:03 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216246]: [ALERT]    (216250) : Current worker (216252) exited with code 143 (Terminated)
Nov 22 07:45:03 compute-0 neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9[216246]: [WARNING]  (216250) : All workers exited. Exiting... (0)
Nov 22 07:45:03 compute-0 systemd[1]: libpod-5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219.scope: Deactivated successfully.
Nov 22 07:45:03 compute-0 podman[216517]: 2025-11-22 07:45:03.096353878 +0000 UTC m=+0.049730607 container died 5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 07:45:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219-userdata-shm.mount: Deactivated successfully.
Nov 22 07:45:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-6c269ce05e9b5936262be6b8f5b0f23b3f071df227d282e4f87d74f17373afa3-merged.mount: Deactivated successfully.
Nov 22 07:45:03 compute-0 podman[216517]: 2025-11-22 07:45:03.146717349 +0000 UTC m=+0.100094078 container cleanup 5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 07:45:03 compute-0 systemd[1]: libpod-conmon-5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219.scope: Deactivated successfully.
Nov 22 07:45:03 compute-0 podman[216546]: 2025-11-22 07:45:03.21145484 +0000 UTC m=+0.047291287 container remove 5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.216 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[df5d99aa-62e0-470a-9134-5d73500147ac]: (4, ('Sat Nov 22 07:45:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219)\n5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219\nSat Nov 22 07:45:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 (5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219)\n5f87f9d5e1cd379e69c8226609b4329f5ee303928ab5156728b2db4ef9367219\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.217 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2d047ae4-4568-4b3d-999d-cc45e2c469c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.218 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd5fa4f6-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.219 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:03 compute-0 kernel: tapcd5fa4f6-00: left promiscuous mode
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.234 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.235 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.236 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[673dbba2-189b-497a-8d1c-ac9fd3477add]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.260 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a00fb6c8-65d0-43c1-aaee-9dd890054b81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.261 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f9531120-8073-4066-a9b1-2f5786154d83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.275 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[101ee5ee-5cdb-46a1-afc8-c98b215a655a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 422424, 'reachable_time': 32137, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216564, 'error': None, 'target': 'ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.277 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:03.278 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ecae4fe0-55b1-49cc-aaf9-b5b574d24cea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:03 compute-0 systemd[1]: run-netns-ovnmeta\x2dcd5fa4f6\x2d0f1b\x2d41f2\x2d9643\x2d3c1a36620dc9.mount: Deactivated successfully.
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.774 186548 DEBUG nova.compute.manager [req-fd98799d-5114-4776-8b11-78c7f7048714 req-be399c76-6530-4af3-b257-3fde531963b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.774 186548 DEBUG oslo_concurrency.lockutils [req-fd98799d-5114-4776-8b11-78c7f7048714 req-be399c76-6530-4af3-b257-3fde531963b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.774 186548 DEBUG oslo_concurrency.lockutils [req-fd98799d-5114-4776-8b11-78c7f7048714 req-be399c76-6530-4af3-b257-3fde531963b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.775 186548 DEBUG oslo_concurrency.lockutils [req-fd98799d-5114-4776-8b11-78c7f7048714 req-be399c76-6530-4af3-b257-3fde531963b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.775 186548 DEBUG nova.compute.manager [req-fd98799d-5114-4776-8b11-78c7f7048714 req-be399c76-6530-4af3-b257-3fde531963b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.775 186548 DEBUG nova.compute.manager [req-fd98799d-5114-4776-8b11-78c7f7048714 req-be399c76-6530-4af3-b257-3fde531963b7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.883 186548 DEBUG nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.883 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.883 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.883 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.883 186548 DEBUG nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.884 186548 DEBUG nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-unplugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.884 186548 DEBUG nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.884 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.884 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.884 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.884 186548 DEBUG nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.885 186548 WARNING nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received unexpected event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with vm_state active and task_state migrating.
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.885 186548 DEBUG nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.885 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.885 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.885 186548 DEBUG oslo_concurrency.lockutils [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.885 186548 DEBUG nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:03 compute-0 nova_compute[186544]: 2025-11-22 07:45:03.886 186548 WARNING nova.compute.manager [req-35d0d0b8-bb67-4d43-94cb-25a911cfd67b req-896b630a-732f-4b8c-abf2-8aef1e1e39cc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received unexpected event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with vm_state active and task_state migrating.
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.341 186548 DEBUG nova.network.neutron [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Activated binding for port 4ab0012c-e73f-4cd6-b146-527583d046f3 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.342 186548 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.343 186548 DEBUG nova.virt.libvirt.vif [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:44:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1661145969',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1661145969',id=20,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:44:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74651b744925468db6c6e47d1397cc04',ramdisk_id='',reservation_id='r-lpdner4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1505701588',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1505701588-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:44:50Z,user_data=None,user_id='4ca2e31d955040598948fa3da5d84888',uuid=feb5ca5f-df67-4f29-9c21-71ba30b5af9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.343 186548 DEBUG nova.network.os_vif_util [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converting VIF {"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.344 186548 DEBUG nova.network.os_vif_util [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.344 186548 DEBUG os_vif [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.346 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.346 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ab0012c-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.348 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.350 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.354 186548 INFO os_vif [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:22:c9,bridge_name='br-int',has_traffic_filtering=True,id=4ab0012c-e73f-4cd6-b146-527583d046f3,network=Network(cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap4ab0012c-e7')
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.354 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.355 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.355 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.355 186548 DEBUG nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.356 186548 INFO nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Deleting instance files /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c_del
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.357 186548 INFO nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Deletion of /var/lib/nova/instances/feb5ca5f-df67-4f29-9c21-71ba30b5af9c_del complete
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.444 186548 DEBUG nova.network.neutron [req-bf3fa1a5-60ab-4406-94ce-35ca28cc5750 req-4c71e777-809e-45f7-a04d-a1045ef435ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updated VIF entry in instance network info cache for port 4ab0012c-e73f-4cd6-b146-527583d046f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.444 186548 DEBUG nova.network.neutron [req-bf3fa1a5-60ab-4406-94ce-35ca28cc5750 req-4c71e777-809e-45f7-a04d-a1045ef435ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Updating instance_info_cache with network_info: [{"id": "4ab0012c-e73f-4cd6-b146-527583d046f3", "address": "fa:16:3e:c2:22:c9", "network": {"id": "cd5fa4f6-0f1b-41f2-9643-3c1a36620dc9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-667943228-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74651b744925468db6c6e47d1397cc04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ab0012c-e7", "ovs_interfaceid": "4ab0012c-e73f-4cd6-b146-527583d046f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:45:04 compute-0 nova_compute[186544]: 2025-11-22 07:45:04.472 186548 DEBUG oslo_concurrency.lockutils [req-bf3fa1a5-60ab-4406-94ce-35ca28cc5750 req-4c71e777-809e-45f7-a04d-a1045ef435ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-feb5ca5f-df67-4f29-9c21-71ba30b5af9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:45:05 compute-0 nova_compute[186544]: 2025-11-22 07:45:05.063 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:06 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 07:45:06 compute-0 systemd[216314]: Activating special unit Exit the Session...
Nov 22 07:45:06 compute-0 systemd[216314]: Stopped target Main User Target.
Nov 22 07:45:06 compute-0 systemd[216314]: Stopped target Basic System.
Nov 22 07:45:06 compute-0 systemd[216314]: Stopped target Paths.
Nov 22 07:45:06 compute-0 systemd[216314]: Stopped target Sockets.
Nov 22 07:45:06 compute-0 systemd[216314]: Stopped target Timers.
Nov 22 07:45:06 compute-0 systemd[216314]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:45:06 compute-0 systemd[216314]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 07:45:06 compute-0 systemd[216314]: Closed D-Bus User Message Bus Socket.
Nov 22 07:45:06 compute-0 systemd[216314]: Stopped Create User's Volatile Files and Directories.
Nov 22 07:45:06 compute-0 systemd[216314]: Removed slice User Application Slice.
Nov 22 07:45:06 compute-0 systemd[216314]: Reached target Shutdown.
Nov 22 07:45:06 compute-0 systemd[216314]: Finished Exit the Session.
Nov 22 07:45:06 compute-0 systemd[216314]: Reached target Exit the Session.
Nov 22 07:45:06 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 07:45:06 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 07:45:06 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 07:45:06 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 07:45:06 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 07:45:06 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 07:45:06 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.735 186548 DEBUG nova.compute.manager [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.736 186548 DEBUG oslo_concurrency.lockutils [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.736 186548 DEBUG oslo_concurrency.lockutils [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.736 186548 DEBUG oslo_concurrency.lockutils [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.736 186548 DEBUG nova.compute.manager [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.736 186548 WARNING nova.compute.manager [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received unexpected event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with vm_state active and task_state migrating.
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.737 186548 DEBUG nova.compute.manager [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.737 186548 DEBUG oslo_concurrency.lockutils [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.737 186548 DEBUG oslo_concurrency.lockutils [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.737 186548 DEBUG oslo_concurrency.lockutils [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.737 186548 DEBUG nova.compute.manager [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] No waiting events found dispatching network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:06 compute-0 nova_compute[186544]: 2025-11-22 07:45:06.738 186548 WARNING nova.compute.manager [req-0c5e1acc-b875-41b6-8c65-4648e6bfc93b req-f966abae-a68b-4509-91f7-f17dea7b5e98 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Received unexpected event network-vif-plugged-4ab0012c-e73f-4cd6-b146-527583d046f3 for instance with vm_state active and task_state migrating.
Nov 22 07:45:08 compute-0 podman[216566]: 2025-11-22 07:45:08.400370931 +0000 UTC m=+0.049218003 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.349 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.555 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.555 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.555 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "feb5ca5f-df67-4f29-9c21-71ba30b5af9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.582 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.583 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.583 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.583 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.643 186548 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.707 186548 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.708 186548 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.766 186548 DEBUG oslo_concurrency.processutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.913 186548 WARNING nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.915 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5578MB free_disk=73.39615631103516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.915 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.915 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.960 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Migration for instance feb5ca5f-df67-4f29-9c21-71ba30b5af9c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 22 07:45:09 compute-0 nova_compute[186544]: 2025-11-22 07:45:09.980 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.010 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Instance 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.010 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Migration 0c8c09a3-0dab-45e1-b468-73fb3bc21014 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.011 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.011 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.031 186548 DEBUG nova.scheduler.client.report [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.047 186548 DEBUG nova.scheduler.client.report [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.047 186548 DEBUG nova.compute.provider_tree [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.060 186548 DEBUG nova.scheduler.client.report [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.064 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.078 186548 DEBUG nova.scheduler.client.report [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.139 186548 DEBUG nova.compute.provider_tree [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.154 186548 DEBUG nova.scheduler.client.report [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.180 186548 DEBUG nova.compute.resource_tracker [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.180 186548 DEBUG oslo_concurrency.lockutils [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.191 186548 INFO nova.compute.manager [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.282 186548 INFO nova.scheduler.client.report [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] Deleted allocation for migration 0c8c09a3-0dab-45e1-b468-73fb3bc21014
Nov 22 07:45:10 compute-0 nova_compute[186544]: 2025-11-22 07:45:10.282 186548 DEBUG nova.virt.libvirt.driver [None req-9c2eef8e-bae2-45c8-b97e-e8381eae9ee7 4a9f2fab15904ad4a5a624bf70ac8ed6 b13373072c22468abbe64db01a89c01f - - default default] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 22 07:45:12 compute-0 podman[216597]: 2025-11-22 07:45:12.398325326 +0000 UTC m=+0.052159035 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible)
Nov 22 07:45:14 compute-0 nova_compute[186544]: 2025-11-22 07:45:14.394 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:15 compute-0 nova_compute[186544]: 2025-11-22 07:45:15.066 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:16 compute-0 nova_compute[186544]: 2025-11-22 07:45:16.944 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:16 compute-0 nova_compute[186544]: 2025-11-22 07:45:16.944 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:16 compute-0 nova_compute[186544]: 2025-11-22 07:45:16.956 186548 DEBUG nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.040 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.041 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.046 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.046 186548 INFO nova.compute.claims [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.170 186548 DEBUG nova.compute.provider_tree [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.180 186548 DEBUG nova.scheduler.client.report [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.200 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.200 186548 DEBUG nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.247 186548 DEBUG nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.248 186548 DEBUG nova.network.neutron [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.260 186548 INFO nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.280 186548 DEBUG nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.398 186548 DEBUG nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.400 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.400 186548 INFO nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Creating image(s)
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.401 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "/var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.401 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "/var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.401 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "/var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.414 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.468 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.469 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.470 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.481 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.542 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.543 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.587 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.589 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.590 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.642 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.643 186548 DEBUG nova.virt.disk.api [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Checking if we can resize image /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.643 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.695 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.696 186548 DEBUG nova.virt.disk.api [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Cannot resize image /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.697 186548 DEBUG nova.objects.instance [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lazy-loading 'migration_context' on Instance uuid 179e2133-b753-4ea6-8a33-f1fe9c02cd04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.708 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.708 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Ensure instance console log exists: /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.709 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.709 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.710 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.816 186548 DEBUG nova.network.neutron [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.816 186548 DEBUG nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.818 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.821 186548 WARNING nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.825 186548 DEBUG nova.virt.libvirt.host [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.825 186548 DEBUG nova.virt.libvirt.host [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.828 186548 DEBUG nova.virt.libvirt.host [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.828 186548 DEBUG nova.virt.libvirt.host [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.829 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.830 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.830 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.830 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.831 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.831 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.831 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.831 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.832 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.832 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.832 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.832 186548 DEBUG nova.virt.hardware [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.836 186548 DEBUG nova.objects.instance [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lazy-loading 'pci_devices' on Instance uuid 179e2133-b753-4ea6-8a33-f1fe9c02cd04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.844 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797502.843225, feb5ca5f-df67-4f29-9c21-71ba30b5af9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.844 186548 INFO nova.compute.manager [-] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] VM Stopped (Lifecycle Event)
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.848 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <uuid>179e2133-b753-4ea6-8a33-f1fe9c02cd04</uuid>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <name>instance-00000018</name>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersAdminNegativeTestJSON-server-385468381</nova:name>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:45:17</nova:creationTime>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:45:17 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:45:17 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:45:17 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:45:17 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:45:17 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:45:17 compute-0 nova_compute[186544]:         <nova:user uuid="9dc3d104549745fdaace1dd5280da2f2">tempest-ServersAdminNegativeTestJSON-1347039048-project-member</nova:user>
Nov 22 07:45:17 compute-0 nova_compute[186544]:         <nova:project uuid="8778f7e37a30439da41d0a6b383be684">tempest-ServersAdminNegativeTestJSON-1347039048</nova:project>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <system>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <entry name="serial">179e2133-b753-4ea6-8a33-f1fe9c02cd04</entry>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <entry name="uuid">179e2133-b753-4ea6-8a33-f1fe9c02cd04</entry>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     </system>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <os>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   </os>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <features>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   </features>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk.config"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/console.log" append="off"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <video>
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     </video>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:45:17 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:45:17 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:45:17 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:45:17 compute-0 nova_compute[186544]: </domain>
Nov 22 07:45:17 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.878 186548 DEBUG nova.compute.manager [None req-b5ad91dc-e6e6-46d7-887f-4b33349b9cf9 - - - - - -] [instance: feb5ca5f-df67-4f29-9c21-71ba30b5af9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.899 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.899 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:45:17 compute-0 nova_compute[186544]: 2025-11-22 07:45:17.900 186548 INFO nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Using config drive
Nov 22 07:45:18 compute-0 nova_compute[186544]: 2025-11-22 07:45:18.139 186548 INFO nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Creating config drive at /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk.config
Nov 22 07:45:18 compute-0 nova_compute[186544]: 2025-11-22 07:45:18.143 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphzy_5ju6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:18 compute-0 nova_compute[186544]: 2025-11-22 07:45:18.264 186548 DEBUG oslo_concurrency.processutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphzy_5ju6" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:18 compute-0 systemd-machined[152872]: New machine qemu-12-instance-00000018.
Nov 22 07:45:18 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000018.
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.106 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797519.1062405, 179e2133-b753-4ea6-8a33-f1fe9c02cd04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.107 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] VM Resumed (Lifecycle Event)
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.109 186548 DEBUG nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.109 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.112 186548 INFO nova.virt.libvirt.driver [-] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Instance spawned successfully.
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.112 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.132 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.135 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.142 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.143 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.143 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.144 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.144 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.144 186548 DEBUG nova.virt.libvirt.driver [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.191 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.192 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797519.1088588, 179e2133-b753-4ea6-8a33-f1fe9c02cd04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.192 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] VM Started (Lifecycle Event)
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.235 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.238 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.363 186548 INFO nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Took 1.96 seconds to spawn the instance on the hypervisor.
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.363 186548 DEBUG nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.390 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.658 186548 INFO nova.compute.manager [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Took 2.65 seconds to build instance.
Nov 22 07:45:19 compute-0 nova_compute[186544]: 2025-11-22 07:45:19.682 186548 DEBUG oslo_concurrency.lockutils [None req-aea8bdbf-f412-44ff-92bf-bcb3c1abf091 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:20 compute-0 nova_compute[186544]: 2025-11-22 07:45:20.067 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:21 compute-0 nova_compute[186544]: 2025-11-22 07:45:21.259 186548 DEBUG nova.objects.instance [None req-0d7f848d-996d-4a33-865a-8ea0cb4273c0 fc92fb0501f24b2d9c919f2272400b81 91daf3d0c3d94a598db29a6e3ee4052d - - default default] Lazy-loading 'pci_devices' on Instance uuid 179e2133-b753-4ea6-8a33-f1fe9c02cd04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:45:21 compute-0 nova_compute[186544]: 2025-11-22 07:45:21.277 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797521.2763886, 179e2133-b753-4ea6-8a33-f1fe9c02cd04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:21 compute-0 nova_compute[186544]: 2025-11-22 07:45:21.277 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] VM Paused (Lifecycle Event)
Nov 22 07:45:21 compute-0 nova_compute[186544]: 2025-11-22 07:45:21.305 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:21 compute-0 nova_compute[186544]: 2025-11-22 07:45:21.308 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:45:21 compute-0 nova_compute[186544]: 2025-11-22 07:45:21.330 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 22 07:45:21 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000018.scope: Deactivated successfully.
Nov 22 07:45:21 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000018.scope: Consumed 3.025s CPU time.
Nov 22 07:45:21 compute-0 systemd-machined[152872]: Machine qemu-12-instance-00000018 terminated.
Nov 22 07:45:21 compute-0 nova_compute[186544]: 2025-11-22 07:45:21.910 186548 DEBUG nova.compute.manager [None req-0d7f848d-996d-4a33-865a-8ea0cb4273c0 fc92fb0501f24b2d9c919f2272400b81 91daf3d0c3d94a598db29a6e3ee4052d - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:23 compute-0 podman[216672]: 2025-11-22 07:45:23.457853029 +0000 UTC m=+0.105972601 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:45:23 compute-0 podman[216673]: 2025-11-22 07:45:23.475942861 +0000 UTC m=+0.124541515 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:45:24 compute-0 nova_compute[186544]: 2025-11-22 07:45:24.399 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:25 compute-0 nova_compute[186544]: 2025-11-22 07:45:25.068 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.837 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.837 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.838 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.838 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.838 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.845 186548 INFO nova.compute.manager [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Terminating instance
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.852 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "refresh_cache-179e2133-b753-4ea6-8a33-f1fe9c02cd04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.853 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquired lock "refresh_cache-179e2133-b753-4ea6-8a33-f1fe9c02cd04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:45:26 compute-0 nova_compute[186544]: 2025-11-22 07:45:26.853 186548 DEBUG nova.network.neutron [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.133 186548 DEBUG nova.network.neutron [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:45:27 compute-0 podman[216731]: 2025-11-22 07:45:27.403195278 +0000 UTC m=+0.047719757 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.602 186548 DEBUG nova.network.neutron [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.636 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Releasing lock "refresh_cache-179e2133-b753-4ea6-8a33-f1fe9c02cd04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.637 186548 DEBUG nova.compute.manager [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.645 186548 INFO nova.virt.libvirt.driver [-] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Instance destroyed successfully.
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.646 186548 DEBUG nova.objects.instance [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lazy-loading 'resources' on Instance uuid 179e2133-b753-4ea6-8a33-f1fe9c02cd04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.657 186548 INFO nova.virt.libvirt.driver [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Deleting instance files /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04_del
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.658 186548 INFO nova.virt.libvirt.driver [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Deletion of /var/lib/nova/instances/179e2133-b753-4ea6-8a33-f1fe9c02cd04_del complete
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.730 186548 INFO nova.compute.manager [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Took 0.09 seconds to destroy the instance on the hypervisor.
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.731 186548 DEBUG oslo.service.loopingcall [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.731 186548 DEBUG nova.compute.manager [-] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:45:27 compute-0 nova_compute[186544]: 2025-11-22 07:45:27.731 186548 DEBUG nova.network.neutron [-] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.291 186548 DEBUG nova.network.neutron [-] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.313 186548 DEBUG nova.network.neutron [-] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.332 186548 INFO nova.compute.manager [-] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Took 0.60 seconds to deallocate network for instance.
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.427 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.428 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.504 186548 DEBUG nova.compute.provider_tree [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.516 186548 DEBUG nova.scheduler.client.report [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.534 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.566 186548 INFO nova.scheduler.client.report [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Deleted allocations for instance 179e2133-b753-4ea6-8a33-f1fe9c02cd04
Nov 22 07:45:28 compute-0 nova_compute[186544]: 2025-11-22 07:45:28.643 186548 DEBUG oslo_concurrency.lockutils [None req-76b83adb-26fe-4acd-a4d6-1949f1288834 9dc3d104549745fdaace1dd5280da2f2 8778f7e37a30439da41d0a6b383be684 - - default default] Lock "179e2133-b753-4ea6-8a33-f1fe9c02cd04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:29 compute-0 nova_compute[186544]: 2025-11-22 07:45:29.401 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.071 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.181 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.182 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.261 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.323 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.324 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.381 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.533 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.535 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5529MB free_disk=73.3961296081543GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.535 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.536 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.623 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.623 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.623 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.667 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.681 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.904 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:45:30 compute-0 nova_compute[186544]: 2025-11-22 07:45:30.904 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:31 compute-0 podman[216762]: 2025-11-22 07:45:31.41783302 +0000 UTC m=+0.055368233 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.133 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.134 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.162 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.238 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.238 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.245 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.245 186548 INFO nova.compute.claims [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.414 186548 DEBUG nova.compute.provider_tree [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.426 186548 DEBUG nova.scheduler.client.report [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.468 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.469 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.525 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.526 186548 DEBUG nova.network.neutron [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.545 186548 INFO nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.573 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.722 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.724 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.724 186548 INFO nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Creating image(s)
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.725 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "/var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.725 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "/var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.726 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "/var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.738 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.753 186548 DEBUG nova.policy [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6c0f7d40bf1447dcbaf452e10ca66973', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2dcc66faee714eda8d202a96a56299c1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.796 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.797 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.797 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.808 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.861 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.862 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.893 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.894 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.894 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.946 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.948 186548 DEBUG nova.virt.disk.api [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Checking if we can resize image /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:45:32 compute-0 nova_compute[186544]: 2025-11-22 07:45:32.948 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.002 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.004 186548 DEBUG nova.virt.disk.api [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Cannot resize image /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.004 186548 DEBUG nova.objects.instance [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lazy-loading 'migration_context' on Instance uuid 2f7c9a98-a5d7-429a-afdd-d0e698e82707 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.017 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.018 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Ensure instance console log exists: /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.018 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.018 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.018 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:33 compute-0 podman[216796]: 2025-11-22 07:45:33.418654708 +0000 UTC m=+0.060689034 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 22 07:45:33 compute-0 nova_compute[186544]: 2025-11-22 07:45:33.905 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:34 compute-0 nova_compute[186544]: 2025-11-22 07:45:34.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:34 compute-0 nova_compute[186544]: 2025-11-22 07:45:34.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:45:34 compute-0 nova_compute[186544]: 2025-11-22 07:45:34.203 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:45:34 compute-0 nova_compute[186544]: 2025-11-22 07:45:34.203 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:34 compute-0 nova_compute[186544]: 2025-11-22 07:45:34.203 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:45:34 compute-0 nova_compute[186544]: 2025-11-22 07:45:34.432 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:34 compute-0 nova_compute[186544]: 2025-11-22 07:45:34.704 186548 DEBUG nova.network.neutron [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Successfully created port: 5f017af5-2156-4188-9b3a-5ffb3f49f735 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:45:35 compute-0 nova_compute[186544]: 2025-11-22 07:45:35.072 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:35 compute-0 nova_compute[186544]: 2025-11-22 07:45:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:35.817 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:45:35 compute-0 nova_compute[186544]: 2025-11-22 07:45:35.817 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:35.818 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.083 186548 DEBUG nova.network.neutron [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Successfully updated port: 5f017af5-2156-4188-9b3a-5ffb3f49f735 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.102 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "refresh_cache-2f7c9a98-a5d7-429a-afdd-d0e698e82707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.102 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquired lock "refresh_cache-2f7c9a98-a5d7-429a-afdd-d0e698e82707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.102 186548 DEBUG nova.network.neutron [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.265 186548 DEBUG nova.compute.manager [req-11014211-4f09-4559-937e-7591881c87a6 req-1cff6c8d-fdaf-4b2a-a1b6-641d92ec1dbb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received event network-changed-5f017af5-2156-4188-9b3a-5ffb3f49f735 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.265 186548 DEBUG nova.compute.manager [req-11014211-4f09-4559-937e-7591881c87a6 req-1cff6c8d-fdaf-4b2a-a1b6-641d92ec1dbb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Refreshing instance network info cache due to event network-changed-5f017af5-2156-4188-9b3a-5ffb3f49f735. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.265 186548 DEBUG oslo_concurrency.lockutils [req-11014211-4f09-4559-937e-7591881c87a6 req-1cff6c8d-fdaf-4b2a-a1b6-641d92ec1dbb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-2f7c9a98-a5d7-429a-afdd-d0e698e82707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.657 186548 DEBUG nova.network.neutron [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:45:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:36.820 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.911 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797521.909827, 179e2133-b753-4ea6-8a33-f1fe9c02cd04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.912 186548 INFO nova.compute.manager [-] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] VM Stopped (Lifecycle Event)
Nov 22 07:45:36 compute-0 nova_compute[186544]: 2025-11-22 07:45:36.937 186548 DEBUG nova.compute.manager [None req-f1ea0f03-954a-4500-9c35-5bb4c965218a - - - - - -] [instance: 179e2133-b753-4ea6-8a33-f1fe9c02cd04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:37 compute-0 nova_compute[186544]: 2025-11-22 07:45:37.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:37.314 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:37.314 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:37.314 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.214 186548 DEBUG nova.network.neutron [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Updating instance_info_cache with network_info: [{"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.596 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Releasing lock "refresh_cache-2f7c9a98-a5d7-429a-afdd-d0e698e82707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.597 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Instance network_info: |[{"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.597 186548 DEBUG oslo_concurrency.lockutils [req-11014211-4f09-4559-937e-7591881c87a6 req-1cff6c8d-fdaf-4b2a-a1b6-641d92ec1dbb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-2f7c9a98-a5d7-429a-afdd-d0e698e82707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.598 186548 DEBUG nova.network.neutron [req-11014211-4f09-4559-937e-7591881c87a6 req-1cff6c8d-fdaf-4b2a-a1b6-641d92ec1dbb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Refreshing network info cache for port 5f017af5-2156-4188-9b3a-5ffb3f49f735 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.601 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Start _get_guest_xml network_info=[{"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.606 186548 WARNING nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.610 186548 DEBUG nova.virt.libvirt.host [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.611 186548 DEBUG nova.virt.libvirt.host [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.613 186548 DEBUG nova.virt.libvirt.host [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.614 186548 DEBUG nova.virt.libvirt.host [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.615 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.615 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.616 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.616 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.617 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.617 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.617 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.617 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.618 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.618 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.618 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.618 186548 DEBUG nova.virt.hardware [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.623 186548 DEBUG nova.virt.libvirt.vif [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1403503271',display_name='tempest-ImagesOneServerTestJSON-server-1403503271',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1403503271',id=25,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2dcc66faee714eda8d202a96a56299c1',ramdisk_id='',reservation_id='r-da36031p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1228060188',owner_user_name='tempest-ImagesOneServerTestJSON-1228060188-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:45:32Z,user_data=None,user_id='6c0f7d40bf1447dcbaf452e10ca66973',uuid=2f7c9a98-a5d7-429a-afdd-d0e698e82707,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.623 186548 DEBUG nova.network.os_vif_util [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Converting VIF {"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.626 186548 DEBUG nova.network.os_vif_util [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:72:22,bridge_name='br-int',has_traffic_filtering=True,id=5f017af5-2156-4188-9b3a-5ffb3f49f735,network=Network(ca132bee-7714-47f8-abaf-f0c28bf1cbee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f017af5-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.627 186548 DEBUG nova.objects.instance [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f7c9a98-a5d7-429a-afdd-d0e698e82707 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.639 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <uuid>2f7c9a98-a5d7-429a-afdd-d0e698e82707</uuid>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <name>instance-00000019</name>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <nova:name>tempest-ImagesOneServerTestJSON-server-1403503271</nova:name>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:45:38</nova:creationTime>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:45:38 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:45:38 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:45:38 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:45:38 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:45:38 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:45:38 compute-0 nova_compute[186544]:         <nova:user uuid="6c0f7d40bf1447dcbaf452e10ca66973">tempest-ImagesOneServerTestJSON-1228060188-project-member</nova:user>
Nov 22 07:45:38 compute-0 nova_compute[186544]:         <nova:project uuid="2dcc66faee714eda8d202a96a56299c1">tempest-ImagesOneServerTestJSON-1228060188</nova:project>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:45:38 compute-0 nova_compute[186544]:         <nova:port uuid="5f017af5-2156-4188-9b3a-5ffb3f49f735">
Nov 22 07:45:38 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <system>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <entry name="serial">2f7c9a98-a5d7-429a-afdd-d0e698e82707</entry>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <entry name="uuid">2f7c9a98-a5d7-429a-afdd-d0e698e82707</entry>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </system>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <os>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   </os>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <features>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   </features>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk.config"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:c7:72:22"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <target dev="tap5f017af5-21"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/console.log" append="off"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <video>
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </video>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:45:38 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:45:38 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:45:38 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:45:38 compute-0 nova_compute[186544]: </domain>
Nov 22 07:45:38 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.640 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Preparing to wait for external event network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.642 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.643 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.643 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.644 186548 DEBUG nova.virt.libvirt.vif [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1403503271',display_name='tempest-ImagesOneServerTestJSON-server-1403503271',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1403503271',id=25,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2dcc66faee714eda8d202a96a56299c1',ramdisk_id='',reservation_id='r-da36031p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1228060188',owner_user_name='tempest-ImagesOneServerTestJSON-1228060188-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:45:32Z,user_data=None,user_id='6c0f7d40bf1447dcbaf452e10ca66973',uuid=2f7c9a98-a5d7-429a-afdd-d0e698e82707,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.644 186548 DEBUG nova.network.os_vif_util [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Converting VIF {"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.645 186548 DEBUG nova.network.os_vif_util [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:72:22,bridge_name='br-int',has_traffic_filtering=True,id=5f017af5-2156-4188-9b3a-5ffb3f49f735,network=Network(ca132bee-7714-47f8-abaf-f0c28bf1cbee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f017af5-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.645 186548 DEBUG os_vif [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:72:22,bridge_name='br-int',has_traffic_filtering=True,id=5f017af5-2156-4188-9b3a-5ffb3f49f735,network=Network(ca132bee-7714-47f8-abaf-f0c28bf1cbee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f017af5-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.646 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.647 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.648 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.651 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.652 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f017af5-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.652 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5f017af5-21, col_values=(('external_ids', {'iface-id': '5f017af5-2156-4188-9b3a-5ffb3f49f735', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:72:22', 'vm-uuid': '2f7c9a98-a5d7-429a-afdd-d0e698e82707'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.654 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:38 compute-0 NetworkManager[55036]: <info>  [1763797538.6550] manager: (tap5f017af5-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.657 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.660 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.661 186548 INFO os_vif [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:72:22,bridge_name='br-int',has_traffic_filtering=True,id=5f017af5-2156-4188-9b3a-5ffb3f49f735,network=Network(ca132bee-7714-47f8-abaf-f0c28bf1cbee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f017af5-21')
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.705 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.706 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.706 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] No VIF found with MAC fa:16:3e:c7:72:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:45:38 compute-0 nova_compute[186544]: 2025-11-22 07:45:38.706 186548 INFO nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Using config drive
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.255 186548 INFO nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Creating config drive at /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk.config
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.261 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc48t5evx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.390 186548 DEBUG oslo_concurrency.processutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc48t5evx" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:39 compute-0 podman[216822]: 2025-11-22 07:45:39.403651025 +0000 UTC m=+0.051936540 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:45:39 compute-0 kernel: tap5f017af5-21: entered promiscuous mode
Nov 22 07:45:39 compute-0 NetworkManager[55036]: <info>  [1763797539.4595] manager: (tap5f017af5-21): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Nov 22 07:45:39 compute-0 ovn_controller[94843]: 2025-11-22T07:45:39Z|00094|binding|INFO|Claiming lport 5f017af5-2156-4188-9b3a-5ffb3f49f735 for this chassis.
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.459 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 ovn_controller[94843]: 2025-11-22T07:45:39Z|00095|binding|INFO|5f017af5-2156-4188-9b3a-5ffb3f49f735: Claiming fa:16:3e:c7:72:22 10.100.0.5
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.464 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.481 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:72:22 10.100.0.5'], port_security=['fa:16:3e:c7:72:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2f7c9a98-a5d7-429a-afdd-d0e698e82707', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca132bee-7714-47f8-abaf-f0c28bf1cbee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2dcc66faee714eda8d202a96a56299c1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0722b417-e846-4e24-b0bf-11b5d0bd7732', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ec4ac96-bec6-4012-b178-66b7b52fdfc6, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=5f017af5-2156-4188-9b3a-5ffb3f49f735) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.482 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 5f017af5-2156-4188-9b3a-5ffb3f49f735 in datapath ca132bee-7714-47f8-abaf-f0c28bf1cbee bound to our chassis
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.483 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca132bee-7714-47f8-abaf-f0c28bf1cbee
Nov 22 07:45:39 compute-0 systemd-udevd[216858]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.495 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[183f5ded-b44c-490f-add7-c31f9f0cf2f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.496 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapca132bee-71 in ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.498 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapca132bee-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.498 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ecad6a6f-f57d-400e-afa9-9c49b42de655]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.500 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf4b18f-1734-43ea-8143-56297c575146]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 NetworkManager[55036]: <info>  [1763797539.5098] device (tap5f017af5-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:45:39 compute-0 NetworkManager[55036]: <info>  [1763797539.5106] device (tap5f017af5-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:45:39 compute-0 systemd-machined[152872]: New machine qemu-13-instance-00000019.
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.512 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd0aa01-6302-464b-b132-e7f2cd933c45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.519 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000019.
Nov 22 07:45:39 compute-0 ovn_controller[94843]: 2025-11-22T07:45:39Z|00096|binding|INFO|Setting lport 5f017af5-2156-4188-9b3a-5ffb3f49f735 ovn-installed in OVS
Nov 22 07:45:39 compute-0 ovn_controller[94843]: 2025-11-22T07:45:39Z|00097|binding|INFO|Setting lport 5f017af5-2156-4188-9b3a-5ffb3f49f735 up in Southbound
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.525 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.527 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[40ce9ab8-79bf-478b-afbc-b2d3e87ca55d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.554 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc2f26e-8eac-4fe9-b386-3ddf905ae8e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 NetworkManager[55036]: <info>  [1763797539.5610] manager: (tapca132bee-70): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.560 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9517466d-f235-4468-a41c-6b349c934752]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.590 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[cf9bb565-fcce-41d2-b15f-0854d2faf126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.593 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[432f55ce-3e7f-4f0b-aee6-b5713daa7d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 NetworkManager[55036]: <info>  [1763797539.6158] device (tapca132bee-70): carrier: link connected
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.619 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e7acd842-5545-4d97-9b8a-b1f722422e82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.636 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5e05865b-9a52-4627-9af2-796443cc7a62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca132bee-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:60:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428025, 'reachable_time': 19130, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216893, 'error': None, 'target': 'ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.651 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ad6092-09d8-4816-bf95-145419acbe57]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:60c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428025, 'tstamp': 428025}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216894, 'error': None, 'target': 'ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.667 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c681d1ea-82a0-4feb-b6ff-f100aa8c7116]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca132bee-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:60:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428025, 'reachable_time': 19130, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216895, 'error': None, 'target': 'ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.697 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1e0a271f-7ab1-4dba-b2bc-83a202dcc533]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.748 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bc528cb9-def5-4f17-a25a-c42fa9732742]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.750 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca132bee-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.751 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.751 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca132bee-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.753 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 NetworkManager[55036]: <info>  [1763797539.7537] manager: (tapca132bee-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 22 07:45:39 compute-0 kernel: tapca132bee-70: entered promiscuous mode
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.755 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.756 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca132bee-70, col_values=(('external_ids', {'iface-id': 'cf20c44b-603f-4b36-b203-6ef2f1c29791'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.757 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 ovn_controller[94843]: 2025-11-22T07:45:39Z|00098|binding|INFO|Releasing lport cf20c44b-603f-4b36-b203-6ef2f1c29791 from this chassis (sb_readonly=0)
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.758 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.759 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ca132bee-7714-47f8-abaf-f0c28bf1cbee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ca132bee-7714-47f8-abaf-f0c28bf1cbee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.759 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0cced2-b403-47de-a4f6-44b5e580b633]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.760 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-ca132bee-7714-47f8-abaf-f0c28bf1cbee
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/ca132bee-7714-47f8-abaf-f0c28bf1cbee.pid.haproxy
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID ca132bee-7714-47f8-abaf-f0c28bf1cbee
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:45:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:39.761 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee', 'env', 'PROCESS_TAG=haproxy-ca132bee-7714-47f8-abaf-f0c28bf1cbee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ca132bee-7714-47f8-abaf-f0c28bf1cbee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.768 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.812 186548 DEBUG nova.compute.manager [req-18aa5303-63b3-4efc-a060-a4561d52bcce req-d02dadfb-12c7-47e7-89be-b60b9f40a08b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received event network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.812 186548 DEBUG oslo_concurrency.lockutils [req-18aa5303-63b3-4efc-a060-a4561d52bcce req-d02dadfb-12c7-47e7-89be-b60b9f40a08b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.813 186548 DEBUG oslo_concurrency.lockutils [req-18aa5303-63b3-4efc-a060-a4561d52bcce req-d02dadfb-12c7-47e7-89be-b60b9f40a08b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.813 186548 DEBUG oslo_concurrency.lockutils [req-18aa5303-63b3-4efc-a060-a4561d52bcce req-d02dadfb-12c7-47e7-89be-b60b9f40a08b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:39 compute-0 nova_compute[186544]: 2025-11-22 07:45:39.813 186548 DEBUG nova.compute.manager [req-18aa5303-63b3-4efc-a060-a4561d52bcce req-d02dadfb-12c7-47e7-89be-b60b9f40a08b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Processing event network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.106 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:40 compute-0 podman[216930]: 2025-11-22 07:45:40.167086092 +0000 UTC m=+0.061507024 container create cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 07:45:40 compute-0 systemd[1]: Started libpod-conmon-cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861.scope.
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.197 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.198 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797540.1980042, 2f7c9a98-a5d7-429a-afdd-d0e698e82707 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.199 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] VM Started (Lifecycle Event)
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.201 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.205 186548 INFO nova.virt.libvirt.driver [-] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Instance spawned successfully.
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.206 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:45:40 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.222 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:40 compute-0 podman[216930]: 2025-11-22 07:45:40.12975462 +0000 UTC m=+0.024175582 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:45:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2793f8a02e47a4228a69962a0c6e8ece67219ca4baff32a46b4c51f42f45b6f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.232 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.235 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.235 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.236 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.236 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.236 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.236 186548 DEBUG nova.virt.libvirt.driver [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:45:40 compute-0 podman[216930]: 2025-11-22 07:45:40.239519672 +0000 UTC m=+0.133940634 container init cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 07:45:40 compute-0 podman[216930]: 2025-11-22 07:45:40.24596296 +0000 UTC m=+0.140383892 container start cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:45:40 compute-0 neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee[216951]: [NOTICE]   (216955) : New worker (216957) forked
Nov 22 07:45:40 compute-0 neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee[216951]: [NOTICE]   (216955) : Loading success.
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.282 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.282 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797540.1981964, 2f7c9a98-a5d7-429a-afdd-d0e698e82707 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.282 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] VM Paused (Lifecycle Event)
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.329 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.332 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797540.2010581, 2f7c9a98-a5d7-429a-afdd-d0e698e82707 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.333 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] VM Resumed (Lifecycle Event)
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.353 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.357 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.414 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.419 186548 INFO nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Took 7.70 seconds to spawn the instance on the hypervisor.
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.419 186548 DEBUG nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.500 186548 INFO nova.compute.manager [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Took 8.29 seconds to build instance.
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.522 186548 DEBUG oslo_concurrency.lockutils [None req-857a46a4-10a8-4cb1-b5d4-73177686057a 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.537 186548 DEBUG nova.network.neutron [req-11014211-4f09-4559-937e-7591881c87a6 req-1cff6c8d-fdaf-4b2a-a1b6-641d92ec1dbb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Updated VIF entry in instance network info cache for port 5f017af5-2156-4188-9b3a-5ffb3f49f735. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.537 186548 DEBUG nova.network.neutron [req-11014211-4f09-4559-937e-7591881c87a6 req-1cff6c8d-fdaf-4b2a-a1b6-641d92ec1dbb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Updating instance_info_cache with network_info: [{"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:45:40 compute-0 nova_compute[186544]: 2025-11-22 07:45:40.552 186548 DEBUG oslo_concurrency.lockutils [req-11014211-4f09-4559-937e-7591881c87a6 req-1cff6c8d-fdaf-4b2a-a1b6-641d92ec1dbb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-2f7c9a98-a5d7-429a-afdd-d0e698e82707" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:45:41 compute-0 nova_compute[186544]: 2025-11-22 07:45:41.929 186548 DEBUG nova.compute.manager [req-026490c7-9a7d-4cd2-b1d0-d7859b1f8842 req-1287c776-a46f-47b0-8024-b04779a685e1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received event network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:41 compute-0 nova_compute[186544]: 2025-11-22 07:45:41.930 186548 DEBUG oslo_concurrency.lockutils [req-026490c7-9a7d-4cd2-b1d0-d7859b1f8842 req-1287c776-a46f-47b0-8024-b04779a685e1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:41 compute-0 nova_compute[186544]: 2025-11-22 07:45:41.930 186548 DEBUG oslo_concurrency.lockutils [req-026490c7-9a7d-4cd2-b1d0-d7859b1f8842 req-1287c776-a46f-47b0-8024-b04779a685e1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:41 compute-0 nova_compute[186544]: 2025-11-22 07:45:41.930 186548 DEBUG oslo_concurrency.lockutils [req-026490c7-9a7d-4cd2-b1d0-d7859b1f8842 req-1287c776-a46f-47b0-8024-b04779a685e1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:41 compute-0 nova_compute[186544]: 2025-11-22 07:45:41.930 186548 DEBUG nova.compute.manager [req-026490c7-9a7d-4cd2-b1d0-d7859b1f8842 req-1287c776-a46f-47b0-8024-b04779a685e1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] No waiting events found dispatching network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:41 compute-0 nova_compute[186544]: 2025-11-22 07:45:41.931 186548 WARNING nova.compute.manager [req-026490c7-9a7d-4cd2-b1d0-d7859b1f8842 req-1287c776-a46f-47b0-8024-b04779a685e1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received unexpected event network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 for instance with vm_state active and task_state None.
Nov 22 07:45:42 compute-0 nova_compute[186544]: 2025-11-22 07:45:42.558 186548 DEBUG nova.compute.manager [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:42 compute-0 nova_compute[186544]: 2025-11-22 07:45:42.632 186548 INFO nova.compute.manager [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] instance snapshotting
Nov 22 07:45:42 compute-0 nova_compute[186544]: 2025-11-22 07:45:42.945 186548 INFO nova.virt.libvirt.driver [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Beginning live snapshot process
Nov 22 07:45:43 compute-0 virtqemud[186092]: invalid argument: disk vda does not have an active block job
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.114 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.172 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.173 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.236 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.250 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.310 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.312 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp_3cpn_bw/04bd4d70902740f3a1e414d5abb60c83.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.353 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp_3cpn_bw/04bd4d70902740f3a1e414d5abb60c83.delta 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.354 186548 INFO nova.virt.libvirt.driver [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.406 186548 DEBUG nova.virt.libvirt.guest [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.410 186548 INFO nova.virt.libvirt.driver [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 22 07:45:43 compute-0 podman[216976]: 2025-11-22 07:45:43.41511544 +0000 UTC m=+0.059181797 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.457 186548 DEBUG nova.privsep.utils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.458 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp_3cpn_bw/04bd4d70902740f3a1e414d5abb60c83.delta /var/lib/nova/instances/snapshots/tmp_3cpn_bw/04bd4d70902740f3a1e414d5abb60c83 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.654 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.657 186548 DEBUG oslo_concurrency.processutils [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp_3cpn_bw/04bd4d70902740f3a1e414d5abb60c83.delta /var/lib/nova/instances/snapshots/tmp_3cpn_bw/04bd4d70902740f3a1e414d5abb60c83" returned: 0 in 0.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:43 compute-0 nova_compute[186544]: 2025-11-22 07:45:43.658 186548 INFO nova.virt.libvirt.driver [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Snapshot extracted, beginning image upload
Nov 22 07:45:45 compute-0 nova_compute[186544]: 2025-11-22 07:45:45.111 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:46 compute-0 nova_compute[186544]: 2025-11-22 07:45:46.905 186548 INFO nova.virt.libvirt.driver [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Snapshot image upload complete
Nov 22 07:45:46 compute-0 nova_compute[186544]: 2025-11-22 07:45:46.906 186548 INFO nova.compute.manager [None req-ea27261e-6022-49b3-82d2-195b2b0731cc 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Took 4.27 seconds to snapshot the instance on the hypervisor.
Nov 22 07:45:47 compute-0 nova_compute[186544]: 2025-11-22 07:45:47.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:45:48 compute-0 nova_compute[186544]: 2025-11-22 07:45:48.657 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.035 186548 DEBUG nova.compute.manager [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.092 186548 INFO nova.compute.manager [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] instance snapshotting
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.395 186548 INFO nova.virt.libvirt.driver [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Beginning live snapshot process
Nov 22 07:45:50 compute-0 virtqemud[186092]: invalid argument: disk vda does not have an active block job
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.638 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.698 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json -f qcow2" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.700 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.759 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.777 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.838 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.840 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp3i6p1j8i/329b658d6ceb45ae9af8d56f2d81d236.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.879 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp3i6p1j8i/329b658d6ceb45ae9af8d56f2d81d236.delta 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.881 186548 INFO nova.virt.libvirt.driver [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.929 186548 DEBUG nova.virt.libvirt.guest [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.932 186548 INFO nova.virt.libvirt.driver [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.962 186548 DEBUG nova.privsep.utils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:45:50 compute-0 nova_compute[186544]: 2025-11-22 07:45:50.963 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp3i6p1j8i/329b658d6ceb45ae9af8d56f2d81d236.delta /var/lib/nova/instances/snapshots/tmp3i6p1j8i/329b658d6ceb45ae9af8d56f2d81d236 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:45:51 compute-0 nova_compute[186544]: 2025-11-22 07:45:51.111 186548 DEBUG oslo_concurrency.processutils [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp3i6p1j8i/329b658d6ceb45ae9af8d56f2d81d236.delta /var/lib/nova/instances/snapshots/tmp3i6p1j8i/329b658d6ceb45ae9af8d56f2d81d236" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:45:51 compute-0 nova_compute[186544]: 2025-11-22 07:45:51.112 186548 INFO nova.virt.libvirt.driver [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Snapshot extracted, beginning image upload
Nov 22 07:45:53 compute-0 nova_compute[186544]: 2025-11-22 07:45:53.298 186548 INFO nova.virt.libvirt.driver [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Snapshot image upload complete
Nov 22 07:45:53 compute-0 nova_compute[186544]: 2025-11-22 07:45:53.299 186548 INFO nova.compute.manager [None req-e262cdfe-d23b-4080-a668-5bd87290737c 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Took 3.20 seconds to snapshot the instance on the hypervisor.
Nov 22 07:45:53 compute-0 nova_compute[186544]: 2025-11-22 07:45:53.659 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:54 compute-0 podman[217050]: 2025-11-22 07:45:54.400522901 +0000 UTC m=+0.053640932 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:45:54 compute-0 podman[217051]: 2025-11-22 07:45:54.426551537 +0000 UTC m=+0.077189398 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:45:54 compute-0 ovn_controller[94843]: 2025-11-22T07:45:54Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c7:72:22 10.100.0.5
Nov 22 07:45:54 compute-0 ovn_controller[94843]: 2025-11-22T07:45:54Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c7:72:22 10.100.0.5
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.895 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.896 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.896 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.896 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.897 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.904 186548 INFO nova.compute.manager [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Terminating instance
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.910 186548 DEBUG nova.compute.manager [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:45:55 compute-0 kernel: tap5f017af5-21 (unregistering): left promiscuous mode
Nov 22 07:45:55 compute-0 NetworkManager[55036]: <info>  [1763797555.9288] device (tap5f017af5-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.936 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:55 compute-0 ovn_controller[94843]: 2025-11-22T07:45:55Z|00099|binding|INFO|Releasing lport 5f017af5-2156-4188-9b3a-5ffb3f49f735 from this chassis (sb_readonly=0)
Nov 22 07:45:55 compute-0 ovn_controller[94843]: 2025-11-22T07:45:55Z|00100|binding|INFO|Setting lport 5f017af5-2156-4188-9b3a-5ffb3f49f735 down in Southbound
Nov 22 07:45:55 compute-0 ovn_controller[94843]: 2025-11-22T07:45:55Z|00101|binding|INFO|Removing iface tap5f017af5-21 ovn-installed in OVS
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.939 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:55.947 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:72:22 10.100.0.5'], port_security=['fa:16:3e:c7:72:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2f7c9a98-a5d7-429a-afdd-d0e698e82707', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca132bee-7714-47f8-abaf-f0c28bf1cbee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2dcc66faee714eda8d202a96a56299c1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0722b417-e846-4e24-b0bf-11b5d0bd7732', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8ec4ac96-bec6-4012-b178-66b7b52fdfc6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=5f017af5-2156-4188-9b3a-5ffb3f49f735) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:45:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:55.948 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 5f017af5-2156-4188-9b3a-5ffb3f49f735 in datapath ca132bee-7714-47f8-abaf-f0c28bf1cbee unbound from our chassis
Nov 22 07:45:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:55.949 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca132bee-7714-47f8-abaf-f0c28bf1cbee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:45:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:55.951 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b7812f-44e9-4e85-8a4c-fd2316c54d9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:55.951 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee namespace which is not needed anymore
Nov 22 07:45:55 compute-0 nova_compute[186544]: 2025-11-22 07:45:55.955 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:55 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000019.scope: Deactivated successfully.
Nov 22 07:45:55 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000019.scope: Consumed 13.850s CPU time.
Nov 22 07:45:55 compute-0 systemd-machined[152872]: Machine qemu-13-instance-00000019 terminated.
Nov 22 07:45:56 compute-0 neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee[216951]: [NOTICE]   (216955) : haproxy version is 2.8.14-c23fe91
Nov 22 07:45:56 compute-0 neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee[216951]: [NOTICE]   (216955) : path to executable is /usr/sbin/haproxy
Nov 22 07:45:56 compute-0 neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee[216951]: [WARNING]  (216955) : Exiting Master process...
Nov 22 07:45:56 compute-0 neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee[216951]: [ALERT]    (216955) : Current worker (216957) exited with code 143 (Terminated)
Nov 22 07:45:56 compute-0 neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee[216951]: [WARNING]  (216955) : All workers exited. Exiting... (0)
Nov 22 07:45:56 compute-0 systemd[1]: libpod-cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861.scope: Deactivated successfully.
Nov 22 07:45:56 compute-0 podman[217120]: 2025-11-22 07:45:56.079015961 +0000 UTC m=+0.046261871 container died cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 07:45:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861-userdata-shm.mount: Deactivated successfully.
Nov 22 07:45:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-2793f8a02e47a4228a69962a0c6e8ece67219ca4baff32a46b4c51f42f45b6f5-merged.mount: Deactivated successfully.
Nov 22 07:45:56 compute-0 podman[217120]: 2025-11-22 07:45:56.116519448 +0000 UTC m=+0.083765358 container cleanup cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 07:45:56 compute-0 systemd[1]: libpod-conmon-cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861.scope: Deactivated successfully.
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.178 186548 INFO nova.virt.libvirt.driver [-] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Instance destroyed successfully.
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.180 186548 DEBUG nova.objects.instance [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lazy-loading 'resources' on Instance uuid 2f7c9a98-a5d7-429a-afdd-d0e698e82707 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:45:56 compute-0 podman[217154]: 2025-11-22 07:45:56.186419746 +0000 UTC m=+0.045606946 container remove cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.191 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0428cf54-fe27-42a0-8fc6-7fcb5c70e3f3]: (4, ('Sat Nov 22 07:45:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee (cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861)\ncd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861\nSat Nov 22 07:45:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee (cd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861)\ncd9e6271b05f42a84747ce6cf4c8728f7e0668fb2b8d8d652eff25c665cb8861\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.192 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[79448c8d-2a71-42c7-8b2e-c0928eeecd8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.192 186548 DEBUG nova.virt.libvirt.vif [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:45:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1403503271',display_name='tempest-ImagesOneServerTestJSON-server-1403503271',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1403503271',id=25,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:45:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2dcc66faee714eda8d202a96a56299c1',ramdisk_id='',reservation_id='r-da36031p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-1228060188',owner_user_name='tempest-ImagesOneServerTestJSON-1228060188-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:45:53Z,user_data=None,user_id='6c0f7d40bf1447dcbaf452e10ca66973',uuid=2f7c9a98-a5d7-429a-afdd-d0e698e82707,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.193 186548 DEBUG nova.network.os_vif_util [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Converting VIF {"id": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "address": "fa:16:3e:c7:72:22", "network": {"id": "ca132bee-7714-47f8-abaf-f0c28bf1cbee", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-792408710-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2dcc66faee714eda8d202a96a56299c1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f017af5-21", "ovs_interfaceid": "5f017af5-2156-4188-9b3a-5ffb3f49f735", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.194 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca132bee-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.194 186548 DEBUG nova.network.os_vif_util [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:72:22,bridge_name='br-int',has_traffic_filtering=True,id=5f017af5-2156-4188-9b3a-5ffb3f49f735,network=Network(ca132bee-7714-47f8-abaf-f0c28bf1cbee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f017af5-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.195 186548 DEBUG os_vif [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:72:22,bridge_name='br-int',has_traffic_filtering=True,id=5f017af5-2156-4188-9b3a-5ffb3f49f735,network=Network(ca132bee-7714-47f8-abaf-f0c28bf1cbee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f017af5-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:45:56 compute-0 kernel: tapca132bee-70: left promiscuous mode
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.199 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.200 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f017af5-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.200 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.213 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[61475872-86bb-4f4a-954c-e8d960dc22db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.215 186548 INFO os_vif [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:72:22,bridge_name='br-int',has_traffic_filtering=True,id=5f017af5-2156-4188-9b3a-5ffb3f49f735,network=Network(ca132bee-7714-47f8-abaf-f0c28bf1cbee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f017af5-21')
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.216 186548 INFO nova.virt.libvirt.driver [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Deleting instance files /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707_del
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.216 186548 INFO nova.virt.libvirt.driver [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Deletion of /var/lib/nova/instances/2f7c9a98-a5d7-429a-afdd-d0e698e82707_del complete
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.228 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e95e449e-cae3-4a29-85e2-c274401510f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.229 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[81223159-c41d-490c-a21f-8a6303f516d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.241 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1abe6f35-f882-4799-bba0-8cf1ae8d1dd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428018, 'reachable_time': 39707, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217185, 'error': None, 'target': 'ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.243 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ca132bee-7714-47f8-abaf-f0c28bf1cbee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:45:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:45:56.243 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[041c222d-a8f4-4414-bbea-19f5897bcb6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:45:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dca132bee\x2d7714\x2d47f8\x2dabaf\x2df0c28bf1cbee.mount: Deactivated successfully.
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.298 186548 INFO nova.compute.manager [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.301 186548 DEBUG oslo.service.loopingcall [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.301 186548 DEBUG nova.compute.manager [-] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.302 186548 DEBUG nova.network.neutron [-] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.483 186548 DEBUG nova.compute.manager [req-fdcf75fd-6cdf-4599-8316-3b3f1bc7de97 req-63a1ff76-1356-4844-bf6f-2840261820b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received event network-vif-unplugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.484 186548 DEBUG oslo_concurrency.lockutils [req-fdcf75fd-6cdf-4599-8316-3b3f1bc7de97 req-63a1ff76-1356-4844-bf6f-2840261820b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.485 186548 DEBUG oslo_concurrency.lockutils [req-fdcf75fd-6cdf-4599-8316-3b3f1bc7de97 req-63a1ff76-1356-4844-bf6f-2840261820b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.485 186548 DEBUG oslo_concurrency.lockutils [req-fdcf75fd-6cdf-4599-8316-3b3f1bc7de97 req-63a1ff76-1356-4844-bf6f-2840261820b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.485 186548 DEBUG nova.compute.manager [req-fdcf75fd-6cdf-4599-8316-3b3f1bc7de97 req-63a1ff76-1356-4844-bf6f-2840261820b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] No waiting events found dispatching network-vif-unplugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:56 compute-0 nova_compute[186544]: 2025-11-22 07:45:56.485 186548 DEBUG nova.compute.manager [req-fdcf75fd-6cdf-4599-8316-3b3f1bc7de97 req-63a1ff76-1356-4844-bf6f-2840261820b3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received event network-vif-unplugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.448 186548 DEBUG nova.network.neutron [-] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.483 186548 INFO nova.compute.manager [-] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Took 1.18 seconds to deallocate network for instance.
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.611 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.611 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.695 186548 DEBUG nova.compute.provider_tree [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.708 186548 DEBUG nova.scheduler.client.report [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.733 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.772 186548 INFO nova.scheduler.client.report [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Deleted allocations for instance 2f7c9a98-a5d7-429a-afdd-d0e698e82707
Nov 22 07:45:57 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 07:45:57 compute-0 podman[217187]: 2025-11-22 07:45:57.84909096 +0000 UTC m=+0.049254495 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:45:57 compute-0 nova_compute[186544]: 2025-11-22 07:45:57.867 186548 DEBUG oslo_concurrency.lockutils [None req-3782aa2b-0730-4de7-8de6-4e40e6d57bbb 6c0f7d40bf1447dcbaf452e10ca66973 2dcc66faee714eda8d202a96a56299c1 - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.972s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:58 compute-0 nova_compute[186544]: 2025-11-22 07:45:58.608 186548 DEBUG nova.compute.manager [req-1a089278-1a77-4dd5-8494-d977b91aea67 req-c97f2310-8d09-43ba-9d4b-972dfd547df6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received event network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:45:58 compute-0 nova_compute[186544]: 2025-11-22 07:45:58.608 186548 DEBUG oslo_concurrency.lockutils [req-1a089278-1a77-4dd5-8494-d977b91aea67 req-c97f2310-8d09-43ba-9d4b-972dfd547df6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:45:58 compute-0 nova_compute[186544]: 2025-11-22 07:45:58.609 186548 DEBUG oslo_concurrency.lockutils [req-1a089278-1a77-4dd5-8494-d977b91aea67 req-c97f2310-8d09-43ba-9d4b-972dfd547df6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:45:58 compute-0 nova_compute[186544]: 2025-11-22 07:45:58.609 186548 DEBUG oslo_concurrency.lockutils [req-1a089278-1a77-4dd5-8494-d977b91aea67 req-c97f2310-8d09-43ba-9d4b-972dfd547df6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "2f7c9a98-a5d7-429a-afdd-d0e698e82707-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:45:58 compute-0 nova_compute[186544]: 2025-11-22 07:45:58.609 186548 DEBUG nova.compute.manager [req-1a089278-1a77-4dd5-8494-d977b91aea67 req-c97f2310-8d09-43ba-9d4b-972dfd547df6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] No waiting events found dispatching network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:45:58 compute-0 nova_compute[186544]: 2025-11-22 07:45:58.609 186548 WARNING nova.compute.manager [req-1a089278-1a77-4dd5-8494-d977b91aea67 req-c97f2310-8d09-43ba-9d4b-972dfd547df6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received unexpected event network-vif-plugged-5f017af5-2156-4188-9b3a-5ffb3f49f735 for instance with vm_state deleted and task_state None.
Nov 22 07:45:58 compute-0 nova_compute[186544]: 2025-11-22 07:45:58.610 186548 DEBUG nova.compute.manager [req-1a089278-1a77-4dd5-8494-d977b91aea67 req-c97f2310-8d09-43ba-9d4b-972dfd547df6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Received event network-vif-deleted-5f017af5-2156-4188-9b3a-5ffb3f49f735 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:00 compute-0 nova_compute[186544]: 2025-11-22 07:46:00.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:01 compute-0 nova_compute[186544]: 2025-11-22 07:46:01.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:02 compute-0 podman[217211]: 2025-11-22 07:46:02.397977399 +0000 UTC m=+0.047602735 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:46:04 compute-0 nova_compute[186544]: 2025-11-22 07:46:04.307 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:04 compute-0 podman[217230]: 2025-11-22 07:46:04.411039506 +0000 UTC m=+0.058517011 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.116 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.488 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.488 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.513 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.667 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.668 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.675 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.675 186548 INFO nova.compute.claims [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.873 186548 DEBUG nova.compute.provider_tree [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.885 186548 DEBUG nova.scheduler.client.report [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.906 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.907 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.952 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.953 186548 DEBUG nova.network.neutron [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.974 186548 INFO nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:46:05 compute-0 nova_compute[186544]: 2025-11-22 07:46:05.992 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.125 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.128 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.129 186548 INFO nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating image(s)
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.130 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.130 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.131 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.144 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.203 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.206 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.207 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.207 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.223 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.280 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.282 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.322 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.324 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.324 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.382 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.384 186548 DEBUG nova.virt.disk.api [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Checking if we can resize image /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.385 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.408 186548 DEBUG nova.policy [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8a738b980aad493b9a21da7d5a5ccf8a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.454 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.455 186548 DEBUG nova.virt.disk.api [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Cannot resize image /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.455 186548 DEBUG nova.objects.instance [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lazy-loading 'migration_context' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.469 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.470 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Ensure instance console log exists: /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.471 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.471 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:06 compute-0 nova_compute[186544]: 2025-11-22 07:46:06.472 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:07 compute-0 nova_compute[186544]: 2025-11-22 07:46:07.016 186548 DEBUG nova.network.neutron [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Successfully created port: 973cdfc2-4ad8-4f41-b383-4b64b1b5433f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.021 186548 DEBUG nova.network.neutron [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Successfully updated port: 973cdfc2-4ad8-4f41-b383-4b64b1b5433f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.030 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.031 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.051 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.052 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquired lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.052 186548 DEBUG nova.network.neutron [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.067 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.164 186548 DEBUG nova.compute.manager [req-bdde01bd-1631-473b-b52d-c7aad4e4f5b8 req-e09a75e8-19f6-464f-9bae-07ae6be9dbf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-changed-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.165 186548 DEBUG nova.compute.manager [req-bdde01bd-1631-473b-b52d-c7aad4e4f5b8 req-e09a75e8-19f6-464f-9bae-07ae6be9dbf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Refreshing instance network info cache due to event network-changed-973cdfc2-4ad8-4f41-b383-4b64b1b5433f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.165 186548 DEBUG oslo_concurrency.lockutils [req-bdde01bd-1631-473b-b52d-c7aad4e4f5b8 req-e09a75e8-19f6-464f-9bae-07ae6be9dbf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.207 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.208 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.218 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.218 186548 INFO nova.compute.claims [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.383 186548 DEBUG nova.network.neutron [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.465 186548 DEBUG nova.compute.provider_tree [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.485 186548 DEBUG nova.scheduler.client.report [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.515 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.516 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.563 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.563 186548 DEBUG nova.network.neutron [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.583 186548 INFO nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.603 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.702 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.705 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.706 186548 INFO nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Creating image(s)
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.706 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "/var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.706 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "/var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.707 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "/var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.719 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.776 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.777 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.778 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.789 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.854 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.856 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.895 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.896 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.897 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.956 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.958 186548 DEBUG nova.virt.disk.api [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Checking if we can resize image /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:46:08 compute-0 nova_compute[186544]: 2025-11-22 07:46:08.960 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.008 186548 DEBUG nova.policy [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.024 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.025 186548 DEBUG nova.virt.disk.api [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Cannot resize image /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.026 186548 DEBUG nova.objects.instance [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lazy-loading 'migration_context' on Instance uuid 67b32f80-a795-43b0-bcb0-aeda0d9506d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.041 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.042 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Ensure instance console log exists: /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.043 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.043 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.044 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.505 186548 DEBUG nova.network.neutron [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.537 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Releasing lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.537 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Instance network_info: |[{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.538 186548 DEBUG oslo_concurrency.lockutils [req-bdde01bd-1631-473b-b52d-c7aad4e4f5b8 req-e09a75e8-19f6-464f-9bae-07ae6be9dbf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.539 186548 DEBUG nova.network.neutron [req-bdde01bd-1631-473b-b52d-c7aad4e4f5b8 req-e09a75e8-19f6-464f-9bae-07ae6be9dbf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Refreshing network info cache for port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.541 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Start _get_guest_xml network_info=[{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.546 186548 WARNING nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.553 186548 DEBUG nova.virt.libvirt.host [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.554 186548 DEBUG nova.virt.libvirt.host [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.563 186548 DEBUG nova.virt.libvirt.host [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.564 186548 DEBUG nova.virt.libvirt.host [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.565 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.565 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.566 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.566 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.567 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.567 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.567 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.567 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.567 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.568 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.568 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.568 186548 DEBUG nova.virt.hardware [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.572 186548 DEBUG nova.virt.libvirt.vif [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-memb
er'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:06Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.572 186548 DEBUG nova.network.os_vif_util [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.573 186548 DEBUG nova.network.os_vif_util [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.574 186548 DEBUG nova.objects.instance [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lazy-loading 'pci_devices' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.597 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <uuid>09e8ccfc-6ae9-4a06-ae76-7e059f50ac44</uuid>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <name>instance-0000001c</name>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <nova:name>tempest-LiveMigrationTest-server-810629940</nova:name>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:46:09</nova:creationTime>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:46:09 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:46:09 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:46:09 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:46:09 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:46:09 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:46:09 compute-0 nova_compute[186544]:         <nova:user uuid="8a738b980aad493b9a21da7d5a5ccf8a">tempest-LiveMigrationTest-2093743563-project-member</nova:user>
Nov 22 07:46:09 compute-0 nova_compute[186544]:         <nova:project uuid="d48bda61691e4f778b6d30c0dc773a30">tempest-LiveMigrationTest-2093743563</nova:project>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:46:09 compute-0 nova_compute[186544]:         <nova:port uuid="973cdfc2-4ad8-4f41-b383-4b64b1b5433f">
Nov 22 07:46:09 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <system>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <entry name="serial">09e8ccfc-6ae9-4a06-ae76-7e059f50ac44</entry>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <entry name="uuid">09e8ccfc-6ae9-4a06-ae76-7e059f50ac44</entry>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </system>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <os>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   </os>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <features>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   </features>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:02:9b:98"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <target dev="tap973cdfc2-4a"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/console.log" append="off"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <video>
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </video>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:46:09 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:46:09 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:46:09 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:46:09 compute-0 nova_compute[186544]: </domain>
Nov 22 07:46:09 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.598 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Preparing to wait for external event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.599 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.599 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.599 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.600 186548 DEBUG nova.virt.libvirt.vif [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:06Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.600 186548 DEBUG nova.network.os_vif_util [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.601 186548 DEBUG nova.network.os_vif_util [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.601 186548 DEBUG os_vif [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.602 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.602 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.603 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.605 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.605 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap973cdfc2-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.606 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap973cdfc2-4a, col_values=(('external_ids', {'iface-id': '973cdfc2-4ad8-4f41-b383-4b64b1b5433f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:9b:98', 'vm-uuid': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:09 compute-0 NetworkManager[55036]: <info>  [1763797569.6082] manager: (tap973cdfc2-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.610 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.612 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.613 186548 INFO os_vif [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a')
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.664 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.664 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.664 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] No VIF found with MAC fa:16:3e:02:9b:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:46:09 compute-0 nova_compute[186544]: 2025-11-22 07:46:09.665 186548 INFO nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Using config drive
Nov 22 07:46:09 compute-0 podman[217283]: 2025-11-22 07:46:09.693957924 +0000 UTC m=+0.048220200 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.117 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.180 186548 DEBUG nova.network.neutron [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Successfully created port: 775bdf61-ad38-473b-9d53-c3195f2be472 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.287 186548 INFO nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating config drive at /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.292 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbr9cbh0y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.417 186548 DEBUG oslo_concurrency.processutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbr9cbh0y" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:10 compute-0 kernel: tap973cdfc2-4a: entered promiscuous mode
Nov 22 07:46:10 compute-0 NetworkManager[55036]: <info>  [1763797570.4701] manager: (tap973cdfc2-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Nov 22 07:46:10 compute-0 systemd-udevd[217319]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:46:10 compute-0 ovn_controller[94843]: 2025-11-22T07:46:10Z|00102|binding|INFO|Claiming lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f for this chassis.
Nov 22 07:46:10 compute-0 ovn_controller[94843]: 2025-11-22T07:46:10Z|00103|binding|INFO|973cdfc2-4ad8-4f41-b383-4b64b1b5433f: Claiming fa:16:3e:02:9b:98 10.100.0.14
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.505 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.509 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.513 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:10 compute-0 NetworkManager[55036]: <info>  [1763797570.5161] device (tap973cdfc2-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:46:10 compute-0 NetworkManager[55036]: <info>  [1763797570.5169] device (tap973cdfc2-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.518 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:9b:98 10.100.0.14'], port_security=['fa:16:3e:02:9b:98 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=973cdfc2-4ad8-4f41-b383-4b64b1b5433f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.519 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 bound to our chassis
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.521 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.530 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[43d84834-de23-4d3b-92c4-6f3650a15da1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.532 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc3f966e1-81 in ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:46:10 compute-0 systemd-machined[152872]: New machine qemu-14-instance-0000001c.
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.533 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc3f966e1-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.533 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c45f2b37-0e0b-482f-9f39-4caf9602d9ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.534 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9c36da69-9e58-41dc-8c21-46ff976ef16a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.545 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[d1aa177b-8671-491e-a268-ff52e850d69d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.562 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[58bf5005-fcb5-4450-863a-1082235f1899]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_controller[94843]: 2025-11-22T07:46:10Z|00104|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f ovn-installed in OVS
Nov 22 07:46:10 compute-0 ovn_controller[94843]: 2025-11-22T07:46:10Z|00105|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f up in Southbound
Nov 22 07:46:10 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-0000001c.
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.590 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f172d4fe-fd03-4122-a8a4-11f7913c4319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.594 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ccdd2ee0-312d-4453-912b-afbf17dbc313]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 NetworkManager[55036]: <info>  [1763797570.5959] manager: (tapc3f966e1-80): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.621 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d3c3dc-a6a9-481c-bdb2-fe91dfb19c75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.626 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[292babfe-8c83-42d5-b3bf-ebfd508d9b78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 NetworkManager[55036]: <info>  [1763797570.6467] device (tapc3f966e1-80): carrier: link connected
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.652 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[bda936b7-c187-463d-966a-5ccfd97995ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.666 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[55890160-3ce7-487c-bd4b-12ba549479f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431128, 'reachable_time': 24828, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217355, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.679 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e17764e6-4424-41fa-a10a-2abd75d68634]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:7499'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431128, 'tstamp': 431128}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217356, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.694 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c802df2d-7e77-4fa7-a289-1770b17d7a6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431128, 'reachable_time': 24828, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217357, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.721 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc4da68-8d46-42a9-936c-6910a937da60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.783 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[65c5e4d6-112c-4461-86e5-a1bb4fd7ec66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.784 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.785 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.785 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3f966e1-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:10 compute-0 NetworkManager[55036]: <info>  [1763797570.7890] manager: (tapc3f966e1-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Nov 22 07:46:10 compute-0 kernel: tapc3f966e1-80: entered promiscuous mode
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.791 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.799 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc3f966e1-80, col_values=(('external_ids', {'iface-id': '8206cb6d-dd78-493d-a276-fccb0eeecc7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.807 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:46:10 compute-0 ovn_controller[94843]: 2025-11-22T07:46:10Z|00106|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.809 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d8dce9b3-ac7c-46a7-accd-62af7b93f6a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.810 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:46:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:10.812 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'env', 'PROCESS_TAG=haproxy-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.810 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.821 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.980 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797570.979774, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:10 compute-0 nova_compute[186544]: 2025-11-22 07:46:10.981 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Started (Lifecycle Event)
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.009 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.014 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797570.9799001, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.014 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Paused (Lifecycle Event)
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.030 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.034 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.054 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.142 186548 DEBUG nova.compute.manager [req-a91b6b7b-98ce-4b3c-b849-cbd20643ef07 req-67230207-65b7-442c-b10b-5442023e5623 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.143 186548 DEBUG oslo_concurrency.lockutils [req-a91b6b7b-98ce-4b3c-b849-cbd20643ef07 req-67230207-65b7-442c-b10b-5442023e5623 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.143 186548 DEBUG oslo_concurrency.lockutils [req-a91b6b7b-98ce-4b3c-b849-cbd20643ef07 req-67230207-65b7-442c-b10b-5442023e5623 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.144 186548 DEBUG oslo_concurrency.lockutils [req-a91b6b7b-98ce-4b3c-b849-cbd20643ef07 req-67230207-65b7-442c-b10b-5442023e5623 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.144 186548 DEBUG nova.compute.manager [req-a91b6b7b-98ce-4b3c-b849-cbd20643ef07 req-67230207-65b7-442c-b10b-5442023e5623 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Processing event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.144 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.148 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797571.1484435, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.149 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Resumed (Lifecycle Event)
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.150 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.154 186548 INFO nova.virt.libvirt.driver [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Instance spawned successfully.
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.154 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.176 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797556.175225, 2f7c9a98-a5d7-429a-afdd-d0e698e82707 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.177 186548 INFO nova.compute.manager [-] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] VM Stopped (Lifecycle Event)
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.186 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.190 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.191 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.192 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.192 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.193 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.193 186548 DEBUG nova.virt.libvirt.driver [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.199 186548 DEBUG nova.compute.manager [None req-14754e8a-dc52-4c24-9e1b-8b64caee70b1 - - - - - -] [instance: 2f7c9a98-a5d7-429a-afdd-d0e698e82707] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.203 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:46:11 compute-0 podman[217396]: 2025-11-22 07:46:11.204405218 +0000 UTC m=+0.054932664 container create 794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 07:46:11 compute-0 systemd[1]: Started libpod-conmon-794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a.scope.
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.249 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:46:11 compute-0 podman[217396]: 2025-11-22 07:46:11.173102672 +0000 UTC m=+0.023630138 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:46:11 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:46:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f3f74c841b5aa21792978ebe6f60ea4b79d03cbec9b8a277a08bcd73fee7c64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:46:11 compute-0 podman[217396]: 2025-11-22 07:46:11.296477327 +0000 UTC m=+0.147004793 container init 794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 07:46:11 compute-0 podman[217396]: 2025-11-22 07:46:11.302848413 +0000 UTC m=+0.153375859 container start 794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.304 186548 INFO nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Took 5.18 seconds to spawn the instance on the hypervisor.
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.305 186548 DEBUG nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:11 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217412]: [NOTICE]   (217416) : New worker (217418) forked
Nov 22 07:46:11 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217412]: [NOTICE]   (217416) : Loading success.
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.403 186548 INFO nova.compute.manager [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Took 5.79 seconds to build instance.
Nov 22 07:46:11 compute-0 nova_compute[186544]: 2025-11-22 07:46:11.423 186548 DEBUG oslo_concurrency.lockutils [None req-20b07301-2260-4086-963f-03df008ea1ba 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.934s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:12 compute-0 nova_compute[186544]: 2025-11-22 07:46:12.154 186548 DEBUG nova.network.neutron [req-bdde01bd-1631-473b-b52d-c7aad4e4f5b8 req-e09a75e8-19f6-464f-9bae-07ae6be9dbf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updated VIF entry in instance network info cache for port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:46:12 compute-0 nova_compute[186544]: 2025-11-22 07:46:12.155 186548 DEBUG nova.network.neutron [req-bdde01bd-1631-473b-b52d-c7aad4e4f5b8 req-e09a75e8-19f6-464f-9bae-07ae6be9dbf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:12 compute-0 nova_compute[186544]: 2025-11-22 07:46:12.171 186548 DEBUG oslo_concurrency.lockutils [req-bdde01bd-1631-473b-b52d-c7aad4e4f5b8 req-e09a75e8-19f6-464f-9bae-07ae6be9dbf3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:12 compute-0 nova_compute[186544]: 2025-11-22 07:46:12.453 186548 DEBUG nova.network.neutron [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Successfully updated port: 775bdf61-ad38-473b-9d53-c3195f2be472 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:46:12 compute-0 nova_compute[186544]: 2025-11-22 07:46:12.466 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:12 compute-0 nova_compute[186544]: 2025-11-22 07:46:12.467 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquired lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:12 compute-0 nova_compute[186544]: 2025-11-22 07:46:12.467 186548 DEBUG nova.network.neutron [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:46:12 compute-0 nova_compute[186544]: 2025-11-22 07:46:12.692 186548 DEBUG nova.network.neutron [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:46:14 compute-0 podman[217427]: 2025-11-22 07:46:14.438054234 +0000 UTC m=+0.083992544 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.453 186548 DEBUG nova.compute.manager [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.454 186548 DEBUG oslo_concurrency.lockutils [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.454 186548 DEBUG oslo_concurrency.lockutils [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.454 186548 DEBUG oslo_concurrency.lockutils [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.454 186548 DEBUG nova.compute.manager [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.454 186548 WARNING nova.compute.manager [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state None.
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.455 186548 DEBUG nova.compute.manager [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-changed-775bdf61-ad38-473b-9d53-c3195f2be472 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.455 186548 DEBUG nova.compute.manager [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Refreshing instance network info cache due to event network-changed-775bdf61-ad38-473b-9d53-c3195f2be472. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.455 186548 DEBUG oslo_concurrency.lockutils [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.608 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.895 186548 DEBUG nova.network.neutron [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updating instance_info_cache with network_info: [{"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.918 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Releasing lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.918 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Instance network_info: |[{"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.919 186548 DEBUG oslo_concurrency.lockutils [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.919 186548 DEBUG nova.network.neutron [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Refreshing network info cache for port 775bdf61-ad38-473b-9d53-c3195f2be472 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.921 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Start _get_guest_xml network_info=[{"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.925 186548 WARNING nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.934 186548 DEBUG nova.virt.libvirt.host [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.935 186548 DEBUG nova.virt.libvirt.host [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.938 186548 DEBUG nova.virt.libvirt.host [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.939 186548 DEBUG nova.virt.libvirt.host [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.940 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.940 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.941 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.941 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.941 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.941 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.942 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.942 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.942 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.942 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.943 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.943 186548 DEBUG nova.virt.hardware [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.946 186548 DEBUG nova.virt.libvirt.vif [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2031422221',display_name='tempest-FloatingIPsAssociationTestJSON-server-2031422221',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2031422221',id=29,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef27a8ab9a794f7782ac89b9c28c893a',ramdisk_id='',reservation_id='r-41g0d0jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1465053098',owner_user_
name='tempest-FloatingIPsAssociationTestJSON-1465053098-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:08Z,user_data=None,user_id='65ded9a5f9a7463d8c52561197054664',uuid=67b32f80-a795-43b0-bcb0-aeda0d9506d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.946 186548 DEBUG nova.network.os_vif_util [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converting VIF {"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.947 186548 DEBUG nova.network.os_vif_util [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:d2:78,bridge_name='br-int',has_traffic_filtering=True,id=775bdf61-ad38-473b-9d53-c3195f2be472,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap775bdf61-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.948 186548 DEBUG nova.objects.instance [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lazy-loading 'pci_devices' on Instance uuid 67b32f80-a795-43b0-bcb0-aeda0d9506d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.966 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <uuid>67b32f80-a795-43b0-bcb0-aeda0d9506d4</uuid>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <name>instance-0000001d</name>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-2031422221</nova:name>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:46:14</nova:creationTime>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:46:14 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:46:14 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:46:14 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:46:14 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:46:14 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:46:14 compute-0 nova_compute[186544]:         <nova:user uuid="65ded9a5f9a7463d8c52561197054664">tempest-FloatingIPsAssociationTestJSON-1465053098-project-member</nova:user>
Nov 22 07:46:14 compute-0 nova_compute[186544]:         <nova:project uuid="ef27a8ab9a794f7782ac89b9c28c893a">tempest-FloatingIPsAssociationTestJSON-1465053098</nova:project>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:46:14 compute-0 nova_compute[186544]:         <nova:port uuid="775bdf61-ad38-473b-9d53-c3195f2be472">
Nov 22 07:46:14 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <system>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <entry name="serial">67b32f80-a795-43b0-bcb0-aeda0d9506d4</entry>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <entry name="uuid">67b32f80-a795-43b0-bcb0-aeda0d9506d4</entry>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </system>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <os>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   </os>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <features>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   </features>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.config"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:ec:d2:78"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <target dev="tap775bdf61-ad"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/console.log" append="off"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <video>
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </video>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:46:14 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:46:14 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:46:14 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:46:14 compute-0 nova_compute[186544]: </domain>
Nov 22 07:46:14 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.967 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Preparing to wait for external event network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.967 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.968 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.968 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.969 186548 DEBUG nova.virt.libvirt.vif [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2031422221',display_name='tempest-FloatingIPsAssociationTestJSON-server-2031422221',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2031422221',id=29,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ef27a8ab9a794f7782ac89b9c28c893a',ramdisk_id='',reservation_id='r-41g0d0jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1465053098',o
wner_user_name='tempest-FloatingIPsAssociationTestJSON-1465053098-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:08Z,user_data=None,user_id='65ded9a5f9a7463d8c52561197054664',uuid=67b32f80-a795-43b0-bcb0-aeda0d9506d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.969 186548 DEBUG nova.network.os_vif_util [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converting VIF {"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.970 186548 DEBUG nova.network.os_vif_util [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:d2:78,bridge_name='br-int',has_traffic_filtering=True,id=775bdf61-ad38-473b-9d53-c3195f2be472,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap775bdf61-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.970 186548 DEBUG os_vif [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:d2:78,bridge_name='br-int',has_traffic_filtering=True,id=775bdf61-ad38-473b-9d53-c3195f2be472,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap775bdf61-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.971 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.971 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.973 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.974 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap775bdf61-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.974 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap775bdf61-ad, col_values=(('external_ids', {'iface-id': '775bdf61-ad38-473b-9d53-c3195f2be472', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:d2:78', 'vm-uuid': '67b32f80-a795-43b0-bcb0-aeda0d9506d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.976 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:14 compute-0 NetworkManager[55036]: <info>  [1763797574.9784] manager: (tap775bdf61-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.978 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.985 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:14 compute-0 nova_compute[186544]: 2025-11-22 07:46:14.986 186548 INFO os_vif [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:d2:78,bridge_name='br-int',has_traffic_filtering=True,id=775bdf61-ad38-473b-9d53-c3195f2be472,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap775bdf61-ad')
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.050 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.050 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.050 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] No VIF found with MAC fa:16:3e:ec:d2:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.051 186548 INFO nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Using config drive
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.165 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.659 186548 INFO nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Creating config drive at /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.config
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.664 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk9428_tc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.793 186548 DEBUG oslo_concurrency.processutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk9428_tc" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:15 compute-0 kernel: tap775bdf61-ad: entered promiscuous mode
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.857 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:15 compute-0 ovn_controller[94843]: 2025-11-22T07:46:15Z|00107|binding|INFO|Claiming lport 775bdf61-ad38-473b-9d53-c3195f2be472 for this chassis.
Nov 22 07:46:15 compute-0 ovn_controller[94843]: 2025-11-22T07:46:15Z|00108|binding|INFO|775bdf61-ad38-473b-9d53-c3195f2be472: Claiming fa:16:3e:ec:d2:78 10.100.0.3
Nov 22 07:46:15 compute-0 NetworkManager[55036]: <info>  [1763797575.8591] manager: (tap775bdf61-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.876 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:d2:78 10.100.0.3'], port_security=['fa:16:3e:ec:d2:78 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ba545ee-7ef7-4120-9b36-dfb927d132f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=750112b4-7c3d-47fc-a624-7726e73fdc53, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=775bdf61-ad38-473b-9d53-c3195f2be472) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.877 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 775bdf61-ad38-473b-9d53-c3195f2be472 in datapath 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 bound to our chassis
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.878 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.888 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9839bf-5541-4404-987a-64a03dcaf6f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.889 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9dfbfc3c-a1 in ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.890 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9dfbfc3c-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.890 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5e6858-ea8a-4b39-ae0d-e2794fabad4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:15 compute-0 systemd-machined[152872]: New machine qemu-15-instance-0000001d.
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.891 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a86cf516-6886-4992-9e64-2c9714cb0d94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.902 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f092d0-60aa-4f3c-8538-38c45076382a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.914 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:15 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-0000001d.
Nov 22 07:46:15 compute-0 ovn_controller[94843]: 2025-11-22T07:46:15Z|00109|binding|INFO|Setting lport 775bdf61-ad38-473b-9d53-c3195f2be472 ovn-installed in OVS
Nov 22 07:46:15 compute-0 ovn_controller[94843]: 2025-11-22T07:46:15Z|00110|binding|INFO|Setting lport 775bdf61-ad38-473b-9d53-c3195f2be472 up in Southbound
Nov 22 07:46:15 compute-0 nova_compute[186544]: 2025-11-22 07:46:15.919 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.923 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fd33bdb4-029f-4f3f-93b2-f57ed4cd8d2e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:15 compute-0 systemd-udevd[217472]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:46:15 compute-0 NetworkManager[55036]: <info>  [1763797575.9435] device (tap775bdf61-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:46:15 compute-0 NetworkManager[55036]: <info>  [1763797575.9445] device (tap775bdf61-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.950 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[062e505e-3cdd-44dc-850d-19c5c85e6bbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:15 compute-0 NetworkManager[55036]: <info>  [1763797575.9569] manager: (tap9dfbfc3c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.957 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[29a4bdf1-cce4-4838-8256-c3bf917bc9a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.991 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4c92b70a-ff7c-4680-9c15-81959c209c79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:15.993 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a8246d-4587-486e-b503-a14380400ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:16 compute-0 NetworkManager[55036]: <info>  [1763797576.0149] device (tap9dfbfc3c-a0): carrier: link connected
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.019 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[10023fb2-09aa-41f1-978c-a6a0b91a5bfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.034 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c8631535-277b-4c53-8c1b-dc32a13e21de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9dfbfc3c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:03:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431665, 'reachable_time': 42602, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217501, 'error': None, 'target': 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.048 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8a61e1cc-c2af-459c-97c4-6a083f7e7319]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:374'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431665, 'tstamp': 431665}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217502, 'error': None, 'target': 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.063 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4984fa44-d609-4458-8223-4436c76e03d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9dfbfc3c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:03:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431665, 'reachable_time': 42602, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217503, 'error': None, 'target': 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.098 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb48a24-cae9-4c17-a0fe-e1a0b389b714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.165 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[86ba919b-3184-4587-abae-fb4a22fdb9c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.168 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dfbfc3c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.169 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.169 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9dfbfc3c-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:16 compute-0 NetworkManager[55036]: <info>  [1763797576.1719] manager: (tap9dfbfc3c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 22 07:46:16 compute-0 kernel: tap9dfbfc3c-a0: entered promiscuous mode
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.171 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.177 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9dfbfc3c-a0, col_values=(('external_ids', {'iface-id': 'd4b08431-3a8d-4e48-ba0b-792923071bed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:16 compute-0 ovn_controller[94843]: 2025-11-22T07:46:16Z|00111|binding|INFO|Releasing lport d4b08431-3a8d-4e48-ba0b-792923071bed from this chassis (sb_readonly=0)
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.179 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.180 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.183 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.191 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f5503b65-4698-4cc7-9a24-b08caae7dab6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.194 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202.pid.haproxy
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:16.195 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'env', 'PROCESS_TAG=haproxy-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.224 186548 DEBUG nova.compute.manager [req-c3399d0d-82c2-4720-b869-fa3ae070b1ea req-609c093f-2f9f-43ca-85d0-fae2c8a3107b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.225 186548 DEBUG oslo_concurrency.lockutils [req-c3399d0d-82c2-4720-b869-fa3ae070b1ea req-609c093f-2f9f-43ca-85d0-fae2c8a3107b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.225 186548 DEBUG oslo_concurrency.lockutils [req-c3399d0d-82c2-4720-b869-fa3ae070b1ea req-609c093f-2f9f-43ca-85d0-fae2c8a3107b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.225 186548 DEBUG oslo_concurrency.lockutils [req-c3399d0d-82c2-4720-b869-fa3ae070b1ea req-609c093f-2f9f-43ca-85d0-fae2c8a3107b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.225 186548 DEBUG nova.compute.manager [req-c3399d0d-82c2-4720-b869-fa3ae070b1ea req-609c093f-2f9f-43ca-85d0-fae2c8a3107b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Processing event network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.337 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Check if temp file /var/lib/nova/instances/tmp8ztptjqj exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.342 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.394 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.395 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.455 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.457 186548 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ztptjqj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.543 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797576.543572, 67b32f80-a795-43b0-bcb0-aeda0d9506d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.544 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] VM Started (Lifecycle Event)
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.546 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.549 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.557 186548 INFO nova.virt.libvirt.driver [-] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Instance spawned successfully.
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.558 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.575 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.579 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.587 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.588 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.588 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.588 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.589 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.589 186548 DEBUG nova.virt.libvirt.driver [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:46:16 compute-0 podman[217546]: 2025-11-22 07:46:16.604611642 +0000 UTC m=+0.080476228 container create 685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.623 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.623 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797576.545664, 67b32f80-a795-43b0-bcb0-aeda0d9506d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.624 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] VM Paused (Lifecycle Event)
Nov 22 07:46:16 compute-0 systemd[1]: Started libpod-conmon-685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318.scope.
Nov 22 07:46:16 compute-0 podman[217546]: 2025-11-22 07:46:16.555319577 +0000 UTC m=+0.031184183 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.658 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.660 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797576.5551715, 67b32f80-a795-43b0-bcb0-aeda0d9506d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.660 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] VM Resumed (Lifecycle Event)
Nov 22 07:46:16 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:46:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa35a465e8338ede6406b212e8b8c9b5f09193d8fdb9a01ff9704ac61004a2d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:46:16 compute-0 podman[217546]: 2025-11-22 07:46:16.687734384 +0000 UTC m=+0.163598990 container init 685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.689 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:16 compute-0 podman[217546]: 2025-11-22 07:46:16.693565736 +0000 UTC m=+0.169430322 container start 685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.694 186548 INFO nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Took 7.99 seconds to spawn the instance on the hypervisor.
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.694 186548 DEBUG nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.695 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:46:16 compute-0 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217563]: [NOTICE]   (217567) : New worker (217569) forked
Nov 22 07:46:16 compute-0 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217563]: [NOTICE]   (217567) : Loading success.
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.730 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.772 186548 INFO nova.compute.manager [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Took 8.59 seconds to build instance.
Nov 22 07:46:16 compute-0 nova_compute[186544]: 2025-11-22 07:46:16.789 186548 DEBUG oslo_concurrency.lockutils [None req-7a496c86-0b38-409b-a231-53f04974ee2a 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:17 compute-0 nova_compute[186544]: 2025-11-22 07:46:17.492 186548 DEBUG nova.network.neutron [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updated VIF entry in instance network info cache for port 775bdf61-ad38-473b-9d53-c3195f2be472. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:46:17 compute-0 nova_compute[186544]: 2025-11-22 07:46:17.492 186548 DEBUG nova.network.neutron [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updating instance_info_cache with network_info: [{"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:17 compute-0 nova_compute[186544]: 2025-11-22 07:46:17.522 186548 DEBUG oslo_concurrency.lockutils [req-db906a8a-dad9-46be-b075-3af4717da752 req-86bfd5fd-7a66-47a3-9a2b-349636650178 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.019 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.125 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.126 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.195 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.401 186548 DEBUG nova.compute.manager [req-8ac24772-611d-4d89-ae08-bf189c36e1d7 req-3c94efb3-ff3c-4d5b-9b3d-1e473a4b3214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.401 186548 DEBUG oslo_concurrency.lockutils [req-8ac24772-611d-4d89-ae08-bf189c36e1d7 req-3c94efb3-ff3c-4d5b-9b3d-1e473a4b3214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.401 186548 DEBUG oslo_concurrency.lockutils [req-8ac24772-611d-4d89-ae08-bf189c36e1d7 req-3c94efb3-ff3c-4d5b-9b3d-1e473a4b3214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.402 186548 DEBUG oslo_concurrency.lockutils [req-8ac24772-611d-4d89-ae08-bf189c36e1d7 req-3c94efb3-ff3c-4d5b-9b3d-1e473a4b3214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.402 186548 DEBUG nova.compute.manager [req-8ac24772-611d-4d89-ae08-bf189c36e1d7 req-3c94efb3-ff3c-4d5b-9b3d-1e473a4b3214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] No waiting events found dispatching network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:18 compute-0 nova_compute[186544]: 2025-11-22 07:46:18.402 186548 WARNING nova.compute.manager [req-8ac24772-611d-4d89-ae08-bf189c36e1d7 req-3c94efb3-ff3c-4d5b-9b3d-1e473a4b3214 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received unexpected event network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 for instance with vm_state active and task_state None.
Nov 22 07:46:19 compute-0 nova_compute[186544]: 2025-11-22 07:46:19.977 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.165 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.165 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.166 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.166 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.166 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.170 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.178 186548 INFO nova.compute.manager [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Terminating instance
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.184 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.184 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquired lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.184 186548 DEBUG nova.network.neutron [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.520 186548 DEBUG nova.network.neutron [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.937 186548 DEBUG nova.network.neutron [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.951 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Releasing lock "refresh_cache-5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:20 compute-0 nova_compute[186544]: 2025-11-22 07:46:20.952 186548 DEBUG nova.compute.manager [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:46:20 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000011.scope: Deactivated successfully.
Nov 22 07:46:20 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000011.scope: Consumed 17.924s CPU time.
Nov 22 07:46:20 compute-0 systemd-machined[152872]: Machine qemu-9-instance-00000011 terminated.
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.188 186548 INFO nova.virt.libvirt.driver [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance destroyed successfully.
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.188 186548 DEBUG nova.objects.instance [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lazy-loading 'resources' on Instance uuid 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.200 186548 INFO nova.virt.libvirt.driver [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Deleting instance files /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_del
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.205 186548 INFO nova.virt.libvirt.driver [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Deletion of /var/lib/nova/instances/5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67_del complete
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.281 186548 INFO nova.compute.manager [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.281 186548 DEBUG oslo.service.loopingcall [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.282 186548 DEBUG nova.compute.manager [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.282 186548 DEBUG nova.network.neutron [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.450 186548 DEBUG nova.network.neutron [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.463 186548 DEBUG nova.network.neutron [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.477 186548 INFO nova.compute.manager [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Took 0.20 seconds to deallocate network for instance.
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.575 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.576 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.676 186548 DEBUG nova.compute.provider_tree [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.689 186548 DEBUG nova.scheduler.client.report [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.711 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.749 186548 INFO nova.scheduler.client.report [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Deleted allocations for instance 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67
Nov 22 07:46:21 compute-0 sshd-session[217594]: Accepted publickey for nova from 192.168.122.102 port 44772 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:46:21 compute-0 systemd-logind[821]: New session 36 of user nova.
Nov 22 07:46:21 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 07:46:21 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 07:46:21 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 07:46:21 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 07:46:21 compute-0 nova_compute[186544]: 2025-11-22 07:46:21.828 186548 DEBUG oslo_concurrency.lockutils [None req-e770a100-1c40-47f9-8444-21399887646c 5ea417ea62e2404d8cb5b9e767e8c5c4 070aaece3c3c4232877d26c34023c56d - - default default] Lock "5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:21 compute-0 systemd[217598]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:46:21 compute-0 systemd[217598]: Queued start job for default target Main User Target.
Nov 22 07:46:21 compute-0 systemd[217598]: Created slice User Application Slice.
Nov 22 07:46:21 compute-0 systemd[217598]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:46:21 compute-0 systemd[217598]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 07:46:21 compute-0 systemd[217598]: Reached target Paths.
Nov 22 07:46:21 compute-0 systemd[217598]: Reached target Timers.
Nov 22 07:46:21 compute-0 systemd[217598]: Starting D-Bus User Message Bus Socket...
Nov 22 07:46:21 compute-0 systemd[217598]: Starting Create User's Volatile Files and Directories...
Nov 22 07:46:22 compute-0 systemd[217598]: Listening on D-Bus User Message Bus Socket.
Nov 22 07:46:22 compute-0 systemd[217598]: Reached target Sockets.
Nov 22 07:46:22 compute-0 systemd[217598]: Finished Create User's Volatile Files and Directories.
Nov 22 07:46:22 compute-0 systemd[217598]: Reached target Basic System.
Nov 22 07:46:22 compute-0 systemd[217598]: Reached target Main User Target.
Nov 22 07:46:22 compute-0 systemd[217598]: Startup finished in 155ms.
Nov 22 07:46:22 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 07:46:22 compute-0 systemd[1]: Started Session 36 of User nova.
Nov 22 07:46:22 compute-0 NetworkManager[55036]: <info>  [1763797582.0265] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/57)
Nov 22 07:46:22 compute-0 sshd-session[217594]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:46:22 compute-0 NetworkManager[55036]: <info>  [1763797582.0272] device (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:46:22 compute-0 NetworkManager[55036]: <info>  [1763797582.0281] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/58)
Nov 22 07:46:22 compute-0 NetworkManager[55036]: <info>  [1763797582.0284] device (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 22 07:46:22 compute-0 NetworkManager[55036]: <info>  [1763797582.0291] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 22 07:46:22 compute-0 NetworkManager[55036]: <info>  [1763797582.0296] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Nov 22 07:46:22 compute-0 NetworkManager[55036]: <info>  [1763797582.0300] device (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 07:46:22 compute-0 NetworkManager[55036]: <info>  [1763797582.0302] device (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 22 07:46:22 compute-0 nova_compute[186544]: 2025-11-22 07:46:22.029 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:22 compute-0 nova_compute[186544]: 2025-11-22 07:46:22.132 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:22 compute-0 sshd-session[217613]: Received disconnect from 192.168.122.102 port 44772:11: disconnected by user
Nov 22 07:46:22 compute-0 sshd-session[217613]: Disconnected from user nova 192.168.122.102 port 44772
Nov 22 07:46:22 compute-0 ovn_controller[94843]: 2025-11-22T07:46:22Z|00112|binding|INFO|Releasing lport d4b08431-3a8d-4e48-ba0b-792923071bed from this chassis (sb_readonly=0)
Nov 22 07:46:22 compute-0 ovn_controller[94843]: 2025-11-22T07:46:22Z|00113|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 07:46:22 compute-0 sshd-session[217594]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:46:22 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Nov 22 07:46:22 compute-0 systemd-logind[821]: Session 36 logged out. Waiting for processes to exit.
Nov 22 07:46:22 compute-0 systemd-logind[821]: Removed session 36.
Nov 22 07:46:22 compute-0 nova_compute[186544]: 2025-11-22 07:46:22.154 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:24 compute-0 ovn_controller[94843]: 2025-11-22T07:46:24Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:02:9b:98 10.100.0.14
Nov 22 07:46:24 compute-0 ovn_controller[94843]: 2025-11-22T07:46:24Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:9b:98 10.100.0.14
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.980 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.987 186548 DEBUG nova.compute.manager [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.988 186548 DEBUG oslo_concurrency.lockutils [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.988 186548 DEBUG oslo_concurrency.lockutils [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.988 186548 DEBUG oslo_concurrency.lockutils [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.989 186548 DEBUG nova.compute.manager [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.989 186548 DEBUG nova.compute.manager [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.989 186548 DEBUG nova.compute.manager [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.989 186548 DEBUG oslo_concurrency.lockutils [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.990 186548 DEBUG oslo_concurrency.lockutils [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.990 186548 DEBUG oslo_concurrency.lockutils [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.990 186548 DEBUG nova.compute.manager [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.990 186548 WARNING nova.compute.manager [req-87a18731-e8a6-41a1-bc4d-6ce9fab06736 req-16aef587-f06a-4898-bf9d-4462fcdda3f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.992 186548 INFO nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Took 6.80 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Nov 22 07:46:24 compute-0 nova_compute[186544]: 2025-11-22 07:46:24.993 186548 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.010 186548 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8ztptjqj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(5d25fe7b-81d5-4260-b852-0444a04944ca),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.045 186548 DEBUG nova.objects.instance [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'migration_context' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.180 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.181 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.183 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.183 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.198 186548 DEBUG nova.virt.libvirt.vif [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:46:11Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.198 186548 DEBUG nova.network.os_vif_util [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.199 186548 DEBUG nova.network.os_vif_util [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.199 186548 DEBUG nova.virt.libvirt.migration [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating guest XML with vif config: <interface type="ethernet">
Nov 22 07:46:25 compute-0 nova_compute[186544]:   <mac address="fa:16:3e:02:9b:98"/>
Nov 22 07:46:25 compute-0 nova_compute[186544]:   <model type="virtio"/>
Nov 22 07:46:25 compute-0 nova_compute[186544]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:46:25 compute-0 nova_compute[186544]:   <mtu size="1442"/>
Nov 22 07:46:25 compute-0 nova_compute[186544]:   <target dev="tap973cdfc2-4a"/>
Nov 22 07:46:25 compute-0 nova_compute[186544]: </interface>
Nov 22 07:46:25 compute-0 nova_compute[186544]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.200 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 22 07:46:25 compute-0 podman[217629]: 2025-11-22 07:46:25.43230899 +0000 UTC m=+0.069376216 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:46:25 compute-0 podman[217630]: 2025-11-22 07:46:25.451210393 +0000 UTC m=+0.088487984 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.685 186548 DEBUG nova.virt.libvirt.migration [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.686 186548 INFO nova.virt.libvirt.migration [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 22 07:46:25 compute-0 nova_compute[186544]: 2025-11-22 07:46:25.785 186548 INFO nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 22 07:46:26 compute-0 nova_compute[186544]: 2025-11-22 07:46:26.290 186548 DEBUG nova.virt.libvirt.migration [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:46:26 compute-0 nova_compute[186544]: 2025-11-22 07:46:26.290 186548 DEBUG nova.virt.libvirt.migration [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 07:46:26 compute-0 nova_compute[186544]: 2025-11-22 07:46:26.793 186548 DEBUG nova.virt.libvirt.migration [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 22 07:46:26 compute-0 nova_compute[186544]: 2025-11-22 07:46:26.794 186548 DEBUG nova.virt.libvirt.migration [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 22 07:46:26 compute-0 nova_compute[186544]: 2025-11-22 07:46:26.990 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797586.989861, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:26 compute-0 nova_compute[186544]: 2025-11-22 07:46:26.990 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Paused (Lifecycle Event)
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.006 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.010 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.031 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.122 186548 DEBUG nova.compute.manager [req-0f2e190e-a463-4a2b-9002-c5ca116b9c58 req-9729dcd6-0dc1-4119-bc48-436249ee2f45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-changed-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.123 186548 DEBUG nova.compute.manager [req-0f2e190e-a463-4a2b-9002-c5ca116b9c58 req-9729dcd6-0dc1-4119-bc48-436249ee2f45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Refreshing instance network info cache due to event network-changed-973cdfc2-4ad8-4f41-b383-4b64b1b5433f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.123 186548 DEBUG oslo_concurrency.lockutils [req-0f2e190e-a463-4a2b-9002-c5ca116b9c58 req-9729dcd6-0dc1-4119-bc48-436249ee2f45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.124 186548 DEBUG oslo_concurrency.lockutils [req-0f2e190e-a463-4a2b-9002-c5ca116b9c58 req-9729dcd6-0dc1-4119-bc48-436249ee2f45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.124 186548 DEBUG nova.network.neutron [req-0f2e190e-a463-4a2b-9002-c5ca116b9c58 req-9729dcd6-0dc1-4119-bc48-436249ee2f45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Refreshing network info cache for port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:46:27 compute-0 kernel: tap973cdfc2-4a (unregistering): left promiscuous mode
Nov 22 07:46:27 compute-0 NetworkManager[55036]: <info>  [1763797587.1360] device (tap973cdfc2-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:46:27 compute-0 ovn_controller[94843]: 2025-11-22T07:46:27Z|00114|binding|INFO|Releasing lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f from this chassis (sb_readonly=0)
Nov 22 07:46:27 compute-0 ovn_controller[94843]: 2025-11-22T07:46:27Z|00115|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f down in Southbound
Nov 22 07:46:27 compute-0 ovn_controller[94843]: 2025-11-22T07:46:27Z|00116|binding|INFO|Removing iface tap973cdfc2-4a ovn-installed in OVS
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.157 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.174 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:27 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Nov 22 07:46:27 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001c.scope: Consumed 13.627s CPU time.
Nov 22 07:46:27 compute-0 systemd-machined[152872]: Machine qemu-14-instance-0000001c terminated.
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.304 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:9b:98 10.100.0.14'], port_security=['fa:16:3e:02:9b:98 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '4984e16e-8f1c-4426-bfc6-5927f375ce79'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=973cdfc2-4ad8-4f41-b383-4b64b1b5433f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.306 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 unbound from our chassis
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.308 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.309 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[baed76c2-934d-47e9-95c8-03476b226b6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.309 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 namespace which is not needed anymore
Nov 22 07:46:27 compute-0 ovn_controller[94843]: 2025-11-22T07:46:27Z|00117|binding|INFO|Releasing lport d4b08431-3a8d-4e48-ba0b-792923071bed from this chassis (sb_readonly=0)
Nov 22 07:46:27 compute-0 ovn_controller[94843]: 2025-11-22T07:46:27Z|00118|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.335 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.342 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.387 186548 DEBUG nova.virt.libvirt.guest [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.389 186548 INFO nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration operation has completed
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.389 186548 INFO nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] _post_live_migration() is started..
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.391 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.391 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.392 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 22 07:46:27 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217412]: [NOTICE]   (217416) : haproxy version is 2.8.14-c23fe91
Nov 22 07:46:27 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217412]: [NOTICE]   (217416) : path to executable is /usr/sbin/haproxy
Nov 22 07:46:27 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217412]: [WARNING]  (217416) : Exiting Master process...
Nov 22 07:46:27 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217412]: [ALERT]    (217416) : Current worker (217418) exited with code 143 (Terminated)
Nov 22 07:46:27 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[217412]: [WARNING]  (217416) : All workers exited. Exiting... (0)
Nov 22 07:46:27 compute-0 systemd[1]: libpod-794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a.scope: Deactivated successfully.
Nov 22 07:46:27 compute-0 podman[217715]: 2025-11-22 07:46:27.470366258 +0000 UTC m=+0.050940605 container died 794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:46:27 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a-userdata-shm.mount: Deactivated successfully.
Nov 22 07:46:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-3f3f74c841b5aa21792978ebe6f60ea4b79d03cbec9b8a277a08bcd73fee7c64-merged.mount: Deactivated successfully.
Nov 22 07:46:27 compute-0 podman[217715]: 2025-11-22 07:46:27.506172694 +0000 UTC m=+0.086747041 container cleanup 794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:46:27 compute-0 systemd[1]: libpod-conmon-794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a.scope: Deactivated successfully.
Nov 22 07:46:27 compute-0 podman[217743]: 2025-11-22 07:46:27.575828365 +0000 UTC m=+0.050819872 container remove 794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.581 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f588a4c9-57b4-47de-8260-2dec3cfd6f4c]: (4, ('Sat Nov 22 07:46:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 (794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a)\n794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a\nSat Nov 22 07:46:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 (794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a)\n794bde83b58c7954c64e099d59b98755fa05e337465ad02cb44428d08d8ff49a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.583 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[97f8b8e9-66a5-41be-b754-c973df17c52a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.584 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.626 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:27 compute-0 kernel: tapc3f966e1-80: left promiscuous mode
Nov 22 07:46:27 compute-0 nova_compute[186544]: 2025-11-22 07:46:27.643 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.646 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[92ac442b-ac84-403c-b007-5c6bc2f883a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.661 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e420f44e-30fa-40f3-b73a-ad3394a030d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.662 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[32445ce2-c266-48cd-aad0-27dc4c72336b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.676 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c4fbec54-d485-414a-817e-cf0c8bb1e7c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431122, 'reachable_time': 44089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217761, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:27 compute-0 systemd[1]: run-netns-ovnmeta\x2dc3f966e1\x2d8cff\x2d4ca0\x2d9b4f\x2da318c31b0a70.mount: Deactivated successfully.
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.680 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:46:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:27.680 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[1bedfeb5-c78a-4d71-9a08-f732529bbecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:28 compute-0 podman[217764]: 2025-11-22 07:46:28.402610381 +0000 UTC m=+0.050123056 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:46:28 compute-0 nova_compute[186544]: 2025-11-22 07:46:28.779 186548 DEBUG nova.network.neutron [req-0f2e190e-a463-4a2b-9002-c5ca116b9c58 req-9729dcd6-0dc1-4119-bc48-436249ee2f45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updated VIF entry in instance network info cache for port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:46:28 compute-0 nova_compute[186544]: 2025-11-22 07:46:28.780 186548 DEBUG nova.network.neutron [req-0f2e190e-a463-4a2b-9002-c5ca116b9c58 req-9729dcd6-0dc1-4119-bc48-436249ee2f45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:28 compute-0 nova_compute[186544]: 2025-11-22 07:46:28.798 186548 DEBUG oslo_concurrency.lockutils [req-0f2e190e-a463-4a2b-9002-c5ca116b9c58 req-9729dcd6-0dc1-4119-bc48-436249ee2f45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.447 186548 DEBUG nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.448 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.448 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.448 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.449 186548 DEBUG nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.449 186548 DEBUG nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.449 186548 DEBUG nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.449 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.450 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.450 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.450 186548 DEBUG nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.450 186548 WARNING nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.450 186548 DEBUG nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.451 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.451 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.451 186548 DEBUG oslo_concurrency.lockutils [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.451 186548 DEBUG nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.451 186548 DEBUG nova.compute.manager [req-66592c4c-2b4b-449e-99c3-6d455aa25f89 req-39ab7efc-3ff7-49e6-8c72-09e8aad29e29 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:46:29 compute-0 nova_compute[186544]: 2025-11-22 07:46:29.983 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.029 186548 DEBUG nova.network.neutron [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Activated binding for port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.029 186548 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.030 186548 DEBUG nova.virt.libvirt.vif [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:46:15Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.030 186548 DEBUG nova.network.os_vif_util [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.031 186548 DEBUG nova.network.os_vif_util [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.031 186548 DEBUG os_vif [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.032 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.033 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap973cdfc2-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.034 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.036 186548 INFO os_vif [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a')
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.037 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.037 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.037 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.037 186548 DEBUG nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.038 186548 INFO nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Deleting instance files /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44_del
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.038 186548 INFO nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Deletion of /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44_del complete
Nov 22 07:46:30 compute-0 ovn_controller[94843]: 2025-11-22T07:46:30Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ec:d2:78 10.100.0.3
Nov 22 07:46:30 compute-0 ovn_controller[94843]: 2025-11-22T07:46:30Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ec:d2:78 10.100.0.3
Nov 22 07:46:30 compute-0 nova_compute[186544]: 2025-11-22 07:46:30.170 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.097 186548 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.097 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.098 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.098 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.098 186548 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.098 186548 WARNING nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.098 186548 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.099 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.099 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.099 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.099 186548 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.099 186548 WARNING nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.100 186548 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-changed-775bdf61-ad38-473b-9d53-c3195f2be472 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.100 186548 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Refreshing instance network info cache due to event network-changed-775bdf61-ad38-473b-9d53-c3195f2be472. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.100 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.100 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.100 186548 DEBUG nova.network.neutron [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Refreshing network info cache for port 775bdf61-ad38-473b-9d53-c3195f2be472 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.183 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.184 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.261 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.318 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.319 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:32 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 07:46:32 compute-0 systemd[217598]: Activating special unit Exit the Session...
Nov 22 07:46:32 compute-0 systemd[217598]: Stopped target Main User Target.
Nov 22 07:46:32 compute-0 systemd[217598]: Stopped target Basic System.
Nov 22 07:46:32 compute-0 systemd[217598]: Stopped target Paths.
Nov 22 07:46:32 compute-0 systemd[217598]: Stopped target Sockets.
Nov 22 07:46:32 compute-0 systemd[217598]: Stopped target Timers.
Nov 22 07:46:32 compute-0 systemd[217598]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:46:32 compute-0 systemd[217598]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 07:46:32 compute-0 systemd[217598]: Closed D-Bus User Message Bus Socket.
Nov 22 07:46:32 compute-0 systemd[217598]: Stopped Create User's Volatile Files and Directories.
Nov 22 07:46:32 compute-0 systemd[217598]: Removed slice User Application Slice.
Nov 22 07:46:32 compute-0 systemd[217598]: Reached target Shutdown.
Nov 22 07:46:32 compute-0 systemd[217598]: Finished Exit the Session.
Nov 22 07:46:32 compute-0 systemd[217598]: Reached target Exit the Session.
Nov 22 07:46:32 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 07:46:32 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 07:46:32 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 07:46:32 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 07:46:32 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 07:46:32 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 07:46:32 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.376 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.531 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.532 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5476MB free_disk=73.39605712890625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.533 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.533 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.589 186548 INFO nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating resource usage from migration 5d25fe7b-81d5-4260-b852-0444a04944ca
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.620 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 67b32f80-a795-43b0-bcb0-aeda0d9506d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.621 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Migration 5d25fe7b-81d5-4260-b852-0444a04944ca is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.621 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.621 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.692 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.707 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.730 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:46:32 compute-0 nova_compute[186544]: 2025-11-22 07:46:32.730 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:33 compute-0 podman[217818]: 2025-11-22 07:46:33.398087986 +0000 UTC m=+0.051068220 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 07:46:34 compute-0 nova_compute[186544]: 2025-11-22 07:46:34.731 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:34 compute-0 nova_compute[186544]: 2025-11-22 07:46:34.732 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:46:34 compute-0 nova_compute[186544]: 2025-11-22 07:46:34.732 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.036 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.082 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.173 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.298 186548 DEBUG nova.network.neutron [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updated VIF entry in instance network info cache for port 775bdf61-ad38-473b-9d53-c3195f2be472. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.299 186548 DEBUG nova.network.neutron [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updating instance_info_cache with network_info: [{"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.333 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.333 186548 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.334 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.334 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.334 186548 DEBUG oslo_concurrency.lockutils [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.334 186548 DEBUG nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.335 186548 WARNING nova.compute.manager [req-913719a8-f147-4247-8dfc-c21c0f3b87c0 req-73d6ca62-2832-486a-889f-add723536eba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state active and task_state migrating.
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.336 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.336 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:46:35 compute-0 nova_compute[186544]: 2025-11-22 07:46:35.336 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 67b32f80-a795-43b0-bcb0-aeda0d9506d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:35 compute-0 podman[217837]: 2025-11-22 07:46:35.418219174 +0000 UTC m=+0.061233926 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.187 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797581.1865354, 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.187 186548 INFO nova.compute.manager [-] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] VM Stopped (Lifecycle Event)
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.205 186548 DEBUG nova.compute.manager [None req-b424a423-1b0b-43ad-9cfd-580811012bee - - - - - -] [instance: 5c2e8b5a-f96b-4224-a2fd-b3a8f7fc1f67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.505 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.506 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.506 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.527 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.528 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.528 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.529 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.592 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'user_id': '65ded9a5f9a7463d8c52561197054664', 'hostId': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.592 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.594 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 67b32f80-a795-43b0-bcb0-aeda0d9506d4 / tap775bdf61-ad inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.595 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6099cbdc-a549-4680-bb36-967d828a3584', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.593024', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f8336b0-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': 'cbbe0b86398aad46774dadaf72bfdacf6d0ea303dd4eea8f03eaf7a192f3bfe1'}]}, 'timestamp': '2025-11-22 07:46:36.595784', '_unique_id': 'd22e280baa8249eca394d02a82d0ac64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.596 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.598 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.598 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.598 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2031422221>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2031422221>]
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.598 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.602 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.620 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.read.latency volume: 601427875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.621 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.read.latency volume: 41607768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e13fa07f-59f1-4768-a193-b76b62bbd514', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 601427875, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.598842', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f8721da-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': 'f2b0ee6c3ebab117c37d580c07b84a1049b2b8ede512c5530a0ba860d67ed977'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41607768, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.598842', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f8734d6-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': '44bcd39b08bc14003418bc51401fd180eb38359263da96065805ba787f756245'}]}, 'timestamp': '2025-11-22 07:46:36.621859', '_unique_id': '3caa515a1eb9463d8381063cb054994b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.622 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.623 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.638 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/cpu volume: 12150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08fa4d3a-cc41-47e2-856b-0a62c266aab1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12150000000, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'timestamp': '2025-11-22T07:46:36.623981', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5f89c52a-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.32984752, 'message_signature': '7c53ac0c8e0e7664d9e1a0da95d7de7326050d1b1b749b0bf956b11d5f345aa2'}]}, 'timestamp': '2025-11-22 07:46:36.638709', '_unique_id': '9da1716ef45b41609b50a5bff5f59fae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.640 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.640 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f11db83-b924-423e-a3dc-e01e47323834', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.640695', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f8a2222-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': '2f51af5d0d347db655c1560163be1d8b65cdce1b09732ea9ec84804191e5e724'}]}, 'timestamp': '2025-11-22 07:46:36.641095', '_unique_id': 'd29f4b0ac8094b06bf8f3a5d73a3f57a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.642 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.643 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1bb05dc-25f7-4ec1-a59a-8f60658aad4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.643138', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f8a8244-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': 'a0dac6b236e08aa13c1bdd70279cd9904a8c658f0b097879e902fa634ce58788'}]}, 'timestamp': '2025-11-22 07:46:36.643554', '_unique_id': 'f407ed67a7274aeba2138725bf844bcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.645 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.645 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.write.requests volume: 307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.645 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8055c0b-3a53-44be-b34b-bafbbf95697a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 307, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.645547', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f8adf3c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': 'e3d69bf9d33fea0ecd41a3f04321128507c2fe0b2408647b1eaa3227c8472cb1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.645547', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f8aec98-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': 'e8b4bd25fa1b4c55c5cf07488bf7600bd2316cc9cdf0e9bcbb2f13ec2806d0bc'}]}, 'timestamp': '2025-11-22 07:46:36.646255', '_unique_id': 'db4cff0a0a71479fae0db0070df748d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.646 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.648 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.657 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.657 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1deab03f-f460-449c-ab5d-4b417351a720', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.648220', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f8cb0f0-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.340054959, 'message_signature': '4d46b917aebe82bce6c536213435b2387f048442ec90e41b2ea1899a3628c245'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.648220', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f8cbd98-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.340054959, 'message_signature': '3e0cb62334e9be8790447a274ad636c3b697320c08b169b1c1a57803ee1ccdc9'}]}, 'timestamp': '2025-11-22 07:46:36.658182', '_unique_id': '46f93fc897db45ebb4b9a3f6c39cb492'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.660 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.read.requests volume: 1062 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.660 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.660 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3353ef0-3766-4624-be62-1cfa009bd529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1062, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.660255', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f8d1d4c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': 'ecd35b9e8aed835e9acc67ebcfb95892b67a1fa55b72f051f439f1c46278e979'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.660255', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f8d27a6-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': '7f3f5e8735f0280fecb188741e16972101335e9fb59d25c3bff764148bde1bba'}]}, 'timestamp': '2025-11-22 07:46:36.660829', '_unique_id': '4cdc7490d3bc419d8dc0a79d3da443b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.661 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.662 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.662 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.662 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2031422221>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2031422221>]
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.662 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.662 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.663 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d4e3138-1d55-4d5c-a124-6b8c247ae70b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.662954', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f8d853e-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.340054959, 'message_signature': 'eb00a4537f39275049b7bcc95359f123bed9d6fa86db72c68d881f7bccdd6464'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.662954', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f8d91aa-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.340054959, 'message_signature': 'fc17624bc1d305abf9859eeabfdade653015370510657f447ad3850eb6a1adec'}]}, 'timestamp': '2025-11-22 07:46:36.663543', '_unique_id': '9ac346a55fdf4c27b896a11058ad041b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.665 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28aaf271-354f-4469-8749-b4acf1873f99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'timestamp': '2025-11-22T07:46:36.665345', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5f8de4c0-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.32984752, 'message_signature': '767e3f4e815d45be4eb0d84db19ccb1b26355aec3c8f5b9262515eec28310454'}]}, 'timestamp': '2025-11-22 07:46:36.665673', '_unique_id': '0d429d10b8ae4e52beddd5f2ae3fa5b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.667 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.667 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8eac9625-ceff-4d24-be3e-ddf2ce1dd61f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.667211', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f8e2ce6-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': '5167a4d198843fdf1cf945e3befc2202e1a57ffbb5d2a18b8afc964f29d95820'}]}, 'timestamp': '2025-11-22 07:46:36.667585', '_unique_id': '113b2c866b234b8294843ad2c10a36c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.669 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45ad1c68-09b8-47c3-b81a-c970c874bfe3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.669414', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f8e8b28-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': '64cd0658bb9dc089b7d32e248f7594e3d6f9a24bdc486a98a9b293a561f61651'}]}, 'timestamp': '2025-11-22 07:46:36.669964', '_unique_id': '6f5d97ed38ed40fe9317b1adcf47c666'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.671 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7391bee5-b8da-4823-8eb8-b631241bdff3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.671684', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f8edc7c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': 'ea3a771089ce56531263ef09b362bef1f7954b42bf8ed30a849bc63385620e6a'}]}, 'timestamp': '2025-11-22 07:46:36.672026', '_unique_id': '52be4e45ca5b415eb8966cfa8775ca9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.673 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.outgoing.bytes volume: 1438 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd488df52-4839-4a90-929e-dc35a0f325b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1438, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.673814', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f8f2d80-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': 'ffd42700f4f9a8dcce931601fd377b6628e0e52de22247f516806092b5c987e2'}]}, 'timestamp': '2025-11-22 07:46:36.674116', '_unique_id': 'ea80801290cd49708c09df0a63a71211'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.675 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.write.bytes volume: 72843264 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5817ef1-1c84-4e36-9022-725b46c5e121', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72843264, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.675702', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f8f7934-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': '96019390b5ed68109867f37d79e7a80d2578bd4e11128be2ed1225ae17e1d4a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.675702', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f8f83ac-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': 'ee4f070e605f87bc0f74f264da7f542ba50d7836cd5fb617d2a680920614dfa2'}]}, 'timestamp': '2025-11-22 07:46:36.676324', '_unique_id': '58b9e7d331ba4e9fa4418d982c630a3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c12f87c-8122-490a-804a-f8ea3507a8f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.678017', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f8fd226-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': '1d9530ef770a9519797123e44478b7a9d0df1e852dfb71c28ff26e5f2fbe21a5'}]}, 'timestamp': '2025-11-22 07:46:36.678354', '_unique_id': '78bfcb0c99de4d68a06cbad1d4900cbc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.679 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.680 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2031422221>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2031422221>]
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.680 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.680 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2031422221>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2031422221>]
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.680 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.write.latency volume: 2982489683 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b92c49b5-291b-45ac-b546-10d853b69804', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2982489683, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.680768', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f903c7a-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': '18b60216fc5e1d9f998e399bc70f85eb05c3eab6a5223cf4267357cdccd66e4a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.680768', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f9047ce-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': '8ea91e60eee92649d65b39621bafe473e4556f034796fcc965d5b2fd0c3a9ecc'}]}, 'timestamp': '2025-11-22 07:46:36.681354', '_unique_id': '30203f3ca8bf4bc5a8fa15844c3bc2c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.682 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d9810da-3e5f-473d-a508-fa1c655b2589', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.683050', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f9096c0-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': 'a1df9eae3a384ea27249eaaefc1b124230c90998bc5bff36e828cac02d266367'}]}, 'timestamp': '2025-11-22 07:46:36.683377', '_unique_id': '0aa69f7e27474990adc7d1b442c4d205'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.685 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.685 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd386d16-9e6c-430e-a13a-40ed884f6565', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.685207', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f90ed8c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.340054959, 'message_signature': '8791cac21cd8957dabd77e4d211c740e81f3038ea102a26a892c1104ba1857a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.685207', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f90f746-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.340054959, 'message_signature': '3a30cf0b577977398e32fbc6914d55f67485a177372f959617eda445ef36a914'}]}, 'timestamp': '2025-11-22 07:46:36.685802', '_unique_id': '8dff3168896d4b0ab57b0176a968a590'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.687 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.read.bytes volume: 29510144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.687 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d0c1c6a-8566-455a-9bb8-7a44806a791a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29510144, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-vda', 'timestamp': '2025-11-22T07:46:36.687378', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5f913efe-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': 'da26bc67e9e5ca8b9b9b2743c0b8c5997db83c42e6f1297a70fec6e19fec07de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4-sda', 'timestamp': '2025-11-22T07:46:36.687378', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'instance-0000001d', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5f914854-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.290638002, 'message_signature': '3784358ae560be6a55e3d5345d4002d134a6a70dba49411b59b784e40012ec44'}]}, 'timestamp': '2025-11-22 07:46:36.687875', '_unique_id': '33138989e0694b268b5f38c6b99f0478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.689 12 DEBUG ceilometer.compute.pollsters [-] 67b32f80-a795-43b0-bcb0-aeda0d9506d4/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d93f29c-2bb9-4928-89c5-8059ac83bed0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': '65ded9a5f9a7463d8c52561197054664', 'user_name': None, 'project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'project_name': None, 'resource_id': 'instance-0000001d-67b32f80-a795-43b0-bcb0-aeda0d9506d4-tap775bdf61-ad', 'timestamp': '2025-11-22T07:46:36.689462', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2031422221', 'name': 'tap775bdf61-ad', 'instance_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'instance_type': 'm1.nano', 'host': '826faeb4f15457be627ab93fd0598fadb2254bb02a7cb545a1c75818', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ec:d2:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap775bdf61-ad'}, 'message_id': '5f91907a-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4337.284801428, 'message_signature': 'f6d19678c96c614027a9414f980bab7167c7d39b2626d09caaeeb2f4fc57bcfe'}]}, 'timestamp': '2025-11-22 07:46:36.689739', '_unique_id': '4d2f9d701522425c8a0505ae76e697c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:46:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.714 186548 DEBUG oslo_concurrency.processutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.860 186548 WARNING nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.861 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5490MB free_disk=73.39607620239258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.861 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.862 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.937 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Migration for instance 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 22 07:46:36 compute-0 nova_compute[186544]: 2025-11-22 07:46:36.962 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.000 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Instance 67b32f80-a795-43b0-bcb0-aeda0d9506d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.000 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Migration 5d25fe7b-81d5-4260-b852-0444a04944ca is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.001 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.001 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.074 186548 DEBUG nova.compute.provider_tree [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.088 186548 DEBUG nova.scheduler.client.report [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.110 186548 DEBUG nova.compute.resource_tracker [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.111 186548 DEBUG oslo_concurrency.lockutils [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.120 186548 INFO nova.compute.manager [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.280 186548 INFO nova.scheduler.client.report [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Deleted allocation for migration 5d25fe7b-81d5-4260-b852-0444a04944ca
Nov 22 07:46:37 compute-0 nova_compute[186544]: 2025-11-22 07:46:37.280 186548 DEBUG nova.virt.libvirt.driver [None req-1c1bb8fe-3d8a-4e24-928e-6785dfacd488 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 22 07:46:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:37.315 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:37.315 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:37.316 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.554 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updating instance_info_cache with network_info: [{"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.580 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.580 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.580 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.581 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.581 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.582 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.582 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.934 186548 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating tmpfile /var/lib/nova/instances/tmppob76voc to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 22 07:46:39 compute-0 nova_compute[186544]: 2025-11-22 07:46:39.934 186548 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppob76voc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 22 07:46:40 compute-0 nova_compute[186544]: 2025-11-22 07:46:40.039 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:40 compute-0 nova_compute[186544]: 2025-11-22 07:46:40.174 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:40 compute-0 podman[217863]: 2025-11-22 07:46:40.399535862 +0000 UTC m=+0.054243277 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:46:41 compute-0 nova_compute[186544]: 2025-11-22 07:46:41.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:41 compute-0 nova_compute[186544]: 2025-11-22 07:46:41.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:41 compute-0 nova_compute[186544]: 2025-11-22 07:46:41.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:46:41 compute-0 nova_compute[186544]: 2025-11-22 07:46:41.455 186548 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppob76voc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 22 07:46:41 compute-0 nova_compute[186544]: 2025-11-22 07:46:41.493 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:41 compute-0 nova_compute[186544]: 2025-11-22 07:46:41.494 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquired lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:41 compute-0 nova_compute[186544]: 2025-11-22 07:46:41.494 186548 DEBUG nova.network.neutron [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:46:42 compute-0 nova_compute[186544]: 2025-11-22 07:46:42.388 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797587.3862967, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:42 compute-0 nova_compute[186544]: 2025-11-22 07:46:42.389 186548 INFO nova.compute.manager [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Stopped (Lifecycle Event)
Nov 22 07:46:42 compute-0 nova_compute[186544]: 2025-11-22 07:46:42.420 186548 DEBUG nova.compute.manager [None req-5beeca8c-9df1-4fc2-b0e5-7192d1d06030 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:42.916 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:46:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:42.916 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:46:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:42.917 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:42 compute-0 nova_compute[186544]: 2025-11-22 07:46:42.917 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.109 186548 DEBUG nova.compute.manager [req-3faa9ba9-0696-466c-b1a7-70840e1116a2 req-625d09c8-f752-4ebf-9558-fd43473dc38f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-changed-775bdf61-ad38-473b-9d53-c3195f2be472 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.109 186548 DEBUG nova.compute.manager [req-3faa9ba9-0696-466c-b1a7-70840e1116a2 req-625d09c8-f752-4ebf-9558-fd43473dc38f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Refreshing instance network info cache due to event network-changed-775bdf61-ad38-473b-9d53-c3195f2be472. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.110 186548 DEBUG oslo_concurrency.lockutils [req-3faa9ba9-0696-466c-b1a7-70840e1116a2 req-625d09c8-f752-4ebf-9558-fd43473dc38f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.110 186548 DEBUG oslo_concurrency.lockutils [req-3faa9ba9-0696-466c-b1a7-70840e1116a2 req-625d09c8-f752-4ebf-9558-fd43473dc38f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.110 186548 DEBUG nova.network.neutron [req-3faa9ba9-0696-466c-b1a7-70840e1116a2 req-625d09c8-f752-4ebf-9558-fd43473dc38f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Refreshing network info cache for port 775bdf61-ad38-473b-9d53-c3195f2be472 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.375 186548 DEBUG nova.network.neutron [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.407 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Releasing lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.415 186548 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppob76voc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.415 186548 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating instance directory: /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.416 186548 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Creating disk.info with the contents: {'/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk': 'qcow2', '/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.416 186548 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.417 186548 DEBUG nova.objects.instance [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.441 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.501 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.502 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.503 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.517 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.574 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.575 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.610 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.611 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.611 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.665 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.667 186548 DEBUG nova.virt.disk.api [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Checking if we can resize image /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.668 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.725 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.726 186548 DEBUG nova.virt.disk.api [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Cannot resize image /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.726 186548 DEBUG nova.objects.instance [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'migration_context' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.742 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.766 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.767 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config to /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:46:43 compute-0 nova_compute[186544]: 2025-11-22 07:46:43.767 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.284 186548 DEBUG oslo_concurrency.processutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk.config /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.285 186548 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.286 186548 DEBUG nova.virt.libvirt.vif [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:34Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.286 186548 DEBUG nova.network.os_vif_util [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.286 186548 DEBUG nova.network.os_vif_util [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.287 186548 DEBUG os_vif [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.287 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.288 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.288 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.291 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.291 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap973cdfc2-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.292 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap973cdfc2-4a, col_values=(('external_ids', {'iface-id': '973cdfc2-4ad8-4f41-b383-4b64b1b5433f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:9b:98', 'vm-uuid': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.293 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 NetworkManager[55036]: <info>  [1763797604.2942] manager: (tap973cdfc2-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.296 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.299 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.299 186548 INFO os_vif [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a')
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.300 186548 DEBUG nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.300 186548 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppob76voc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.516 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.516 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.517 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.517 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.517 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.525 186548 INFO nova.compute.manager [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Terminating instance
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.531 186548 DEBUG nova.compute.manager [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:46:44 compute-0 kernel: tap775bdf61-ad (unregistering): left promiscuous mode
Nov 22 07:46:44 compute-0 NetworkManager[55036]: <info>  [1763797604.5518] device (tap775bdf61-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:46:44 compute-0 ovn_controller[94843]: 2025-11-22T07:46:44Z|00119|binding|INFO|Releasing lport 775bdf61-ad38-473b-9d53-c3195f2be472 from this chassis (sb_readonly=0)
Nov 22 07:46:44 compute-0 ovn_controller[94843]: 2025-11-22T07:46:44Z|00120|binding|INFO|Setting lport 775bdf61-ad38-473b-9d53-c3195f2be472 down in Southbound
Nov 22 07:46:44 compute-0 ovn_controller[94843]: 2025-11-22T07:46:44Z|00121|binding|INFO|Removing iface tap775bdf61-ad ovn-installed in OVS
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.556 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.567 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:d2:78 10.100.0.3'], port_security=['fa:16:3e:ec:d2:78 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '67b32f80-a795-43b0-bcb0-aeda0d9506d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef27a8ab9a794f7782ac89b9c28c893a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ba545ee-7ef7-4120-9b36-dfb927d132f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=750112b4-7c3d-47fc-a624-7726e73fdc53, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=775bdf61-ad38-473b-9d53-c3195f2be472) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.568 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 775bdf61-ad38-473b-9d53-c3195f2be472 in datapath 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 unbound from our chassis
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.569 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.570 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1812dc-e119-4436-91f0-51905d476008]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.570 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 namespace which is not needed anymore
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.573 186548 DEBUG nova.network.neutron [req-3faa9ba9-0696-466c-b1a7-70840e1116a2 req-625d09c8-f752-4ebf-9558-fd43473dc38f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updated VIF entry in instance network info cache for port 775bdf61-ad38-473b-9d53-c3195f2be472. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.574 186548 DEBUG nova.network.neutron [req-3faa9ba9-0696-466c-b1a7-70840e1116a2 req-625d09c8-f752-4ebf-9558-fd43473dc38f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updating instance_info_cache with network_info: [{"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.576 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Nov 22 07:46:44 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001d.scope: Consumed 14.073s CPU time.
Nov 22 07:46:44 compute-0 systemd-machined[152872]: Machine qemu-15-instance-0000001d terminated.
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.607 186548 DEBUG oslo_concurrency.lockutils [req-3faa9ba9-0696-466c-b1a7-70840e1116a2 req-625d09c8-f752-4ebf-9558-fd43473dc38f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-67b32f80-a795-43b0-bcb0-aeda0d9506d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:44 compute-0 podman[217912]: 2025-11-22 07:46:44.662075492 +0000 UTC m=+0.080595752 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 22 07:46:44 compute-0 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217563]: [NOTICE]   (217567) : haproxy version is 2.8.14-c23fe91
Nov 22 07:46:44 compute-0 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217563]: [NOTICE]   (217567) : path to executable is /usr/sbin/haproxy
Nov 22 07:46:44 compute-0 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217563]: [WARNING]  (217567) : Exiting Master process...
Nov 22 07:46:44 compute-0 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217563]: [WARNING]  (217567) : Exiting Master process...
Nov 22 07:46:44 compute-0 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217563]: [ALERT]    (217567) : Current worker (217569) exited with code 143 (Terminated)
Nov 22 07:46:44 compute-0 neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202[217563]: [WARNING]  (217567) : All workers exited. Exiting... (0)
Nov 22 07:46:44 compute-0 systemd[1]: libpod-685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318.scope: Deactivated successfully.
Nov 22 07:46:44 compute-0 podman[217955]: 2025-11-22 07:46:44.693999383 +0000 UTC m=+0.044565231 container died 685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 07:46:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318-userdata-shm.mount: Deactivated successfully.
Nov 22 07:46:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-1fa35a465e8338ede6406b212e8b8c9b5f09193d8fdb9a01ff9704ac61004a2d-merged.mount: Deactivated successfully.
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.751 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 podman[217955]: 2025-11-22 07:46:44.757468007 +0000 UTC m=+0.108033855 container cleanup 685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.760 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 systemd[1]: libpod-conmon-685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318.scope: Deactivated successfully.
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.792 186548 INFO nova.virt.libvirt.driver [-] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Instance destroyed successfully.
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.793 186548 DEBUG nova.objects.instance [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lazy-loading 'resources' on Instance uuid 67b32f80-a795-43b0-bcb0-aeda0d9506d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.809 186548 DEBUG nova.virt.libvirt.vif [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:46:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2031422221',display_name='tempest-FloatingIPsAssociationTestJSON-server-2031422221',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2031422221',id=29,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ef27a8ab9a794f7782ac89b9c28c893a',ramdisk_id='',reservation_id='r-41g0d0jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1465053098',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1465053098-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:46:16Z,user_data=None,user_id='65ded9a5f9a7463d8c52561197054664',uuid=67b32f80-a795-43b0-bcb0-aeda0d9506d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.809 186548 DEBUG nova.network.os_vif_util [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converting VIF {"id": "775bdf61-ad38-473b-9d53-c3195f2be472", "address": "fa:16:3e:ec:d2:78", "network": {"id": "9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-440990750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef27a8ab9a794f7782ac89b9c28c893a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap775bdf61-ad", "ovs_interfaceid": "775bdf61-ad38-473b-9d53-c3195f2be472", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.811 186548 DEBUG nova.network.os_vif_util [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ec:d2:78,bridge_name='br-int',has_traffic_filtering=True,id=775bdf61-ad38-473b-9d53-c3195f2be472,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap775bdf61-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.811 186548 DEBUG os_vif [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ec:d2:78,bridge_name='br-int',has_traffic_filtering=True,id=775bdf61-ad38-473b-9d53-c3195f2be472,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap775bdf61-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.813 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.813 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap775bdf61-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.849 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.851 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.853 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.855 186548 INFO os_vif [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ec:d2:78,bridge_name='br-int',has_traffic_filtering=True,id=775bdf61-ad38-473b-9d53-c3195f2be472,network=Network(9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap775bdf61-ad')
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.855 186548 INFO nova.virt.libvirt.driver [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Deleting instance files /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4_del
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.856 186548 INFO nova.virt.libvirt.driver [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Deletion of /var/lib/nova/instances/67b32f80-a795-43b0-bcb0-aeda0d9506d4_del complete
Nov 22 07:46:44 compute-0 podman[217998]: 2025-11-22 07:46:44.859531234 +0000 UTC m=+0.079974349 container remove 685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.864 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ead077ce-58e7-4e30-8a96-e496cddaf255]: (4, ('Sat Nov 22 07:46:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 (685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318)\n685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318\nSat Nov 22 07:46:44 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 (685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318)\n685d541a8ec994817fa4a84407bd57758a0e39f86ddd1c7e4bfc1c3420e31318\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.865 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[032579cf-d0d7-4d10-b187-2a8838684576]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.866 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dfbfc3c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.867 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 kernel: tap9dfbfc3c-a0: left promiscuous mode
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.881 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.884 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[78688932-40f2-4ab7-b4af-544e0f80fa89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.903 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8612829a-9380-44c4-84ab-0e4bdf28b0ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.904 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0238e3-a041-4e06-bfe7-c4b2d9bf7e72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.918 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6103c9c2-2b42-4701-9a93-ceae153c3003]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431658, 'reachable_time': 19773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218023, 'error': None, 'target': 'ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.920 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9dfbfc3c-a1e1-4f51-927e-fe8b0b4a7202 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:46:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:44.920 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[a8243c7b-bfcc-4b79-a3ec-beb85072396a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d9dfbfc3c\x2da1e1\x2d4f51\x2d927e\x2dfe8b0b4a7202.mount: Deactivated successfully.
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.921 186548 INFO nova.compute.manager [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.922 186548 DEBUG oslo.service.loopingcall [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.922 186548 DEBUG nova.compute.manager [-] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:46:44 compute-0 nova_compute[186544]: 2025-11-22 07:46:44.922 186548 DEBUG nova.network.neutron [-] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.176 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.329 186548 DEBUG nova.compute.manager [req-f61da559-238f-42ed-87d6-59b0ed084f11 req-f3dbf051-3bf0-405c-aa82-8be9c4320f3d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-vif-unplugged-775bdf61-ad38-473b-9d53-c3195f2be472 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.330 186548 DEBUG oslo_concurrency.lockutils [req-f61da559-238f-42ed-87d6-59b0ed084f11 req-f3dbf051-3bf0-405c-aa82-8be9c4320f3d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.330 186548 DEBUG oslo_concurrency.lockutils [req-f61da559-238f-42ed-87d6-59b0ed084f11 req-f3dbf051-3bf0-405c-aa82-8be9c4320f3d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.330 186548 DEBUG oslo_concurrency.lockutils [req-f61da559-238f-42ed-87d6-59b0ed084f11 req-f3dbf051-3bf0-405c-aa82-8be9c4320f3d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.330 186548 DEBUG nova.compute.manager [req-f61da559-238f-42ed-87d6-59b0ed084f11 req-f3dbf051-3bf0-405c-aa82-8be9c4320f3d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] No waiting events found dispatching network-vif-unplugged-775bdf61-ad38-473b-9d53-c3195f2be472 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.330 186548 DEBUG nova.compute.manager [req-f61da559-238f-42ed-87d6-59b0ed084f11 req-f3dbf051-3bf0-405c-aa82-8be9c4320f3d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-vif-unplugged-775bdf61-ad38-473b-9d53-c3195f2be472 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.696 186548 DEBUG nova.network.neutron [-] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.713 186548 INFO nova.compute.manager [-] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Took 0.79 seconds to deallocate network for instance.
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.784 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.784 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.816 186548 DEBUG nova.compute.manager [req-2ac4a7a7-b4b2-4f84-ab46-78e5bc3b5765 req-5648f63f-a187-404b-9596-8df2c11d4ff0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-vif-deleted-775bdf61-ad38-473b-9d53-c3195f2be472 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.870 186548 DEBUG nova.compute.provider_tree [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.903 186548 DEBUG nova.scheduler.client.report [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.949 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:45 compute-0 nova_compute[186544]: 2025-11-22 07:46:45.989 186548 INFO nova.scheduler.client.report [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Deleted allocations for instance 67b32f80-a795-43b0-bcb0-aeda0d9506d4
Nov 22 07:46:46 compute-0 nova_compute[186544]: 2025-11-22 07:46:46.077 186548 DEBUG oslo_concurrency.lockutils [None req-afa8fb01-1682-4114-ad16-d66f12034764 65ded9a5f9a7463d8c52561197054664 ef27a8ab9a794f7782ac89b9c28c893a - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:46 compute-0 nova_compute[186544]: 2025-11-22 07:46:46.080 186548 DEBUG nova.network.neutron [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 22 07:46:46 compute-0 nova_compute[186544]: 2025-11-22 07:46:46.085 186548 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppob76voc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09e8ccfc-6ae9-4a06-ae76-7e059f50ac44',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 22 07:46:46 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 22 07:46:46 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 22 07:46:46 compute-0 kernel: tap973cdfc2-4a: entered promiscuous mode
Nov 22 07:46:46 compute-0 systemd-udevd[217926]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:46:46 compute-0 ovn_controller[94843]: 2025-11-22T07:46:46Z|00122|binding|INFO|Claiming lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f for this additional chassis.
Nov 22 07:46:46 compute-0 ovn_controller[94843]: 2025-11-22T07:46:46Z|00123|binding|INFO|973cdfc2-4ad8-4f41-b383-4b64b1b5433f: Claiming fa:16:3e:02:9b:98 10.100.0.14
Nov 22 07:46:46 compute-0 nova_compute[186544]: 2025-11-22 07:46:46.381 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:46 compute-0 NetworkManager[55036]: <info>  [1763797606.3832] manager: (tap973cdfc2-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Nov 22 07:46:46 compute-0 NetworkManager[55036]: <info>  [1763797606.3955] device (tap973cdfc2-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:46:46 compute-0 NetworkManager[55036]: <info>  [1763797606.3961] device (tap973cdfc2-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:46:46 compute-0 nova_compute[186544]: 2025-11-22 07:46:46.399 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:46 compute-0 ovn_controller[94843]: 2025-11-22T07:46:46Z|00124|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f ovn-installed in OVS
Nov 22 07:46:46 compute-0 nova_compute[186544]: 2025-11-22 07:46:46.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:46 compute-0 systemd-machined[152872]: New machine qemu-16-instance-0000001c.
Nov 22 07:46:46 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-0000001c.
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.320 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797607.3198922, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.321 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Started (Lifecycle Event)
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.341 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.468 186548 DEBUG nova.compute.manager [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received event network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.468 186548 DEBUG oslo_concurrency.lockutils [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.469 186548 DEBUG oslo_concurrency.lockutils [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.469 186548 DEBUG oslo_concurrency.lockutils [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "67b32f80-a795-43b0-bcb0-aeda0d9506d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.469 186548 DEBUG nova.compute.manager [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] No waiting events found dispatching network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:46:47 compute-0 nova_compute[186544]: 2025-11-22 07:46:47.470 186548 WARNING nova.compute.manager [req-436e3978-b4bd-4100-b325-6747970d5b09 req-2c6f44e1-a0b6-42b7-a2f4-c01ada1b545c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Received unexpected event network-vif-plugged-775bdf61-ad38-473b-9d53-c3195f2be472 for instance with vm_state deleted and task_state None.
Nov 22 07:46:48 compute-0 nova_compute[186544]: 2025-11-22 07:46:48.322 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797608.3211102, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:48 compute-0 nova_compute[186544]: 2025-11-22 07:46:48.323 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Resumed (Lifecycle Event)
Nov 22 07:46:48 compute-0 nova_compute[186544]: 2025-11-22 07:46:48.357 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:48 compute-0 nova_compute[186544]: 2025-11-22 07:46:48.360 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:46:48 compute-0 nova_compute[186544]: 2025-11-22 07:46:48.380 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 22 07:46:49 compute-0 nova_compute[186544]: 2025-11-22 07:46:49.849 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:50 compute-0 ovn_controller[94843]: 2025-11-22T07:46:50Z|00125|binding|INFO|Claiming lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f for this chassis.
Nov 22 07:46:50 compute-0 ovn_controller[94843]: 2025-11-22T07:46:50Z|00126|binding|INFO|973cdfc2-4ad8-4f41-b383-4b64b1b5433f: Claiming fa:16:3e:02:9b:98 10.100.0.14
Nov 22 07:46:50 compute-0 ovn_controller[94843]: 2025-11-22T07:46:50Z|00127|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f up in Southbound
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.131 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:9b:98 10.100.0.14'], port_security=['fa:16:3e:02:9b:98 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '21', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=973cdfc2-4ad8-4f41-b383-4b64b1b5433f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.133 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 bound to our chassis
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.134 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.146 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[84c95ff9-6c17-42e8-a99b-ad1f2dedd18e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.147 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc3f966e1-81 in ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.148 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc3f966e1-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.148 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fc53ebcb-3bba-425b-aeac-db7065e2d8ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.149 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4e966c-04fd-44eb-9749-7744cd5d01a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.158 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[1cf75f25-8c2c-4b62-b173-af36e68164ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 nova_compute[186544]: 2025-11-22 07:46:50.177 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.181 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5db51f-d133-4870-9f5f-82090e598e4e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.209 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb3195f-af06-4456-92ae-a978a98ff545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 NetworkManager[55036]: <info>  [1763797610.2152] manager: (tapc3f966e1-80): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.215 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[94af9d75-e3aa-4237-8f86-fa5b0472d43b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 systemd-udevd[218095]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.239 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[84a3e8e3-66b6-4a48-aed4-c2f9e3fc54f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.241 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0ac463-9dce-4d0e-bf63-f478f9a60fd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 NetworkManager[55036]: <info>  [1763797610.2653] device (tapc3f966e1-80): carrier: link connected
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.269 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f19b13d8-4961-487a-8fba-797d14b3edc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.286 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e52d1e-820c-4420-aeb8-85e69897f1f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435090, 'reachable_time': 27263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218114, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.301 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb919a1-146b-4fce-b6f0-abe106ee323d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:7499'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435090, 'tstamp': 435090}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218115, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.314 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f6bf367d-32a0-4209-a015-d6199a28944a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435090, 'reachable_time': 27263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218116, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.343 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2d7c27-3f40-471d-8360-5264875ab2de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.407 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fb3d1c3b-4076-48dd-a1f2-5590b90952fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.409 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.409 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.410 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3f966e1-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:50 compute-0 nova_compute[186544]: 2025-11-22 07:46:50.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:50 compute-0 NetworkManager[55036]: <info>  [1763797610.4377] manager: (tapc3f966e1-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Nov 22 07:46:50 compute-0 kernel: tapc3f966e1-80: entered promiscuous mode
Nov 22 07:46:50 compute-0 nova_compute[186544]: 2025-11-22 07:46:50.439 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.442 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc3f966e1-80, col_values=(('external_ids', {'iface-id': '8206cb6d-dd78-493d-a276-fccb0eeecc7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:46:50 compute-0 nova_compute[186544]: 2025-11-22 07:46:50.444 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:50 compute-0 ovn_controller[94843]: 2025-11-22T07:46:50Z|00128|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 07:46:50 compute-0 nova_compute[186544]: 2025-11-22 07:46:50.455 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.456 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.457 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5bda39e5-7732-4d58-980d-4b89c559a431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.458 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.pid.haproxy
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:46:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:46:50.459 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'env', 'PROCESS_TAG=haproxy-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c3f966e1-8cff-4ca0-9b4f-a318c31b0a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:46:50 compute-0 podman[218149]: 2025-11-22 07:46:50.820578704 +0000 UTC m=+0.050246200 container create 657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 07:46:50 compute-0 systemd[1]: Started libpod-conmon-657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529.scope.
Nov 22 07:46:50 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:46:50 compute-0 podman[218149]: 2025-11-22 07:46:50.794460445 +0000 UTC m=+0.024127961 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:46:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39d45c0eb52508004f068b426dba2cdef6d1aeb39842a07373632a882fae392d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:46:50 compute-0 podman[218149]: 2025-11-22 07:46:50.904406506 +0000 UTC m=+0.134074092 container init 657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:46:50 compute-0 podman[218149]: 2025-11-22 07:46:50.909537601 +0000 UTC m=+0.139205097 container start 657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 07:46:50 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[218164]: [NOTICE]   (218168) : New worker (218170) forked
Nov 22 07:46:50 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[218164]: [NOTICE]   (218168) : Loading success.
Nov 22 07:46:50 compute-0 nova_compute[186544]: 2025-11-22 07:46:50.951 186548 INFO nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Post operation of migration started
Nov 22 07:46:51 compute-0 nova_compute[186544]: 2025-11-22 07:46:51.636 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:46:51 compute-0 nova_compute[186544]: 2025-11-22 07:46:51.637 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquired lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:46:51 compute-0 nova_compute[186544]: 2025-11-22 07:46:51.637 186548 DEBUG nova.network.neutron [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:46:53 compute-0 nova_compute[186544]: 2025-11-22 07:46:53.644 186548 DEBUG nova.network.neutron [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [{"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:46:53 compute-0 nova_compute[186544]: 2025-11-22 07:46:53.661 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Releasing lock "refresh_cache-09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:46:53 compute-0 nova_compute[186544]: 2025-11-22 07:46:53.679 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:53 compute-0 nova_compute[186544]: 2025-11-22 07:46:53.679 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:53 compute-0 nova_compute[186544]: 2025-11-22 07:46:53.679 186548 DEBUG oslo_concurrency.lockutils [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:53 compute-0 nova_compute[186544]: 2025-11-22 07:46:53.683 186548 INFO nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 22 07:46:53 compute-0 virtqemud[186092]: Domain id=16 name='instance-0000001c' uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 is tainted: custom-monitor
Nov 22 07:46:54 compute-0 nova_compute[186544]: 2025-11-22 07:46:54.694 186548 INFO nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 22 07:46:54 compute-0 nova_compute[186544]: 2025-11-22 07:46:54.890 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:55 compute-0 nova_compute[186544]: 2025-11-22 07:46:55.179 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:46:55 compute-0 nova_compute[186544]: 2025-11-22 07:46:55.700 186548 INFO nova.virt.libvirt.driver [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 22 07:46:55 compute-0 nova_compute[186544]: 2025-11-22 07:46:55.705 186548 DEBUG nova.compute.manager [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:55 compute-0 nova_compute[186544]: 2025-11-22 07:46:55.766 186548 DEBUG nova.objects.instance [None req-6263406b-8835-4379-827e-e11c0511f3a3 b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:46:56 compute-0 podman[218179]: 2025-11-22 07:46:56.405184437 +0000 UTC m=+0.052929866 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 22 07:46:56 compute-0 podman[218180]: 2025-11-22 07:46:56.458135442 +0000 UTC m=+0.107142642 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.044 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.045 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.060 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.173 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.174 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:57 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.181 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.181 186548 INFO nova.compute.claims [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.400 186548 DEBUG nova.compute.provider_tree [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.418 186548 DEBUG nova.scheduler.client.report [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.447 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.448 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.524 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.524 186548 DEBUG nova.network.neutron [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.564 186548 INFO nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.606 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.787 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.788 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.789 186548 INFO nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Creating image(s)
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.789 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "/var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.790 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "/var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.790 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "/var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.805 186548 DEBUG nova.policy [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.807 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.866 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.867 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.868 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.880 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.935 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.936 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.968 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.969 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:57 compute-0 nova_compute[186544]: 2025-11-22 07:46:57.969 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.023 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.024 186548 DEBUG nova.virt.disk.api [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Checking if we can resize image /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.024 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.080 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.081 186548 DEBUG nova.virt.disk.api [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Cannot resize image /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.081 186548 DEBUG nova.objects.instance [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'migration_context' on Instance uuid 8734dffa-e3b3-424a-a8cd-d893c67bae4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.095 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.095 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Ensure instance console log exists: /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.096 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.096 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.096 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:46:58 compute-0 nova_compute[186544]: 2025-11-22 07:46:58.928 186548 DEBUG nova.network.neutron [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Successfully created port: 33114641-5801-4450-8fcb-fdf1dc0cb013 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:46:59 compute-0 podman[218241]: 2025-11-22 07:46:59.4078827 +0000 UTC m=+0.051859120 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:46:59 compute-0 nova_compute[186544]: 2025-11-22 07:46:59.791 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797604.7904239, 67b32f80-a795-43b0-bcb0-aeda0d9506d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:46:59 compute-0 nova_compute[186544]: 2025-11-22 07:46:59.792 186548 INFO nova.compute.manager [-] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] VM Stopped (Lifecycle Event)
Nov 22 07:46:59 compute-0 nova_compute[186544]: 2025-11-22 07:46:59.811 186548 DEBUG nova.compute.manager [None req-cd0a138c-7831-4215-81f9-04c4bbf0fcb3 - - - - - -] [instance: 67b32f80-a795-43b0-bcb0-aeda0d9506d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:46:59 compute-0 nova_compute[186544]: 2025-11-22 07:46:59.891 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.182 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.693 186548 DEBUG nova.network.neutron [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Successfully updated port: 33114641-5801-4450-8fcb-fdf1dc0cb013 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.712 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "refresh_cache-8734dffa-e3b3-424a-a8cd-d893c67bae4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.712 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquired lock "refresh_cache-8734dffa-e3b3-424a-a8cd-d893c67bae4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.713 186548 DEBUG nova.network.neutron [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.840 186548 DEBUG nova.compute.manager [req-d7ea55a6-4f85-4998-95be-608a33a45d72 req-925e089c-f8cf-4e9d-8a30-d1c30a5d444f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received event network-changed-33114641-5801-4450-8fcb-fdf1dc0cb013 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.840 186548 DEBUG nova.compute.manager [req-d7ea55a6-4f85-4998-95be-608a33a45d72 req-925e089c-f8cf-4e9d-8a30-d1c30a5d444f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Refreshing instance network info cache due to event network-changed-33114641-5801-4450-8fcb-fdf1dc0cb013. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.840 186548 DEBUG oslo_concurrency.lockutils [req-d7ea55a6-4f85-4998-95be-608a33a45d72 req-925e089c-f8cf-4e9d-8a30-d1c30a5d444f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-8734dffa-e3b3-424a-a8cd-d893c67bae4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:47:00 compute-0 nova_compute[186544]: 2025-11-22 07:47:00.932 186548 DEBUG nova.network.neutron [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:47:01 compute-0 ovn_controller[94843]: 2025-11-22T07:47:01Z|00129|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 07:47:01 compute-0 nova_compute[186544]: 2025-11-22 07:47:01.672 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:01 compute-0 ovn_controller[94843]: 2025-11-22T07:47:01Z|00130|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 07:47:01 compute-0 nova_compute[186544]: 2025-11-22 07:47:01.777 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.025 186548 DEBUG nova.network.neutron [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Updating instance_info_cache with network_info: [{"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.042 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Releasing lock "refresh_cache-8734dffa-e3b3-424a-a8cd-d893c67bae4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.042 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Instance network_info: |[{"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.042 186548 DEBUG oslo_concurrency.lockutils [req-d7ea55a6-4f85-4998-95be-608a33a45d72 req-925e089c-f8cf-4e9d-8a30-d1c30a5d444f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-8734dffa-e3b3-424a-a8cd-d893c67bae4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.043 186548 DEBUG nova.network.neutron [req-d7ea55a6-4f85-4998-95be-608a33a45d72 req-925e089c-f8cf-4e9d-8a30-d1c30a5d444f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Refreshing network info cache for port 33114641-5801-4450-8fcb-fdf1dc0cb013 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.045 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Start _get_guest_xml network_info=[{"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.050 186548 WARNING nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.054 186548 DEBUG nova.virt.libvirt.host [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.054 186548 DEBUG nova.virt.libvirt.host [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.061 186548 DEBUG nova.virt.libvirt.host [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.062 186548 DEBUG nova.virt.libvirt.host [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.063 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.063 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.064 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.064 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.064 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.064 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.064 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.065 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.065 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.065 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.065 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.066 186548 DEBUG nova.virt.hardware [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.068 186548 DEBUG nova.virt.libvirt.vif [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:46:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-656887238',display_name='tempest-ImagesOneServerNegativeTestJSON-server-656887238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-656887238',id=31,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-go5pt1p5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128522',owner_user_
name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:57Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=8734dffa-e3b3-424a-a8cd-d893c67bae4a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.069 186548 DEBUG nova.network.os_vif_util [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.069 186548 DEBUG nova.network.os_vif_util [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:dc,bridge_name='br-int',has_traffic_filtering=True,id=33114641-5801-4450-8fcb-fdf1dc0cb013,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33114641-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.070 186548 DEBUG nova.objects.instance [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8734dffa-e3b3-424a-a8cd-d893c67bae4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.084 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <uuid>8734dffa-e3b3-424a-a8cd-d893c67bae4a</uuid>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <name>instance-0000001f</name>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-656887238</nova:name>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:47:02</nova:creationTime>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:47:02 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:47:02 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:47:02 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:47:02 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:47:02 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:47:02 compute-0 nova_compute[186544]:         <nova:user uuid="b47fa480dd1c4c9f81da16b464195f2b">tempest-ImagesOneServerNegativeTestJSON-328128522-project-member</nova:user>
Nov 22 07:47:02 compute-0 nova_compute[186544]:         <nova:project uuid="b94109a356454dbda245fe5e57d0cd82">tempest-ImagesOneServerNegativeTestJSON-328128522</nova:project>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:47:02 compute-0 nova_compute[186544]:         <nova:port uuid="33114641-5801-4450-8fcb-fdf1dc0cb013">
Nov 22 07:47:02 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <system>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <entry name="serial">8734dffa-e3b3-424a-a8cd-d893c67bae4a</entry>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <entry name="uuid">8734dffa-e3b3-424a-a8cd-d893c67bae4a</entry>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </system>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <os>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   </os>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <features>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   </features>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk.config"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:95:60:dc"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <target dev="tap33114641-58"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/console.log" append="off"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <video>
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </video>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:47:02 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:47:02 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:47:02 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:47:02 compute-0 nova_compute[186544]: </domain>
Nov 22 07:47:02 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.085 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Preparing to wait for external event network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.086 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.086 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.086 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.087 186548 DEBUG nova.virt.libvirt.vif [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:46:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-656887238',display_name='tempest-ImagesOneServerNegativeTestJSON-server-656887238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-656887238',id=31,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-go5pt1p5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128522',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:46:57Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=8734dffa-e3b3-424a-a8cd-d893c67bae4a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.087 186548 DEBUG nova.network.os_vif_util [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.088 186548 DEBUG nova.network.os_vif_util [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:dc,bridge_name='br-int',has_traffic_filtering=True,id=33114641-5801-4450-8fcb-fdf1dc0cb013,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33114641-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.088 186548 DEBUG os_vif [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:dc,bridge_name='br-int',has_traffic_filtering=True,id=33114641-5801-4450-8fcb-fdf1dc0cb013,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33114641-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.089 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.089 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.089 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.093 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.093 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33114641-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.094 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap33114641-58, col_values=(('external_ids', {'iface-id': '33114641-5801-4450-8fcb-fdf1dc0cb013', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:60:dc', 'vm-uuid': '8734dffa-e3b3-424a-a8cd-d893c67bae4a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:02 compute-0 NetworkManager[55036]: <info>  [1763797622.0963] manager: (tap33114641-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.095 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.098 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.103 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.104 186548 INFO os_vif [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:dc,bridge_name='br-int',has_traffic_filtering=True,id=33114641-5801-4450-8fcb-fdf1dc0cb013,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33114641-58')
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.183 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.184 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.184 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No VIF found with MAC fa:16:3e:95:60:dc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.184 186548 INFO nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Using config drive
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.620 186548 INFO nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Creating config drive at /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk.config
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.626 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx3dsdgrj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.750 186548 DEBUG oslo_concurrency.processutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx3dsdgrj" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:02 compute-0 NetworkManager[55036]: <info>  [1763797622.8031] manager: (tap33114641-58): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Nov 22 07:47:02 compute-0 kernel: tap33114641-58: entered promiscuous mode
Nov 22 07:47:02 compute-0 ovn_controller[94843]: 2025-11-22T07:47:02Z|00131|binding|INFO|Claiming lport 33114641-5801-4450-8fcb-fdf1dc0cb013 for this chassis.
Nov 22 07:47:02 compute-0 ovn_controller[94843]: 2025-11-22T07:47:02Z|00132|binding|INFO|33114641-5801-4450-8fcb-fdf1dc0cb013: Claiming fa:16:3e:95:60:dc 10.100.0.11
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.806 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.809 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.831 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:60:dc 10.100.0.11'], port_security=['fa:16:3e:95:60:dc 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8734dffa-e3b3-424a-a8cd-d893c67bae4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4aa99606-7691-4fcb-846d-56459aaaa088', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b94109a356454dbda245fe5e57d0cd82', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ebc0842-f2b0-4995-8bc2-4b71e8009dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=440686f5-fec3-41db-bbb0-53e12589d6a4, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=33114641-5801-4450-8fcb-fdf1dc0cb013) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:02 compute-0 systemd-udevd[218284]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.833 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 33114641-5801-4450-8fcb-fdf1dc0cb013 in datapath 4aa99606-7691-4fcb-846d-56459aaaa088 bound to our chassis
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.835 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4aa99606-7691-4fcb-846d-56459aaaa088
Nov 22 07:47:02 compute-0 NetworkManager[55036]: <info>  [1763797622.8443] device (tap33114641-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:47:02 compute-0 NetworkManager[55036]: <info>  [1763797622.8451] device (tap33114641-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.847 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a344cc2b-42b7-40ec-954a-2916b7f368c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.848 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4aa99606-71 in ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.849 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4aa99606-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.849 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b33ddb98-beec-4b6c-ae0c-928477c6d3ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.850 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9c046e4e-3a26-4473-98e5-4a6524014617]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 systemd-machined[152872]: New machine qemu-17-instance-0000001f.
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.861 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[75e0c6be-577e-4119-9dd7-4833d630807e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.862 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 ovn_controller[94843]: 2025-11-22T07:47:02Z|00133|binding|INFO|Setting lport 33114641-5801-4450-8fcb-fdf1dc0cb013 ovn-installed in OVS
Nov 22 07:47:02 compute-0 ovn_controller[94843]: 2025-11-22T07:47:02Z|00134|binding|INFO|Setting lport 33114641-5801-4450-8fcb-fdf1dc0cb013 up in Southbound
Nov 22 07:47:02 compute-0 nova_compute[186544]: 2025-11-22 07:47:02.866 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:02 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-0000001f.
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.884 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4b6fb3-7589-4409-9c4b-979d5e92f328]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.909 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[68a3c238-0a72-4a85-b46a-6bf75078d939]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 systemd-udevd[218288]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:47:02 compute-0 NetworkManager[55036]: <info>  [1763797622.9159] manager: (tap4aa99606-70): new Veth device (/org/freedesktop/NetworkManager/Devices/67)
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.916 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a40a8fb9-3665-46a9-96bb-579a7c98678d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.948 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[bcdef1c2-7561-439f-9808-dcee5b963500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.953 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2709f8f1-109a-4ca7-ac87-dde471121e55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:02 compute-0 NetworkManager[55036]: <info>  [1763797622.9756] device (tap4aa99606-70): carrier: link connected
Nov 22 07:47:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:02.985 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[236d476a-121d-448f-b6e2-2d19c6f796b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.004 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[739aead2-913e-4c62-a722-7ac011b5e637]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4aa99606-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:b3:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436361, 'reachable_time': 40485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218319, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.022 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[00559562-0485-4ddd-a0fa-3089d0349ec8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe85:b3a9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436361, 'tstamp': 436361}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218320, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.040 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d4f373-ff89-4524-8552-54d016e4f5d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4aa99606-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:b3:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436361, 'reachable_time': 40485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218321, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.073 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[433f5224-437e-42f7-b7f4-58667a952cd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.129 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d5dca6-177f-4ddb-a69b-30f37c9ce1a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.131 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4aa99606-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.131 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.131 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4aa99606-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:03 compute-0 NetworkManager[55036]: <info>  [1763797623.1345] manager: (tap4aa99606-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 22 07:47:03 compute-0 kernel: tap4aa99606-70: entered promiscuous mode
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.133 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.139 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4aa99606-70, col_values=(('external_ids', {'iface-id': 'ef41332e-7ec0-4d28-b824-d5b12ab6995f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.138 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:03 compute-0 ovn_controller[94843]: 2025-11-22T07:47:03Z|00135|binding|INFO|Releasing lport ef41332e-7ec0-4d28-b824-d5b12ab6995f from this chassis (sb_readonly=0)
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.154 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.155 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a753f62b-21e2-44e7-99bc-bbb5150d5e72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.156 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-4aa99606-7691-4fcb-846d-56459aaaa088
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 4aa99606-7691-4fcb-846d-56459aaaa088
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:47:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:03.156 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'env', 'PROCESS_TAG=haproxy-4aa99606-7691-4fcb-846d-56459aaaa088', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4aa99606-7691-4fcb-846d-56459aaaa088.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.182 186548 DEBUG nova.compute.manager [req-ba8b841f-9845-489a-9d46-27b8cac2ac5e req-ea87859a-e0f5-4483-904c-20b40b43d473 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received event network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.182 186548 DEBUG oslo_concurrency.lockutils [req-ba8b841f-9845-489a-9d46-27b8cac2ac5e req-ea87859a-e0f5-4483-904c-20b40b43d473 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.183 186548 DEBUG oslo_concurrency.lockutils [req-ba8b841f-9845-489a-9d46-27b8cac2ac5e req-ea87859a-e0f5-4483-904c-20b40b43d473 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.183 186548 DEBUG oslo_concurrency.lockutils [req-ba8b841f-9845-489a-9d46-27b8cac2ac5e req-ea87859a-e0f5-4483-904c-20b40b43d473 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.183 186548 DEBUG nova.compute.manager [req-ba8b841f-9845-489a-9d46-27b8cac2ac5e req-ea87859a-e0f5-4483-904c-20b40b43d473 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Processing event network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.290 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.291 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797623.291207, 8734dffa-e3b3-424a-a8cd-d893c67bae4a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.291 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] VM Started (Lifecycle Event)
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.295 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.298 186548 INFO nova.virt.libvirt.driver [-] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Instance spawned successfully.
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.299 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.325 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.330 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.333 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.334 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.334 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.334 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.335 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.335 186548 DEBUG nova.virt.libvirt.driver [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.395 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.400 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797623.293503, 8734dffa-e3b3-424a-a8cd-d893c67bae4a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.400 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] VM Paused (Lifecycle Event)
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.433 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.436 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797623.2940435, 8734dffa-e3b3-424a-a8cd-d893c67bae4a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.436 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] VM Resumed (Lifecycle Event)
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.451 186548 INFO nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Took 5.66 seconds to spawn the instance on the hypervisor.
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.451 186548 DEBUG nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.457 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.459 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.492 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:47:03 compute-0 podman[218358]: 2025-11-22 07:47:03.52134732 +0000 UTC m=+0.054270589 container create 71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.542 186548 INFO nova.compute.manager [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Took 6.40 seconds to build instance.
Nov 22 07:47:03 compute-0 systemd[1]: Started libpod-conmon-71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2.scope.
Nov 22 07:47:03 compute-0 nova_compute[186544]: 2025-11-22 07:47:03.575 186548 DEBUG oslo_concurrency.lockutils [None req-3132cb68-1aea-48fc-bd05-cb572d7fe9ac b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:03 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:47:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3167ab65a67982f70328da85af586128d3f4b5b0af289e89767624be59254a84/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:47:03 compute-0 podman[218358]: 2025-11-22 07:47:03.494311278 +0000 UTC m=+0.027234577 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:47:03 compute-0 podman[218358]: 2025-11-22 07:47:03.599063011 +0000 UTC m=+0.131986280 container init 71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 07:47:03 compute-0 podman[218358]: 2025-11-22 07:47:03.608482911 +0000 UTC m=+0.141406180 container start 71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:47:03 compute-0 podman[218372]: 2025-11-22 07:47:03.612443118 +0000 UTC m=+0.055734845 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 07:47:03 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218375]: [NOTICE]   (218398) : New worker (218400) forked
Nov 22 07:47:03 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218375]: [NOTICE]   (218398) : Loading success.
Nov 22 07:47:04 compute-0 nova_compute[186544]: 2025-11-22 07:47:04.479 186548 DEBUG nova.network.neutron [req-d7ea55a6-4f85-4998-95be-608a33a45d72 req-925e089c-f8cf-4e9d-8a30-d1c30a5d444f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Updated VIF entry in instance network info cache for port 33114641-5801-4450-8fcb-fdf1dc0cb013. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:47:04 compute-0 nova_compute[186544]: 2025-11-22 07:47:04.480 186548 DEBUG nova.network.neutron [req-d7ea55a6-4f85-4998-95be-608a33a45d72 req-925e089c-f8cf-4e9d-8a30-d1c30a5d444f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Updating instance_info_cache with network_info: [{"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:04 compute-0 nova_compute[186544]: 2025-11-22 07:47:04.511 186548 DEBUG oslo_concurrency.lockutils [req-d7ea55a6-4f85-4998-95be-608a33a45d72 req-925e089c-f8cf-4e9d-8a30-d1c30a5d444f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-8734dffa-e3b3-424a-a8cd-d893c67bae4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:47:05 compute-0 nova_compute[186544]: 2025-11-22 07:47:05.185 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:05 compute-0 nova_compute[186544]: 2025-11-22 07:47:05.816 186548 DEBUG nova.compute.manager [req-91da5672-e5ac-4670-b03d-62735c4ae645 req-2249029d-32a6-48e5-9d0c-491e6aa5b42a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received event network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:05 compute-0 nova_compute[186544]: 2025-11-22 07:47:05.817 186548 DEBUG oslo_concurrency.lockutils [req-91da5672-e5ac-4670-b03d-62735c4ae645 req-2249029d-32a6-48e5-9d0c-491e6aa5b42a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:05 compute-0 nova_compute[186544]: 2025-11-22 07:47:05.817 186548 DEBUG oslo_concurrency.lockutils [req-91da5672-e5ac-4670-b03d-62735c4ae645 req-2249029d-32a6-48e5-9d0c-491e6aa5b42a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:05 compute-0 nova_compute[186544]: 2025-11-22 07:47:05.817 186548 DEBUG oslo_concurrency.lockutils [req-91da5672-e5ac-4670-b03d-62735c4ae645 req-2249029d-32a6-48e5-9d0c-491e6aa5b42a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:05 compute-0 nova_compute[186544]: 2025-11-22 07:47:05.817 186548 DEBUG nova.compute.manager [req-91da5672-e5ac-4670-b03d-62735c4ae645 req-2249029d-32a6-48e5-9d0c-491e6aa5b42a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] No waiting events found dispatching network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:05 compute-0 nova_compute[186544]: 2025-11-22 07:47:05.818 186548 WARNING nova.compute.manager [req-91da5672-e5ac-4670-b03d-62735c4ae645 req-2249029d-32a6-48e5-9d0c-491e6aa5b42a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received unexpected event network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 for instance with vm_state active and task_state None.
Nov 22 07:47:06 compute-0 podman[218409]: 2025-11-22 07:47:06.411837747 +0000 UTC m=+0.065273218 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 07:47:07 compute-0 nova_compute[186544]: 2025-11-22 07:47:07.098 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:09 compute-0 nova_compute[186544]: 2025-11-22 07:47:09.269 186548 DEBUG nova.compute.manager [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:09 compute-0 nova_compute[186544]: 2025-11-22 07:47:09.365 186548 INFO nova.compute.manager [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] instance snapshotting
Nov 22 07:47:09 compute-0 nova_compute[186544]: 2025-11-22 07:47:09.941 186548 INFO nova.virt.libvirt.driver [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Beginning live snapshot process
Nov 22 07:47:10 compute-0 nova_compute[186544]: 2025-11-22 07:47:10.186 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:11 compute-0 virtqemud[186092]: invalid argument: disk vda does not have an active block job
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.222 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.283 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.284 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.339 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a/disk --force-share --output=json -f qcow2" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.353 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:11 compute-0 podman[218432]: 2025-11-22 07:47:11.410610685 +0000 UTC m=+0.061708070 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.419 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.421 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpbofk2khy/858553f0977749ffbef34e20a65b13d6.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.457 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpbofk2khy/858553f0977749ffbef34e20a65b13d6.delta 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.458 186548 INFO nova.virt.libvirt.driver [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.509 186548 DEBUG nova.virt.libvirt.guest [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.512 186548 INFO nova.virt.libvirt.driver [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.558 186548 DEBUG nova.privsep.utils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.558 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpbofk2khy/858553f0977749ffbef34e20a65b13d6.delta /var/lib/nova/instances/snapshots/tmpbofk2khy/858553f0977749ffbef34e20a65b13d6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.751 186548 DEBUG oslo_concurrency.processutils [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpbofk2khy/858553f0977749ffbef34e20a65b13d6.delta /var/lib/nova/instances/snapshots/tmpbofk2khy/858553f0977749ffbef34e20a65b13d6" returned: 0 in 0.192s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:11 compute-0 nova_compute[186544]: 2025-11-22 07:47:11.752 186548 INFO nova.virt.libvirt.driver [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Snapshot extracted, beginning image upload
Nov 22 07:47:12 compute-0 nova_compute[186544]: 2025-11-22 07:47:12.101 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:12 compute-0 nova_compute[186544]: 2025-11-22 07:47:12.241 186548 WARNING nova.compute.manager [None req-29a0865e-4946-4841-9de0-2dac5f47673b b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Image not found during snapshot: nova.exception.ImageNotFound: Image 8e27b3f1-f462-4ce2-9296-724c7bed4269 could not be found.
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.934 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.935 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.935 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.936 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.936 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.945 186548 INFO nova.compute.manager [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Terminating instance
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.952 186548 DEBUG nova.compute.manager [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:47:13 compute-0 kernel: tap33114641-58 (unregistering): left promiscuous mode
Nov 22 07:47:13 compute-0 NetworkManager[55036]: <info>  [1763797633.9703] device (tap33114641-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:47:13 compute-0 ovn_controller[94843]: 2025-11-22T07:47:13Z|00136|binding|INFO|Releasing lport 33114641-5801-4450-8fcb-fdf1dc0cb013 from this chassis (sb_readonly=0)
Nov 22 07:47:13 compute-0 ovn_controller[94843]: 2025-11-22T07:47:13Z|00137|binding|INFO|Setting lport 33114641-5801-4450-8fcb-fdf1dc0cb013 down in Southbound
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.973 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:13 compute-0 ovn_controller[94843]: 2025-11-22T07:47:13Z|00138|binding|INFO|Removing iface tap33114641-58 ovn-installed in OVS
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.976 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:13 compute-0 nova_compute[186544]: 2025-11-22 07:47:13.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:14 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Nov 22 07:47:14 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001f.scope: Consumed 11.241s CPU time.
Nov 22 07:47:14 compute-0 systemd-machined[152872]: Machine qemu-17-instance-0000001f terminated.
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.074 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:60:dc 10.100.0.11'], port_security=['fa:16:3e:95:60:dc 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8734dffa-e3b3-424a-a8cd-d893c67bae4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4aa99606-7691-4fcb-846d-56459aaaa088', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b94109a356454dbda245fe5e57d0cd82', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ebc0842-f2b0-4995-8bc2-4b71e8009dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=440686f5-fec3-41db-bbb0-53e12589d6a4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=33114641-5801-4450-8fcb-fdf1dc0cb013) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.075 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 33114641-5801-4450-8fcb-fdf1dc0cb013 in datapath 4aa99606-7691-4fcb-846d-56459aaaa088 unbound from our chassis
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.077 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4aa99606-7691-4fcb-846d-56459aaaa088, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.078 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9aaa62a5-5502-4aa6-9941-5ee85b08b429]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.078 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 namespace which is not needed anymore
Nov 22 07:47:14 compute-0 kernel: tap33114641-58: entered promiscuous mode
Nov 22 07:47:14 compute-0 kernel: tap33114641-58 (unregistering): left promiscuous mode
Nov 22 07:47:14 compute-0 NetworkManager[55036]: <info>  [1763797634.1749] manager: (tap33114641-58): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.179 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:14 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218375]: [NOTICE]   (218398) : haproxy version is 2.8.14-c23fe91
Nov 22 07:47:14 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218375]: [NOTICE]   (218398) : path to executable is /usr/sbin/haproxy
Nov 22 07:47:14 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218375]: [WARNING]  (218398) : Exiting Master process...
Nov 22 07:47:14 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218375]: [WARNING]  (218398) : Exiting Master process...
Nov 22 07:47:14 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218375]: [ALERT]    (218398) : Current worker (218400) exited with code 143 (Terminated)
Nov 22 07:47:14 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218375]: [WARNING]  (218398) : All workers exited. Exiting... (0)
Nov 22 07:47:14 compute-0 systemd[1]: libpod-71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2.scope: Deactivated successfully.
Nov 22 07:47:14 compute-0 podman[218503]: 2025-11-22 07:47:14.208707602 +0000 UTC m=+0.048753454 container died 71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.228 186548 INFO nova.virt.libvirt.driver [-] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Instance destroyed successfully.
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.229 186548 DEBUG nova.objects.instance [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'resources' on Instance uuid 8734dffa-e3b3-424a-a8cd-d893c67bae4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-3167ab65a67982f70328da85af586128d3f4b5b0af289e89767624be59254a84-merged.mount: Deactivated successfully.
Nov 22 07:47:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2-userdata-shm.mount: Deactivated successfully.
Nov 22 07:47:14 compute-0 podman[218503]: 2025-11-22 07:47:14.237522458 +0000 UTC m=+0.077568310 container cleanup 71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:47:14 compute-0 systemd[1]: libpod-conmon-71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2.scope: Deactivated successfully.
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.253 186548 DEBUG nova.virt.libvirt.vif [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:46:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-656887238',display_name='tempest-ImagesOneServerNegativeTestJSON-server-656887238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-656887238',id=31,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:47:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-go5pt1p5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128522',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:47:12Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=8734dffa-e3b3-424a-a8cd-d893c67bae4a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.254 186548 DEBUG nova.network.os_vif_util [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "33114641-5801-4450-8fcb-fdf1dc0cb013", "address": "fa:16:3e:95:60:dc", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33114641-58", "ovs_interfaceid": "33114641-5801-4450-8fcb-fdf1dc0cb013", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.255 186548 DEBUG nova.network.os_vif_util [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:dc,bridge_name='br-int',has_traffic_filtering=True,id=33114641-5801-4450-8fcb-fdf1dc0cb013,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33114641-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.255 186548 DEBUG os_vif [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:dc,bridge_name='br-int',has_traffic_filtering=True,id=33114641-5801-4450-8fcb-fdf1dc0cb013,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33114641-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.258 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.258 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33114641-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.303 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.305 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.307 186548 INFO os_vif [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:60:dc,bridge_name='br-int',has_traffic_filtering=True,id=33114641-5801-4450-8fcb-fdf1dc0cb013,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33114641-58')
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.307 186548 INFO nova.virt.libvirt.driver [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Deleting instance files /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a_del
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.308 186548 INFO nova.virt.libvirt.driver [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Deletion of /var/lib/nova/instances/8734dffa-e3b3-424a-a8cd-d893c67bae4a_del complete
Nov 22 07:47:14 compute-0 podman[218546]: 2025-11-22 07:47:14.334773286 +0000 UTC m=+0.076546894 container remove 71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.339 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ab2862-22ce-4377-aeb7-ba8479407ce3]: (4, ('Sat Nov 22 07:47:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 (71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2)\n71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2\nSat Nov 22 07:47:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 (71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2)\n71fa4aad0458a75ef5c0dfde7e34a2b03d283cc93e2283b455c4aff9fdcf5eb2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.341 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b4967c81-4bea-489f-95df-93b73497299d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.341 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4aa99606-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:14 compute-0 kernel: tap4aa99606-70: left promiscuous mode
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.355 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.357 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c4275917-00cc-4369-b2f9-a6b8cdc47b8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.379 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[307991f8-b67d-4251-8ae5-798f6092b3dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.380 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dc34e0e7-0bd6-47ce-b3ea-c9ca04444a6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.395 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c016d5f3-be31-41f1-9ac7-57b87a735b59]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436354, 'reachable_time': 22301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218559, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.398 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:47:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:14.398 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a91d88-e034-4daa-96cc-97f4841d8d40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:14 compute-0 systemd[1]: run-netns-ovnmeta\x2d4aa99606\x2d7691\x2d4fcb\x2d846d\x2d56459aaaa088.mount: Deactivated successfully.
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.520 186548 INFO nova.compute.manager [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Took 0.57 seconds to destroy the instance on the hypervisor.
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.521 186548 DEBUG oslo.service.loopingcall [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.521 186548 DEBUG nova.compute.manager [-] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.521 186548 DEBUG nova.network.neutron [-] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.777 186548 DEBUG nova.compute.manager [req-5c536286-20e6-4793-bec7-01218c9df874 req-b1b4b1c9-99cd-4870-b7eb-52bfacf10afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received event network-vif-unplugged-33114641-5801-4450-8fcb-fdf1dc0cb013 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.777 186548 DEBUG oslo_concurrency.lockutils [req-5c536286-20e6-4793-bec7-01218c9df874 req-b1b4b1c9-99cd-4870-b7eb-52bfacf10afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.778 186548 DEBUG oslo_concurrency.lockutils [req-5c536286-20e6-4793-bec7-01218c9df874 req-b1b4b1c9-99cd-4870-b7eb-52bfacf10afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.778 186548 DEBUG oslo_concurrency.lockutils [req-5c536286-20e6-4793-bec7-01218c9df874 req-b1b4b1c9-99cd-4870-b7eb-52bfacf10afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.778 186548 DEBUG nova.compute.manager [req-5c536286-20e6-4793-bec7-01218c9df874 req-b1b4b1c9-99cd-4870-b7eb-52bfacf10afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] No waiting events found dispatching network-vif-unplugged-33114641-5801-4450-8fcb-fdf1dc0cb013 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:14 compute-0 nova_compute[186544]: 2025-11-22 07:47:14.778 186548 DEBUG nova.compute.manager [req-5c536286-20e6-4793-bec7-01218c9df874 req-b1b4b1c9-99cd-4870-b7eb-52bfacf10afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received event network-vif-unplugged-33114641-5801-4450-8fcb-fdf1dc0cb013 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:47:15 compute-0 nova_compute[186544]: 2025-11-22 07:47:15.187 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:15 compute-0 podman[218560]: 2025-11-22 07:47:15.409217454 +0000 UTC m=+0.056014202 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Nov 22 07:47:15 compute-0 nova_compute[186544]: 2025-11-22 07:47:15.780 186548 DEBUG nova.network.neutron [-] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:15 compute-0 nova_compute[186544]: 2025-11-22 07:47:15.817 186548 INFO nova.compute.manager [-] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Took 1.30 seconds to deallocate network for instance.
Nov 22 07:47:15 compute-0 nova_compute[186544]: 2025-11-22 07:47:15.898 186548 DEBUG nova.compute.manager [req-67b11493-7766-439c-9b1c-1e4bbf6e100b req-2263e095-35bc-4678-b4cf-0ec7e24bb929 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received event network-vif-deleted-33114641-5801-4450-8fcb-fdf1dc0cb013 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:15 compute-0 nova_compute[186544]: 2025-11-22 07:47:15.950 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:15 compute-0 nova_compute[186544]: 2025-11-22 07:47:15.950 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.037 186548 DEBUG nova.compute.provider_tree [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.058 186548 DEBUG nova.scheduler.client.report [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.683 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.940 186548 DEBUG nova.compute.manager [req-828a71e5-53e2-4980-af30-fd2148c86c7f req-19ed8970-18f0-4f5f-951a-a3d9fff883b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received event network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.941 186548 DEBUG oslo_concurrency.lockutils [req-828a71e5-53e2-4980-af30-fd2148c86c7f req-19ed8970-18f0-4f5f-951a-a3d9fff883b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.941 186548 DEBUG oslo_concurrency.lockutils [req-828a71e5-53e2-4980-af30-fd2148c86c7f req-19ed8970-18f0-4f5f-951a-a3d9fff883b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.941 186548 DEBUG oslo_concurrency.lockutils [req-828a71e5-53e2-4980-af30-fd2148c86c7f req-19ed8970-18f0-4f5f-951a-a3d9fff883b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.941 186548 DEBUG nova.compute.manager [req-828a71e5-53e2-4980-af30-fd2148c86c7f req-19ed8970-18f0-4f5f-951a-a3d9fff883b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] No waiting events found dispatching network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:16 compute-0 nova_compute[186544]: 2025-11-22 07:47:16.942 186548 WARNING nova.compute.manager [req-828a71e5-53e2-4980-af30-fd2148c86c7f req-19ed8970-18f0-4f5f-951a-a3d9fff883b4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Received unexpected event network-vif-plugged-33114641-5801-4450-8fcb-fdf1dc0cb013 for instance with vm_state deleted and task_state None.
Nov 22 07:47:18 compute-0 nova_compute[186544]: 2025-11-22 07:47:18.060 186548 INFO nova.scheduler.client.report [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Deleted allocations for instance 8734dffa-e3b3-424a-a8cd-d893c67bae4a
Nov 22 07:47:18 compute-0 nova_compute[186544]: 2025-11-22 07:47:18.163 186548 DEBUG oslo_concurrency.lockutils [None req-00714d57-8cd5-48a2-9b66-aa355f1c8b05 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "8734dffa-e3b3-424a-a8cd-d893c67bae4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:19 compute-0 nova_compute[186544]: 2025-11-22 07:47:19.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:20 compute-0 nova_compute[186544]: 2025-11-22 07:47:20.188 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:22 compute-0 nova_compute[186544]: 2025-11-22 07:47:22.939 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:22 compute-0 nova_compute[186544]: 2025-11-22 07:47:22.939 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:22 compute-0 nova_compute[186544]: 2025-11-22 07:47:22.966 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.069 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.069 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.075 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.075 186548 INFO nova.compute.claims [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.236 186548 DEBUG nova.compute.provider_tree [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.259 186548 DEBUG nova.scheduler.client.report [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.305 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.306 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.398 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.398 186548 DEBUG nova.network.neutron [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.427 186548 INFO nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.461 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.631 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.632 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.633 186548 INFO nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Creating image(s)
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.633 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "/var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.634 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "/var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.634 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "/var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.648 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.715 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.716 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.716 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.727 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.783 186548 DEBUG nova.policy [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b47fa480dd1c4c9f81da16b464195f2b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b94109a356454dbda245fe5e57d0cd82', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.795 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.795 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.837 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.838 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.838 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.868 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "d44f6523-2367-449e-8244-593a14760876" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.869 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "d44f6523-2367-449e-8244-593a14760876" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.895 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.896 186548 DEBUG nova.virt.disk.api [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Checking if we can resize image /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.896 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.912 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.949 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.950 186548 DEBUG nova.virt.disk.api [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Cannot resize image /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:47:23 compute-0 nova_compute[186544]: 2025-11-22 07:47:23.951 186548 DEBUG nova.objects.instance [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'migration_context' on Instance uuid 91985628-1682-4fc4-b267-d329bef908d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.000 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.000 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Ensure instance console log exists: /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.001 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.001 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.001 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.035 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.036 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.043 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.043 186548 INFO nova.compute.claims [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.235 186548 DEBUG nova.compute.provider_tree [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.258 186548 DEBUG nova.scheduler.client.report [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.285 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.304 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "c837965b-440e-43df-b19e-691163d254d5" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.305 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "c837965b-440e-43df-b19e-691163d254d5" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.307 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.326 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] No node specified, defaulting to compute-0.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.360 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "c837965b-440e-43df-b19e-691163d254d5" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.361 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.450 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.450 186548 DEBUG nova.network.neutron [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.475 186548 INFO nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.488 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.689 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.691 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.691 186548 INFO nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Creating image(s)
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.692 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "/var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.692 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.693 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "/var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.705 186548 DEBUG nova.network.neutron [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Successfully created port: e62cb303-c302-4453-bc2e-549678b72530 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.709 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.768 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.769 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.770 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.781 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.838 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.839 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.855 186548 DEBUG nova.network.neutron [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.856 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.872 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.873 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.873 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.925 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.926 186548 DEBUG nova.virt.disk.api [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Checking if we can resize image /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.926 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.980 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.981 186548 DEBUG nova.virt.disk.api [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Cannot resize image /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.981 186548 DEBUG nova.objects.instance [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'migration_context' on Instance uuid d44f6523-2367-449e-8244-593a14760876 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.994 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.994 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Ensure instance console log exists: /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.995 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.995 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.996 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:24 compute-0 nova_compute[186544]: 2025-11-22 07:47:24.997 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.002 186548 WARNING nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.007 186548 DEBUG nova.virt.libvirt.host [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.008 186548 DEBUG nova.virt.libvirt.host [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.010 186548 DEBUG nova.virt.libvirt.host [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.011 186548 DEBUG nova.virt.libvirt.host [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.012 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.012 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.013 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.013 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.014 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.014 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.014 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.014 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.015 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.015 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.015 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.015 186548 DEBUG nova.virt.hardware [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.019 186548 DEBUG nova.objects.instance [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'pci_devices' on Instance uuid d44f6523-2367-449e-8244-593a14760876 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.034 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <uuid>d44f6523-2367-449e-8244-593a14760876</uuid>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <name>instance-00000025</name>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersOnMultiNodesTest-server-1882522842-1</nova:name>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:47:25</nova:creationTime>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:47:25 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:47:25 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:47:25 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:47:25 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:47:25 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:47:25 compute-0 nova_compute[186544]:         <nova:user uuid="f3272c6a12f44ac18db2715976e29248">tempest-ServersOnMultiNodesTest-214232393-project-member</nova:user>
Nov 22 07:47:25 compute-0 nova_compute[186544]:         <nova:project uuid="b764107a4dca4a799bc3edefe458310b">tempest-ServersOnMultiNodesTest-214232393</nova:project>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <system>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <entry name="serial">d44f6523-2367-449e-8244-593a14760876</entry>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <entry name="uuid">d44f6523-2367-449e-8244-593a14760876</entry>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     </system>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <os>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   </os>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <features>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   </features>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk.config"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/console.log" append="off"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <video>
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     </video>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:47:25 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:47:25 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:47:25 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:47:25 compute-0 nova_compute[186544]: </domain>
Nov 22 07:47:25 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.190 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.279 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.280 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:47:25 compute-0 nova_compute[186544]: 2025-11-22 07:47:25.280 186548 INFO nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Using config drive
Nov 22 07:47:26 compute-0 nova_compute[186544]: 2025-11-22 07:47:26.436 186548 INFO nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Creating config drive at /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk.config
Nov 22 07:47:26 compute-0 nova_compute[186544]: 2025-11-22 07:47:26.441 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpumbmfdjx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:26 compute-0 nova_compute[186544]: 2025-11-22 07:47:26.577 186548 DEBUG oslo_concurrency.processutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpumbmfdjx" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:26 compute-0 systemd-machined[152872]: New machine qemu-18-instance-00000025.
Nov 22 07:47:26 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000025.
Nov 22 07:47:26 compute-0 podman[218624]: 2025-11-22 07:47:26.742344706 +0000 UTC m=+0.080439089 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:47:26 compute-0 podman[218625]: 2025-11-22 07:47:26.814548902 +0000 UTC m=+0.149240551 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.118 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797647.1184251, d44f6523-2367-449e-8244-593a14760876 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.119 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] VM Resumed (Lifecycle Event)
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.123 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.123 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.126 186548 INFO nova.virt.libvirt.driver [-] [instance: d44f6523-2367-449e-8244-593a14760876] Instance spawned successfully.
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.127 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.140 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.146 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.149 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.149 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.150 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.150 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.151 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.151 186548 DEBUG nova.virt.libvirt.driver [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.172 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.173 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797647.1217449, d44f6523-2367-449e-8244-593a14760876 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.173 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] VM Started (Lifecycle Event)
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.198 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.201 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.229 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.233 186548 INFO nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Took 2.54 seconds to spawn the instance on the hypervisor.
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.233 186548 DEBUG nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.352 186548 INFO nova.compute.manager [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Took 3.35 seconds to build instance.
Nov 22 07:47:27 compute-0 nova_compute[186544]: 2025-11-22 07:47:27.392 186548 DEBUG oslo_concurrency.lockutils [None req-8f4a66f1-faa5-447f-a7ff-206be0238cd7 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "d44f6523-2367-449e-8244-593a14760876" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:29 compute-0 nova_compute[186544]: 2025-11-22 07:47:29.228 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797634.2267663, 8734dffa-e3b3-424a-a8cd-d893c67bae4a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:29 compute-0 nova_compute[186544]: 2025-11-22 07:47:29.228 186548 INFO nova.compute.manager [-] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] VM Stopped (Lifecycle Event)
Nov 22 07:47:29 compute-0 nova_compute[186544]: 2025-11-22 07:47:29.246 186548 DEBUG nova.compute.manager [None req-b05deeca-ad4c-47b5-97f0-73750578a39e - - - - - -] [instance: 8734dffa-e3b3-424a-a8cd-d893c67bae4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:29 compute-0 nova_compute[186544]: 2025-11-22 07:47:29.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:30 compute-0 nova_compute[186544]: 2025-11-22 07:47:30.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:30 compute-0 podman[218689]: 2025-11-22 07:47:30.421995321 +0000 UTC m=+0.063174617 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:47:31 compute-0 nova_compute[186544]: 2025-11-22 07:47:31.556 186548 DEBUG nova.network.neutron [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Successfully updated port: e62cb303-c302-4453-bc2e-549678b72530 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:47:31 compute-0 nova_compute[186544]: 2025-11-22 07:47:31.574 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:47:31 compute-0 nova_compute[186544]: 2025-11-22 07:47:31.574 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquired lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:47:31 compute-0 nova_compute[186544]: 2025-11-22 07:47:31.574 186548 DEBUG nova.network.neutron [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.185 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.260 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.324 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.325 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.381 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.388 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.443 186548 DEBUG nova.network.neutron [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.448 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.448 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.505 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.521 186548 DEBUG nova.compute.manager [req-b63787c8-dd81-4437-8225-e2996bb67cca req-aeee0d0f-b510-4643-87eb-0d57765e39f8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-changed-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.522 186548 DEBUG nova.compute.manager [req-b63787c8-dd81-4437-8225-e2996bb67cca req-aeee0d0f-b510-4643-87eb-0d57765e39f8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Refreshing instance network info cache due to event network-changed-e62cb303-c302-4453-bc2e-549678b72530. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.522 186548 DEBUG oslo_concurrency.lockutils [req-b63787c8-dd81-4437-8225-e2996bb67cca req-aeee0d0f-b510-4643-87eb-0d57765e39f8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.669 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "d44f6523-2367-449e-8244-593a14760876" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.669 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "d44f6523-2367-449e-8244-593a14760876" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.670 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "d44f6523-2367-449e-8244-593a14760876-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.670 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "d44f6523-2367-449e-8244-593a14760876-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.670 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "d44f6523-2367-449e-8244-593a14760876-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.677 186548 INFO nova.compute.manager [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Terminating instance
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.683 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "refresh_cache-d44f6523-2367-449e-8244-593a14760876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.683 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquired lock "refresh_cache-d44f6523-2367-449e-8244-593a14760876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.683 186548 DEBUG nova.network.neutron [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.715 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.716 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5402MB free_disk=73.39511108398438GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.716 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.717 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.994 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.994 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 91985628-1682-4fc4-b267-d329bef908d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.995 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance d44f6523-2367-449e-8244-593a14760876 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.995 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:47:32 compute-0 nova_compute[186544]: 2025-11-22 07:47:32.995 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.091 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.107 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.141 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.141 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.249 186548 DEBUG nova.network.neutron [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.896 186548 DEBUG nova.network.neutron [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.901 186548 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Creating tmpfile /var/lib/nova/instances/tmp1r930f3w to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.902 186548 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1r930f3w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.906 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Releasing lock "refresh_cache-d44f6523-2367-449e-8244-593a14760876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.906 186548 DEBUG nova.compute.manager [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.925 186548 DEBUG nova.network.neutron [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Updating instance_info_cache with network_info: [{"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.946 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Releasing lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.946 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Instance network_info: |[{"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.947 186548 DEBUG oslo_concurrency.lockutils [req-b63787c8-dd81-4437-8225-e2996bb67cca req-aeee0d0f-b510-4643-87eb-0d57765e39f8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.947 186548 DEBUG nova.network.neutron [req-b63787c8-dd81-4437-8225-e2996bb67cca req-aeee0d0f-b510-4643-87eb-0d57765e39f8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Refreshing network info cache for port e62cb303-c302-4453-bc2e-549678b72530 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.949 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Start _get_guest_xml network_info=[{"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:47:33 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000025.scope: Deactivated successfully.
Nov 22 07:47:33 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000025.scope: Consumed 7.293s CPU time.
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.953 186548 WARNING nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:47:33 compute-0 systemd-machined[152872]: Machine qemu-18-instance-00000025 terminated.
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.956 186548 DEBUG nova.virt.libvirt.host [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.957 186548 DEBUG nova.virt.libvirt.host [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.960 186548 DEBUG nova.virt.libvirt.host [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.960 186548 DEBUG nova.virt.libvirt.host [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.961 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.961 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.961 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.962 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.962 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.962 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.962 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.963 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.963 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.963 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.963 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.964 186548 DEBUG nova.virt.hardware [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.968 186548 DEBUG nova.virt.libvirt.vif [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:47:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-221618664',display_name='tempest-ImagesOneServerNegativeTestJSON-server-221618664',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-221618664',id=36,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-g0laf6zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128522',owner_user_
name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:47:23Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=91985628-1682-4fc4-b267-d329bef908d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.968 186548 DEBUG nova.network.os_vif_util [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.969 186548 DEBUG nova.network.os_vif_util [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:37:a2,bridge_name='br-int',has_traffic_filtering=True,id=e62cb303-c302-4453-bc2e-549678b72530,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape62cb303-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.970 186548 DEBUG nova.objects.instance [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'pci_devices' on Instance uuid 91985628-1682-4fc4-b267-d329bef908d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.979 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <uuid>91985628-1682-4fc4-b267-d329bef908d3</uuid>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <name>instance-00000024</name>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-221618664</nova:name>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:47:33</nova:creationTime>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:47:33 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:47:33 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:47:33 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:47:33 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:47:33 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:47:33 compute-0 nova_compute[186544]:         <nova:user uuid="b47fa480dd1c4c9f81da16b464195f2b">tempest-ImagesOneServerNegativeTestJSON-328128522-project-member</nova:user>
Nov 22 07:47:33 compute-0 nova_compute[186544]:         <nova:project uuid="b94109a356454dbda245fe5e57d0cd82">tempest-ImagesOneServerNegativeTestJSON-328128522</nova:project>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:47:33 compute-0 nova_compute[186544]:         <nova:port uuid="e62cb303-c302-4453-bc2e-549678b72530">
Nov 22 07:47:33 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <system>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <entry name="serial">91985628-1682-4fc4-b267-d329bef908d3</entry>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <entry name="uuid">91985628-1682-4fc4-b267-d329bef908d3</entry>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </system>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <os>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   </os>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <features>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   </features>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk.config"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:2d:37:a2"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <target dev="tape62cb303-c3"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/console.log" append="off"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <video>
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </video>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:47:33 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:47:33 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:47:33 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:47:33 compute-0 nova_compute[186544]: </domain>
Nov 22 07:47:33 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.979 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Preparing to wait for external event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.980 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.980 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.980 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.981 186548 DEBUG nova.virt.libvirt.vif [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:47:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-221618664',display_name='tempest-ImagesOneServerNegativeTestJSON-server-221618664',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-221618664',id=36,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-g0laf6zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128522',o
wner_user_name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:47:23Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=91985628-1682-4fc4-b267-d329bef908d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.981 186548 DEBUG nova.network.os_vif_util [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.982 186548 DEBUG nova.network.os_vif_util [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:37:a2,bridge_name='br-int',has_traffic_filtering=True,id=e62cb303-c302-4453-bc2e-549678b72530,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape62cb303-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.982 186548 DEBUG os_vif [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:37:a2,bridge_name='br-int',has_traffic_filtering=True,id=e62cb303-c302-4453-bc2e-549678b72530,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape62cb303-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.983 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.983 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.983 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.987 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.987 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape62cb303-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.987 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape62cb303-c3, col_values=(('external_ids', {'iface-id': 'e62cb303-c302-4453-bc2e-549678b72530', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:37:a2', 'vm-uuid': '91985628-1682-4fc4-b267-d329bef908d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.988 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:33 compute-0 NetworkManager[55036]: <info>  [1763797653.9903] manager: (tape62cb303-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.991 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:33 compute-0 nova_compute[186544]: 2025-11-22 07:47:33.996 186548 INFO os_vif [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:37:a2,bridge_name='br-int',has_traffic_filtering=True,id=e62cb303-c302-4453-bc2e-549678b72530,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape62cb303-c3')
Nov 22 07:47:34 compute-0 podman[218727]: 2025-11-22 07:47:34.003222148 +0000 UTC m=+0.052210769 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.043 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.043 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.043 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] No VIF found with MAC fa:16:3e:2d:37:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.044 186548 INFO nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Using config drive
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.151 186548 INFO nova.virt.libvirt.driver [-] [instance: d44f6523-2367-449e-8244-593a14760876] Instance destroyed successfully.
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.152 186548 DEBUG nova.objects.instance [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lazy-loading 'resources' on Instance uuid d44f6523-2367-449e-8244-593a14760876 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.267 186548 INFO nova.virt.libvirt.driver [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Deleting instance files /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876_del
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.268 186548 INFO nova.virt.libvirt.driver [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Deletion of /var/lib/nova/instances/d44f6523-2367-449e-8244-593a14760876_del complete
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.607 186548 INFO nova.compute.manager [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] [instance: d44f6523-2367-449e-8244-593a14760876] Took 0.70 seconds to destroy the instance on the hypervisor.
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.607 186548 DEBUG oslo.service.loopingcall [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.608 186548 DEBUG nova.compute.manager [-] [instance: d44f6523-2367-449e-8244-593a14760876] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.608 186548 DEBUG nova.network.neutron [-] [instance: d44f6523-2367-449e-8244-593a14760876] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.658 186548 INFO nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Creating config drive at /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk.config
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.663 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz0051u0e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.785 186548 DEBUG oslo_concurrency.processutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz0051u0e" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:34 compute-0 kernel: tape62cb303-c3: entered promiscuous mode
Nov 22 07:47:34 compute-0 NetworkManager[55036]: <info>  [1763797654.8346] manager: (tape62cb303-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Nov 22 07:47:34 compute-0 systemd-udevd[218736]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.835 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:34 compute-0 ovn_controller[94843]: 2025-11-22T07:47:34Z|00139|binding|INFO|Claiming lport e62cb303-c302-4453-bc2e-549678b72530 for this chassis.
Nov 22 07:47:34 compute-0 ovn_controller[94843]: 2025-11-22T07:47:34Z|00140|binding|INFO|e62cb303-c302-4453-bc2e-549678b72530: Claiming fa:16:3e:2d:37:a2 10.100.0.12
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.843 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:37:a2 10.100.0.12'], port_security=['fa:16:3e:2d:37:a2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91985628-1682-4fc4-b267-d329bef908d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4aa99606-7691-4fcb-846d-56459aaaa088', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b94109a356454dbda245fe5e57d0cd82', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ebc0842-f2b0-4995-8bc2-4b71e8009dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=440686f5-fec3-41db-bbb0-53e12589d6a4, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e62cb303-c302-4453-bc2e-549678b72530) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.845 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e62cb303-c302-4453-bc2e-549678b72530 in datapath 4aa99606-7691-4fcb-846d-56459aaaa088 bound to our chassis
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.846 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4aa99606-7691-4fcb-846d-56459aaaa088
Nov 22 07:47:34 compute-0 ovn_controller[94843]: 2025-11-22T07:47:34Z|00141|binding|INFO|Setting lport e62cb303-c302-4453-bc2e-549678b72530 ovn-installed in OVS
Nov 22 07:47:34 compute-0 ovn_controller[94843]: 2025-11-22T07:47:34Z|00142|binding|INFO|Setting lport e62cb303-c302-4453-bc2e-549678b72530 up in Southbound
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.848 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:34 compute-0 nova_compute[186544]: 2025-11-22 07:47:34.851 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:34 compute-0 NetworkManager[55036]: <info>  [1763797654.8598] device (tape62cb303-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:47:34 compute-0 NetworkManager[55036]: <info>  [1763797654.8609] device (tape62cb303-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.861 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[270449fd-d0c0-4248-95b6-21bea5394169]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.862 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4aa99606-71 in ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.863 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4aa99606-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.864 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2538f6c4-9832-4c78-b648-84963ba6043a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.864 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f66ed76c-0bfe-408e-8003-405f08fbd1d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 systemd-machined[152872]: New machine qemu-19-instance-00000024.
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.878 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[bd74036b-ac05-4713-9c18-876d90370c86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.893 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[50292f5e-e502-4cde-ae19-9c611462f81c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000024.
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.926 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[259e7827-1a44-4206-b749-b06d706e7850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 NetworkManager[55036]: <info>  [1763797654.9330] manager: (tap4aa99606-70): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.933 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1780b798-5b45-48da-8a7a-4c649924faed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.961 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[df14c56e-ee5f-4bb1-b72e-5da20a4fd5fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.964 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[642deef5-56cf-47f5-a25b-7811c57507dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:34 compute-0 NetworkManager[55036]: <info>  [1763797654.9894] device (tap4aa99606-70): carrier: link connected
Nov 22 07:47:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:34.995 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3f207f-1f8d-4b39-8b21-e310d32b113c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.014 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1c934c51-1621-4167-81f9-767b74f40f0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4aa99606-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:b3:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439562, 'reachable_time': 40202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218807, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.032 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5b902e6c-d7d0-45b4-9ce6-c1490e320f84]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe85:b3a9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439562, 'tstamp': 439562}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218808, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.050 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e213e4-badd-4fb2-a191-1877ae923de1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4aa99606-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:b3:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439562, 'reachable_time': 40202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218809, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.082 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1efc59-4417-4a3b-b265-4e3222ed77b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.142 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd03606-2dc8-4b81-b0e1-a09e9204148e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.144 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4aa99606-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.144 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.144 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4aa99606-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:35 compute-0 NetworkManager[55036]: <info>  [1763797655.1467] manager: (tap4aa99606-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Nov 22 07:47:35 compute-0 kernel: tap4aa99606-70: entered promiscuous mode
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.147 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.149 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4aa99606-70, col_values=(('external_ids', {'iface-id': 'ef41332e-7ec0-4d28-b824-d5b12ab6995f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.150 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.152 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:35 compute-0 ovn_controller[94843]: 2025-11-22T07:47:35Z|00143|binding|INFO|Releasing lport ef41332e-7ec0-4d28-b824-d5b12ab6995f from this chassis (sb_readonly=0)
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.153 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.154 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[35b20772-d9a2-49f0-af8c-0264f6d66517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.155 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-4aa99606-7691-4fcb-846d-56459aaaa088
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/4aa99606-7691-4fcb-846d-56459aaaa088.pid.haproxy
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 4aa99606-7691-4fcb-846d-56459aaaa088
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:47:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:35.155 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'env', 'PROCESS_TAG=haproxy-4aa99606-7691-4fcb-846d-56459aaaa088', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4aa99606-7691-4fcb-846d-56459aaaa088.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.158 186548 DEBUG nova.network.neutron [-] [instance: d44f6523-2367-449e-8244-593a14760876] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.193 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.196 186548 DEBUG nova.network.neutron [-] [instance: d44f6523-2367-449e-8244-593a14760876] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.261 186548 INFO nova.compute.manager [-] [instance: d44f6523-2367-449e-8244-593a14760876] Took 0.65 seconds to deallocate network for instance.
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.334 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797655.3342464, 91985628-1682-4fc4-b267-d329bef908d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.335 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] VM Started (Lifecycle Event)
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.372 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.372 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.374 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.378 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797655.3349228, 91985628-1682-4fc4-b267-d329bef908d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.378 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] VM Paused (Lifecycle Event)
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.408 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.413 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:47:35 compute-0 podman[218848]: 2025-11-22 07:47:35.517084376 +0000 UTC m=+0.057467817 container create f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:47:35 compute-0 systemd[1]: Started libpod-conmon-f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb.scope.
Nov 22 07:47:35 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:47:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20f23ee50bcacbee689e8c3102276d3cb606252041007f501e6b5a2be83049d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:47:35 compute-0 podman[218848]: 2025-11-22 07:47:35.48621674 +0000 UTC m=+0.026600221 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:47:35 compute-0 podman[218848]: 2025-11-22 07:47:35.590519832 +0000 UTC m=+0.130903303 container init f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:47:35 compute-0 podman[218848]: 2025-11-22 07:47:35.595481994 +0000 UTC m=+0.135865425 container start f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 07:47:35 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218863]: [NOTICE]   (218867) : New worker (218869) forked
Nov 22 07:47:35 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218863]: [NOTICE]   (218867) : Loading success.
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.660 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.751 186548 DEBUG nova.compute.manager [req-92076438-aec8-4187-a8db-f13a41c1113a req-3deabf67-7621-44eb-ad63-9b9bf2b69a26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.752 186548 DEBUG oslo_concurrency.lockutils [req-92076438-aec8-4187-a8db-f13a41c1113a req-3deabf67-7621-44eb-ad63-9b9bf2b69a26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.752 186548 DEBUG oslo_concurrency.lockutils [req-92076438-aec8-4187-a8db-f13a41c1113a req-3deabf67-7621-44eb-ad63-9b9bf2b69a26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.753 186548 DEBUG oslo_concurrency.lockutils [req-92076438-aec8-4187-a8db-f13a41c1113a req-3deabf67-7621-44eb-ad63-9b9bf2b69a26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.753 186548 DEBUG nova.compute.manager [req-92076438-aec8-4187-a8db-f13a41c1113a req-3deabf67-7621-44eb-ad63-9b9bf2b69a26 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Processing event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.755 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.759 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797655.7597365, 91985628-1682-4fc4-b267-d329bef908d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.760 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] VM Resumed (Lifecycle Event)
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.761 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.762 186548 DEBUG nova.compute.provider_tree [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.767 186548 INFO nova.virt.libvirt.driver [-] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Instance spawned successfully.
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.767 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.787 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.789 186548 DEBUG nova.scheduler.client.report [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.798 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.801 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.801 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.802 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.802 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.803 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.803 186548 DEBUG nova.virt.libvirt.driver [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.831 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.864 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.940 186548 INFO nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Took 12.31 seconds to spawn the instance on the hypervisor.
Nov 22 07:47:35 compute-0 nova_compute[186544]: 2025-11-22 07:47:35.940 186548 DEBUG nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.045 186548 INFO nova.scheduler.client.report [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Deleted allocations for instance d44f6523-2367-449e-8244-593a14760876
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.079 186548 INFO nova.compute.manager [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Took 13.04 seconds to build instance.
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.119 186548 DEBUG oslo_concurrency.lockutils [None req-e2101c0c-146d-48aa-b179-10db01a8678a b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.141 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.142 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.142 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.200 186548 DEBUG oslo_concurrency.lockutils [None req-09132683-17be-4d9a-b4cd-d7f98b2d0b46 f3272c6a12f44ac18db2715976e29248 b764107a4dca4a799bc3edefe458310b - - default default] Lock "d44f6523-2367-449e-8244-593a14760876" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.530s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.374 186548 DEBUG nova.network.neutron [req-b63787c8-dd81-4437-8225-e2996bb67cca req-aeee0d0f-b510-4643-87eb-0d57765e39f8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Updated VIF entry in instance network info cache for port e62cb303-c302-4453-bc2e-549678b72530. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.375 186548 DEBUG nova.network.neutron [req-b63787c8-dd81-4437-8225-e2996bb67cca req-aeee0d0f-b510-4643-87eb-0d57765e39f8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Updating instance_info_cache with network_info: [{"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.383 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.393 186548 DEBUG oslo_concurrency.lockutils [req-b63787c8-dd81-4437-8225-e2996bb67cca req-aeee0d0f-b510-4643-87eb-0d57765e39f8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.394 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.394 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.394 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 91985628-1682-4fc4-b267-d329bef908d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.582 186548 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1r930f3w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c07170-ca6f-422e-8f1c-9dfd5cc943a4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.607 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.607 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquired lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:47:36 compute-0 nova_compute[186544]: 2025-11-22 07:47:36.608 186548 DEBUG nova.network.neutron [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:47:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:37.316 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:37.316 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:37.318 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:37 compute-0 podman[218878]: 2025-11-22 07:47:37.415993633 +0000 UTC m=+0.060400758 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 22 07:47:37 compute-0 nova_compute[186544]: 2025-11-22 07:47:37.951 186548 DEBUG nova.compute.manager [req-ea9114b7-df3e-49f9-97c4-b739cde24855 req-d2bf82ec-8af0-4dac-8ebc-c17d59c17816 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:37 compute-0 nova_compute[186544]: 2025-11-22 07:47:37.952 186548 DEBUG oslo_concurrency.lockutils [req-ea9114b7-df3e-49f9-97c4-b739cde24855 req-d2bf82ec-8af0-4dac-8ebc-c17d59c17816 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:37 compute-0 nova_compute[186544]: 2025-11-22 07:47:37.952 186548 DEBUG oslo_concurrency.lockutils [req-ea9114b7-df3e-49f9-97c4-b739cde24855 req-d2bf82ec-8af0-4dac-8ebc-c17d59c17816 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:37 compute-0 nova_compute[186544]: 2025-11-22 07:47:37.952 186548 DEBUG oslo_concurrency.lockutils [req-ea9114b7-df3e-49f9-97c4-b739cde24855 req-d2bf82ec-8af0-4dac-8ebc-c17d59c17816 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:37 compute-0 nova_compute[186544]: 2025-11-22 07:47:37.953 186548 DEBUG nova.compute.manager [req-ea9114b7-df3e-49f9-97c4-b739cde24855 req-d2bf82ec-8af0-4dac-8ebc-c17d59c17816 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] No waiting events found dispatching network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:37 compute-0 nova_compute[186544]: 2025-11-22 07:47:37.953 186548 WARNING nova.compute.manager [req-ea9114b7-df3e-49f9-97c4-b739cde24855 req-d2bf82ec-8af0-4dac-8ebc-c17d59c17816 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received unexpected event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 for instance with vm_state active and task_state None.
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.743 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Updating instance_info_cache with network_info: [{"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.745 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.746 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.746 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.746 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.747 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.754 186548 INFO nova.compute.manager [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Terminating instance
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.760 186548 DEBUG nova.compute.manager [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.766 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-91985628-1682-4fc4-b267-d329bef908d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.767 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.767 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.768 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:38 compute-0 kernel: tape62cb303-c3 (unregistering): left promiscuous mode
Nov 22 07:47:38 compute-0 NetworkManager[55036]: <info>  [1763797658.7768] device (tape62cb303-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:47:38 compute-0 ovn_controller[94843]: 2025-11-22T07:47:38Z|00144|binding|INFO|Releasing lport e62cb303-c302-4453-bc2e-549678b72530 from this chassis (sb_readonly=0)
Nov 22 07:47:38 compute-0 ovn_controller[94843]: 2025-11-22T07:47:38Z|00145|binding|INFO|Setting lport e62cb303-c302-4453-bc2e-549678b72530 down in Southbound
Nov 22 07:47:38 compute-0 ovn_controller[94843]: 2025-11-22T07:47:38Z|00146|binding|INFO|Removing iface tape62cb303-c3 ovn-installed in OVS
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.785 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.788 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:38.794 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:37:a2 10.100.0.12'], port_security=['fa:16:3e:2d:37:a2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91985628-1682-4fc4-b267-d329bef908d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4aa99606-7691-4fcb-846d-56459aaaa088', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b94109a356454dbda245fe5e57d0cd82', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ebc0842-f2b0-4995-8bc2-4b71e8009dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=440686f5-fec3-41db-bbb0-53e12589d6a4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e62cb303-c302-4453-bc2e-549678b72530) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:38.795 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e62cb303-c302-4453-bc2e-549678b72530 in datapath 4aa99606-7691-4fcb-846d-56459aaaa088 unbound from our chassis
Nov 22 07:47:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:38.796 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4aa99606-7691-4fcb-846d-56459aaaa088, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:47:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:38.797 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9eebd1b2-8f40-45fa-b525-ca208e8306d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:38 compute-0 nova_compute[186544]: 2025-11-22 07:47:38.798 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:38.797 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 namespace which is not needed anymore
Nov 22 07:47:38 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000024.scope: Deactivated successfully.
Nov 22 07:47:38 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000024.scope: Consumed 3.455s CPU time.
Nov 22 07:47:38 compute-0 systemd-machined[152872]: Machine qemu-19-instance-00000024 terminated.
Nov 22 07:47:38 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218863]: [NOTICE]   (218867) : haproxy version is 2.8.14-c23fe91
Nov 22 07:47:38 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218863]: [NOTICE]   (218867) : path to executable is /usr/sbin/haproxy
Nov 22 07:47:38 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218863]: [WARNING]  (218867) : Exiting Master process...
Nov 22 07:47:38 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218863]: [ALERT]    (218867) : Current worker (218869) exited with code 143 (Terminated)
Nov 22 07:47:38 compute-0 neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088[218863]: [WARNING]  (218867) : All workers exited. Exiting... (0)
Nov 22 07:47:38 compute-0 systemd[1]: libpod-f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb.scope: Deactivated successfully.
Nov 22 07:47:38 compute-0 podman[218922]: 2025-11-22 07:47:38.928757884 +0000 UTC m=+0.051125052 container died f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:47:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb-userdata-shm.mount: Deactivated successfully.
Nov 22 07:47:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-20f23ee50bcacbee689e8c3102276d3cb606252041007f501e6b5a2be83049d9-merged.mount: Deactivated successfully.
Nov 22 07:47:38 compute-0 podman[218922]: 2025-11-22 07:47:38.971423888 +0000 UTC m=+0.093791056 container cleanup f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 07:47:38 compute-0 kernel: tape62cb303-c3: entered promiscuous mode
Nov 22 07:47:39 compute-0 systemd[1]: libpod-conmon-f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb.scope: Deactivated successfully.
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00147|binding|INFO|Claiming lport e62cb303-c302-4453-bc2e-549678b72530 for this chassis.
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.065 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00148|binding|INFO|e62cb303-c302-4453-bc2e-549678b72530: Claiming fa:16:3e:2d:37:a2 10.100.0.12
Nov 22 07:47:39 compute-0 kernel: tape62cb303-c3 (unregistering): left promiscuous mode
Nov 22 07:47:39 compute-0 NetworkManager[55036]: <info>  [1763797659.0689] manager: (tape62cb303-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.077 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:37:a2 10.100.0.12'], port_security=['fa:16:3e:2d:37:a2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91985628-1682-4fc4-b267-d329bef908d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4aa99606-7691-4fcb-846d-56459aaaa088', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b94109a356454dbda245fe5e57d0cd82', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ebc0842-f2b0-4995-8bc2-4b71e8009dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=440686f5-fec3-41db-bbb0-53e12589d6a4, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e62cb303-c302-4453-bc2e-549678b72530) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00149|binding|INFO|Setting lport e62cb303-c302-4453-bc2e-549678b72530 ovn-installed in OVS
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00150|binding|INFO|Setting lport e62cb303-c302-4453-bc2e-549678b72530 up in Southbound
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00151|binding|INFO|Releasing lport e62cb303-c302-4453-bc2e-549678b72530 from this chassis (sb_readonly=1)
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.086 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00152|binding|INFO|Removing iface tape62cb303-c3 ovn-installed in OVS
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00153|if_status|INFO|Not setting lport e62cb303-c302-4453-bc2e-549678b72530 down as sb is readonly
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00154|binding|INFO|Releasing lport e62cb303-c302-4453-bc2e-549678b72530 from this chassis (sb_readonly=1)
Nov 22 07:47:39 compute-0 ovn_controller[94843]: 2025-11-22T07:47:39Z|00155|binding|INFO|Setting lport e62cb303-c302-4453-bc2e-549678b72530 down in Southbound
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.100 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.106 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:37:a2 10.100.0.12'], port_security=['fa:16:3e:2d:37:a2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91985628-1682-4fc4-b267-d329bef908d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4aa99606-7691-4fcb-846d-56459aaaa088', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b94109a356454dbda245fe5e57d0cd82', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ebc0842-f2b0-4995-8bc2-4b71e8009dae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=440686f5-fec3-41db-bbb0-53e12589d6a4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e62cb303-c302-4453-bc2e-549678b72530) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.125 186548 INFO nova.virt.libvirt.driver [-] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Instance destroyed successfully.
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.125 186548 DEBUG nova.objects.instance [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lazy-loading 'resources' on Instance uuid 91985628-1682-4fc4-b267-d329bef908d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:39 compute-0 podman[218955]: 2025-11-22 07:47:39.137305646 +0000 UTC m=+0.054641458 container remove f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.138 186548 DEBUG nova.virt.libvirt.vif [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:47:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-221618664',display_name='tempest-ImagesOneServerNegativeTestJSON-server-221618664',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-221618664',id=36,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:47:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b94109a356454dbda245fe5e57d0cd82',ramdisk_id='',reservation_id='r-g0laf6zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-328128522',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-328128522-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:47:35Z,user_data=None,user_id='b47fa480dd1c4c9f81da16b464195f2b',uuid=91985628-1682-4fc4-b267-d329bef908d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.139 186548 DEBUG nova.network.os_vif_util [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converting VIF {"id": "e62cb303-c302-4453-bc2e-549678b72530", "address": "fa:16:3e:2d:37:a2", "network": {"id": "4aa99606-7691-4fcb-846d-56459aaaa088", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1996006581-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b94109a356454dbda245fe5e57d0cd82", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape62cb303-c3", "ovs_interfaceid": "e62cb303-c302-4453-bc2e-549678b72530", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.139 186548 DEBUG nova.network.os_vif_util [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:37:a2,bridge_name='br-int',has_traffic_filtering=True,id=e62cb303-c302-4453-bc2e-549678b72530,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape62cb303-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.139 186548 DEBUG os_vif [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:37:a2,bridge_name='br-int',has_traffic_filtering=True,id=e62cb303-c302-4453-bc2e-549678b72530,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape62cb303-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.141 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.141 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape62cb303-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.142 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0245f367-d02d-495d-9b08-5aa42731c7f1]: (4, ('Sat Nov 22 07:47:38 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 (f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb)\nf994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb\nSat Nov 22 07:47:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 (f994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb)\nf994f6a61e575d618860d0eb99aa883e338ffa1c08976d44c09f722043166efb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.144 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.144 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bde236b9-f0ce-4f61-9ba6-b8b07c9a8eff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.145 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4aa99606-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.146 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 kernel: tap4aa99606-70: left promiscuous mode
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.147 186548 INFO os_vif [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:37:a2,bridge_name='br-int',has_traffic_filtering=True,id=e62cb303-c302-4453-bc2e-549678b72530,network=Network(4aa99606-7691-4fcb-846d-56459aaaa088),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape62cb303-c3')
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.148 186548 INFO nova.virt.libvirt.driver [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Deleting instance files /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3_del
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.148 186548 INFO nova.virt.libvirt.driver [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Deletion of /var/lib/nova/instances/91985628-1682-4fc4-b267-d329bef908d3_del complete
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.158 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.161 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[09c88432-a3d8-452d-8c6b-2d0d5d5ae551]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.173 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9654a72c-f1b6-4298-a5c3-72686869e17b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.175 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a1811a-5d56-47d6-ae0e-5845d94cc2dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.189 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d553d93c-57be-4444-b0e3-362ca9470c3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439555, 'reachable_time': 27461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218978, 'error': None, 'target': 'ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d4aa99606\x2d7691\x2d4fcb\x2d846d\x2d56459aaaa088.mount: Deactivated successfully.
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.191 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4aa99606-7691-4fcb-846d-56459aaaa088 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.192 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[0f11e40f-1b72-4941-a301-54a104d289d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.192 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e62cb303-c302-4453-bc2e-549678b72530 in datapath 4aa99606-7691-4fcb-846d-56459aaaa088 unbound from our chassis
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.194 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4aa99606-7691-4fcb-846d-56459aaaa088, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.194 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e3c514-bbb3-4b87-8dbc-a905b5ede4b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.194 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e62cb303-c302-4453-bc2e-549678b72530 in datapath 4aa99606-7691-4fcb-846d-56459aaaa088 unbound from our chassis
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.195 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4aa99606-7691-4fcb-846d-56459aaaa088, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:47:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:39.196 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0924a70c-da07-44b1-a1f9-73855e57453e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.256 186548 INFO nova.compute.manager [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Took 0.49 seconds to destroy the instance on the hypervisor.
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.256 186548 DEBUG oslo.service.loopingcall [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.256 186548 DEBUG nova.compute.manager [-] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.257 186548 DEBUG nova.network.neutron [-] [instance: 91985628-1682-4fc4-b267-d329bef908d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.307 186548 DEBUG nova.network.neutron [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating instance_info_cache with network_info: [{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.330 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Releasing lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.337 186548 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1r930f3w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c07170-ca6f-422e-8f1c-9dfd5cc943a4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.337 186548 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Creating instance directory: /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.338 186548 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Creating disk.info with the contents: {'/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk': 'qcow2', '/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.338 186548 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.339 186548 DEBUG nova.objects.instance [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'trusted_certs' on Instance uuid b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.383 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.439 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.440 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.441 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.451 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.505 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.506 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.538 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.539 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.539 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.599 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.600 186548 DEBUG nova.virt.disk.api [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Checking if we can resize image /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.600 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.654 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.655 186548 DEBUG nova.virt.disk.api [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Cannot resize image /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.655 186548 DEBUG nova.objects.instance [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lazy-loading 'migration_context' on Instance uuid b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.670 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.695 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config 485376" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.697 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config to /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:47:39 compute-0 nova_compute[186544]: 2025-11-22 07:47:39.697 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.044 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-unplugged-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.045 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.045 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.045 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.046 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] No waiting events found dispatching network-vif-unplugged-e62cb303-c302-4453-bc2e-549678b72530 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.046 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-unplugged-e62cb303-c302-4453-bc2e-549678b72530 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.046 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.046 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.046 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.046 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.046 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] No waiting events found dispatching network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.047 186548 WARNING nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received unexpected event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 for instance with vm_state active and task_state deleting.
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.047 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.047 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.047 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.047 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.047 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] No waiting events found dispatching network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.048 186548 WARNING nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received unexpected event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 for instance with vm_state active and task_state deleting.
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.048 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.048 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.048 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.048 186548 DEBUG oslo_concurrency.lockutils [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.048 186548 DEBUG nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] No waiting events found dispatching network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.049 186548 WARNING nova.compute.manager [req-81c40d3b-cc3c-4896-aa75-4bdf6e4d0eac req-18d96fb7-d122-49fd-be23-2b7c2d85212b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received unexpected event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 for instance with vm_state active and task_state deleting.
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.132 186548 DEBUG oslo_concurrency.processutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4/disk.config /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.132 186548 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.134 186548 DEBUG nova.virt.libvirt.vif [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:47:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1107734578',display_name='tempest-LiveMigrationTest-server-1107734578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1107734578',id=35,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:47:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-qyzvalf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:47:23Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=b9c07170-ca6f-422e-8f1c-9dfd5cc943a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.134 186548 DEBUG nova.network.os_vif_util [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converting VIF {"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.135 186548 DEBUG nova.network.os_vif_util [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.135 186548 DEBUG os_vif [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.136 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.136 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.137 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.140 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28cefb09-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.140 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap28cefb09-6f, col_values=(('external_ids', {'iface-id': '28cefb09-6f44-4c5f-b924-c2e3ca0082e1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:34:2c', 'vm-uuid': 'b9c07170-ca6f-422e-8f1c-9dfd5cc943a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:40 compute-0 NetworkManager[55036]: <info>  [1763797660.1430] manager: (tap28cefb09-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.145 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.149 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.150 186548 INFO os_vif [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f')
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.151 186548 DEBUG nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.151 186548 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1r930f3w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c07170-ca6f-422e-8f1c-9dfd5cc943a4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.194 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.472 186548 DEBUG nova.network.neutron [-] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.490 186548 INFO nova.compute.manager [-] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Took 1.23 seconds to deallocate network for instance.
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.564 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.565 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.573 186548 DEBUG nova.compute.manager [req-b568881b-9455-4ae8-8ec5-e3efac049176 req-b0a6d194-d201-4059-bf25-ffe61ae202b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-deleted-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.672 186548 DEBUG nova.compute.provider_tree [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.698 186548 DEBUG nova.scheduler.client.report [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.762 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.954 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:40 compute-0 nova_compute[186544]: 2025-11-22 07:47:40.978 186548 INFO nova.scheduler.client.report [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Deleted allocations for instance 91985628-1682-4fc4-b267-d329bef908d3
Nov 22 07:47:41 compute-0 nova_compute[186544]: 2025-11-22 07:47:41.085 186548 DEBUG oslo_concurrency.lockutils [None req-a2274d43-f439-458d-9899-62f4ea4deea7 b47fa480dd1c4c9f81da16b464195f2b b94109a356454dbda245fe5e57d0cd82 - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:41 compute-0 nova_compute[186544]: 2025-11-22 07:47:41.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.152 186548 DEBUG nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-unplugged-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.153 186548 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.153 186548 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.153 186548 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.154 186548 DEBUG nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] No waiting events found dispatching network-vif-unplugged-e62cb303-c302-4453-bc2e-549678b72530 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.154 186548 WARNING nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received unexpected event network-vif-unplugged-e62cb303-c302-4453-bc2e-549678b72530 for instance with vm_state deleted and task_state None.
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.154 186548 DEBUG nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.154 186548 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "91985628-1682-4fc4-b267-d329bef908d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.155 186548 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.155 186548 DEBUG oslo_concurrency.lockutils [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "91985628-1682-4fc4-b267-d329bef908d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.155 186548 DEBUG nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] No waiting events found dispatching network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.155 186548 WARNING nova.compute.manager [req-21bb74df-91d6-425f-ba1c-97cbde3b7ef1 req-79b2ef5c-7a7e-4352-9fc3-b2c675c69ecd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Received unexpected event network-vif-plugged-e62cb303-c302-4453-bc2e-549678b72530 for instance with vm_state deleted and task_state None.
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:42 compute-0 podman[219001]: 2025-11-22 07:47:42.421556648 +0000 UTC m=+0.067065322 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.871 186548 DEBUG nova.network.neutron [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 22 07:47:42 compute-0 nova_compute[186544]: 2025-11-22 07:47:42.880 186548 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1r930f3w',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c07170-ca6f-422e-8f1c-9dfd5cc943a4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 22 07:47:43 compute-0 kernel: tap28cefb09-6f: entered promiscuous mode
Nov 22 07:47:43 compute-0 NetworkManager[55036]: <info>  [1763797663.1296] manager: (tap28cefb09-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Nov 22 07:47:43 compute-0 ovn_controller[94843]: 2025-11-22T07:47:43Z|00156|binding|INFO|Claiming lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for this additional chassis.
Nov 22 07:47:43 compute-0 ovn_controller[94843]: 2025-11-22T07:47:43Z|00157|binding|INFO|28cefb09-6f44-4c5f-b924-c2e3ca0082e1: Claiming fa:16:3e:90:34:2c 10.100.0.9
Nov 22 07:47:43 compute-0 ovn_controller[94843]: 2025-11-22T07:47:43Z|00158|binding|INFO|Claiming lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d for this additional chassis.
Nov 22 07:47:43 compute-0 ovn_controller[94843]: 2025-11-22T07:47:43Z|00159|binding|INFO|eb321ea2-ecc9-494b-a270-c3aac4f36e7d: Claiming fa:16:3e:d2:b0:13 19.80.0.49
Nov 22 07:47:43 compute-0 nova_compute[186544]: 2025-11-22 07:47:43.131 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:43 compute-0 systemd-udevd[219040]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:47:43 compute-0 NetworkManager[55036]: <info>  [1763797663.1702] device (tap28cefb09-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:47:43 compute-0 NetworkManager[55036]: <info>  [1763797663.1712] device (tap28cefb09-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:47:43 compute-0 nova_compute[186544]: 2025-11-22 07:47:43.176 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:43 compute-0 ovn_controller[94843]: 2025-11-22T07:47:43Z|00160|binding|INFO|Setting lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 ovn-installed in OVS
Nov 22 07:47:43 compute-0 systemd-machined[152872]: New machine qemu-20-instance-00000023.
Nov 22 07:47:43 compute-0 nova_compute[186544]: 2025-11-22 07:47:43.183 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:43 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000023.
Nov 22 07:47:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:43.658 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:43.659 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:47:43 compute-0 nova_compute[186544]: 2025-11-22 07:47:43.663 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:44 compute-0 nova_compute[186544]: 2025-11-22 07:47:44.262 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797664.2618732, b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:44 compute-0 nova_compute[186544]: 2025-11-22 07:47:44.263 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] VM Started (Lifecycle Event)
Nov 22 07:47:44 compute-0 nova_compute[186544]: 2025-11-22 07:47:44.296 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:45 compute-0 nova_compute[186544]: 2025-11-22 07:47:45.036 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797665.0360148, b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:45 compute-0 nova_compute[186544]: 2025-11-22 07:47:45.036 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] VM Resumed (Lifecycle Event)
Nov 22 07:47:45 compute-0 nova_compute[186544]: 2025-11-22 07:47:45.058 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:45 compute-0 nova_compute[186544]: 2025-11-22 07:47:45.062 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:47:45 compute-0 nova_compute[186544]: 2025-11-22 07:47:45.082 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 22 07:47:45 compute-0 nova_compute[186544]: 2025-11-22 07:47:45.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:45 compute-0 nova_compute[186544]: 2025-11-22 07:47:45.195 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:46 compute-0 podman[219072]: 2025-11-22 07:47:46.426335478 +0000 UTC m=+0.070560667 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Nov 22 07:47:46 compute-0 ovn_controller[94843]: 2025-11-22T07:47:46Z|00161|binding|INFO|Claiming lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for this chassis.
Nov 22 07:47:46 compute-0 ovn_controller[94843]: 2025-11-22T07:47:46Z|00162|binding|INFO|28cefb09-6f44-4c5f-b924-c2e3ca0082e1: Claiming fa:16:3e:90:34:2c 10.100.0.9
Nov 22 07:47:46 compute-0 ovn_controller[94843]: 2025-11-22T07:47:46Z|00163|binding|INFO|Claiming lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d for this chassis.
Nov 22 07:47:46 compute-0 ovn_controller[94843]: 2025-11-22T07:47:46Z|00164|binding|INFO|eb321ea2-ecc9-494b-a270-c3aac4f36e7d: Claiming fa:16:3e:d2:b0:13 19.80.0.49
Nov 22 07:47:46 compute-0 ovn_controller[94843]: 2025-11-22T07:47:46Z|00165|binding|INFO|Setting lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 up in Southbound
Nov 22 07:47:46 compute-0 ovn_controller[94843]: 2025-11-22T07:47:46Z|00166|binding|INFO|Setting lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d up in Southbound
Nov 22 07:47:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:46.952 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:34:2c 10.100.0.9'], port_security=['fa:16:3e:90:34:2c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-109341048', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b9c07170-ca6f-422e-8f1c-9dfd5cc943a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-109341048', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=28cefb09-6f44-4c5f-b924-c2e3ca0082e1) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:46.953 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:b0:13 19.80.0.49'], port_security=['fa:16:3e:d2:b0:13 19.80.0.49'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['28cefb09-6f44-4c5f-b924-c2e3ca0082e1'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1622735635', 'neutron:cidrs': '19.80.0.49/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c410ae8d-536e-4819-b766-652bc78ac3e4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1622735635', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=83c2898f-e4b8-43d0-8099-6e9553385d03, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eb321ea2-ecc9-494b-a270-c3aac4f36e7d) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:46.954 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 bound to our chassis
Nov 22 07:47:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:46.956 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 07:47:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:46.971 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7be5f508-3938-48be-ba20-bccfa3ef2de2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.000 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ba331fa2-5e20-4ddf-b471-801eb0600242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.004 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2f1968-5138-47ae-9619-d758923626d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.039 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5305a1-e072-4185-b836-045df09331cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.058 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[410c1ca5-d78b-4b53-8963-cc46b40e7a60]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435090, 'reachable_time': 27263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219098, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.075 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e270e1ed-12fe-4bad-afe5-ec7ac1a0b439]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc3f966e1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435100, 'tstamp': 435100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219099, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc3f966e1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435103, 'tstamp': 435103}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219099, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.077 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.079 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.079 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3f966e1-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.079 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.080 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc3f966e1-80, col_values=(('external_ids', {'iface-id': '8206cb6d-dd78-493d-a276-fccb0eeecc7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.080 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.081 103805 INFO neutron.agent.ovn.metadata.agent [-] Port eb321ea2-ecc9-494b-a270-c3aac4f36e7d in datapath c410ae8d-536e-4819-b766-652bc78ac3e4 bound to our chassis
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.083 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c410ae8d-536e-4819-b766-652bc78ac3e4
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.096 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ea950909-92c1-45c5-ba6c-5f8d837740cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.097 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc410ae8d-51 in ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.099 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc410ae8d-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.099 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8819bf6e-c39a-47c2-9cb0-ba9cdcb3ed09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.101 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f15318f7-b613-46a5-ad04-1418bbe319c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.112 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[7c52a94f-f38a-4dbf-bd09-283239374cee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.136 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[819f8939-ea4e-4b6b-9777-5f6dd59cc052]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.140 186548 INFO nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Post operation of migration started
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.165 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[7820a565-ebb9-4e32-b972-ff76702bd0b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.172 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bf62e720-e15a-4daf-9b44-09663a5ad09c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 NetworkManager[55036]: <info>  [1763797667.1736] manager: (tapc410ae8d-50): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Nov 22 07:47:47 compute-0 systemd-udevd[219107]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.212 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c4247dee-9f18-44f1-b12a-346cdfb76994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.216 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[471400c1-2379-4509-a4d8-6426caef32ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 NetworkManager[55036]: <info>  [1763797667.2500] device (tapc410ae8d-50): carrier: link connected
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.256 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd67b9f-2e7f-4d9b-a5ea-07653163dc4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.274 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e362d3-21f0-4c20-bdc4-33ca72872d93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc410ae8d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:32:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440788, 'reachable_time': 43097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219126, 'error': None, 'target': 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.291 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[58045d8c-94b7-4207-b168-624292b90b56]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:3239'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440788, 'tstamp': 440788}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219127, 'error': None, 'target': 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.307 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[33fe3eb6-4dd0-47cd-b4a5-e02c7f21909a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc410ae8d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:32:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440788, 'reachable_time': 43097, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219128, 'error': None, 'target': 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.342 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fe8f8146-ee4b-44b2-ae66-b75cc7f02e1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.399 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee01594-9530-4a67-a2cd-1126a366a861]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.400 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc410ae8d-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.400 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.401 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc410ae8d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.402 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:47 compute-0 NetworkManager[55036]: <info>  [1763797667.4032] manager: (tapc410ae8d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 22 07:47:47 compute-0 kernel: tapc410ae8d-50: entered promiscuous mode
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.405 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.405 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc410ae8d-50, col_values=(('external_ids', {'iface-id': 'adbcca2d-be43-4042-953b-c108dbe75276'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:47 compute-0 ovn_controller[94843]: 2025-11-22T07:47:47Z|00167|binding|INFO|Releasing lport adbcca2d-be43-4042-953b-c108dbe75276 from this chassis (sb_readonly=0)
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.419 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.420 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c410ae8d-536e-4819-b766-652bc78ac3e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c410ae8d-536e-4819-b766-652bc78ac3e4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.421 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a04ced8a-b3cf-4f5a-9c74-9d83c35204f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.421 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c410ae8d-536e-4819-b766-652bc78ac3e4
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c410ae8d-536e-4819-b766-652bc78ac3e4.pid.haproxy
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c410ae8d-536e-4819-b766-652bc78ac3e4
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:47:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:47.422 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'env', 'PROCESS_TAG=haproxy-c410ae8d-536e-4819-b766-652bc78ac3e4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c410ae8d-536e-4819-b766-652bc78ac3e4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.694 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.694 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquired lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:47:47 compute-0 nova_compute[186544]: 2025-11-22 07:47:47.695 186548 DEBUG nova.network.neutron [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:47:47 compute-0 podman[219159]: 2025-11-22 07:47:47.797197826 +0000 UTC m=+0.048251291 container create d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:47:47 compute-0 systemd[1]: Started libpod-conmon-d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a.scope.
Nov 22 07:47:47 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:47:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78c228fd5ba499cd36b0d4e205aebe5cfa62ac04084935734104fdb81f3a4bf2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:47:47 compute-0 podman[219159]: 2025-11-22 07:47:47.770418091 +0000 UTC m=+0.021471586 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:47:47 compute-0 podman[219159]: 2025-11-22 07:47:47.875768249 +0000 UTC m=+0.126821734 container init d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:47:47 compute-0 podman[219159]: 2025-11-22 07:47:47.881040348 +0000 UTC m=+0.132093813 container start d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 07:47:47 compute-0 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[219174]: [NOTICE]   (219178) : New worker (219180) forked
Nov 22 07:47:47 compute-0 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[219174]: [NOTICE]   (219178) : Loading success.
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.151 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797654.150179, d44f6523-2367-449e-8244-593a14760876 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.152 186548 INFO nova.compute.manager [-] [instance: d44f6523-2367-449e-8244-593a14760876] VM Stopped (Lifecycle Event)
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.190 186548 DEBUG nova.compute.manager [None req-ec53eaa2-ef2a-420e-a3d8-0fd161269574 - - - - - -] [instance: d44f6523-2367-449e-8244-593a14760876] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.474 186548 DEBUG nova.network.neutron [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating instance_info_cache with network_info: [{"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:47:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:49.660 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.725 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Releasing lock "refresh_cache-b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.745 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.746 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.746 186548 DEBUG oslo_concurrency.lockutils [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:49 compute-0 nova_compute[186544]: 2025-11-22 07:47:49.749 186548 INFO nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 22 07:47:49 compute-0 virtqemud[186092]: Domain id=20 name='instance-00000023' uuid=b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 is tainted: custom-monitor
Nov 22 07:47:50 compute-0 nova_compute[186544]: 2025-11-22 07:47:50.144 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:50 compute-0 nova_compute[186544]: 2025-11-22 07:47:50.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:47:50 compute-0 nova_compute[186544]: 2025-11-22 07:47:50.198 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:50 compute-0 ovn_controller[94843]: 2025-11-22T07:47:50Z|00168|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 07:47:50 compute-0 ovn_controller[94843]: 2025-11-22T07:47:50Z|00169|binding|INFO|Releasing lport adbcca2d-be43-4042-953b-c108dbe75276 from this chassis (sb_readonly=0)
Nov 22 07:47:50 compute-0 nova_compute[186544]: 2025-11-22 07:47:50.268 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:50 compute-0 nova_compute[186544]: 2025-11-22 07:47:50.758 186548 INFO nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 22 07:47:51 compute-0 nova_compute[186544]: 2025-11-22 07:47:51.764 186548 INFO nova.virt.libvirt.driver [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 22 07:47:51 compute-0 nova_compute[186544]: 2025-11-22 07:47:51.769 186548 DEBUG nova.compute.manager [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:51 compute-0 nova_compute[186544]: 2025-11-22 07:47:51.788 186548 DEBUG nova.objects.instance [None req-5a3ab37b-200d-45ec-a7a9-f39b9f7473bd b724835a5cf046a2bb3be4ad406d1c7f f3d0eb660de3401baacfd2e58fb59b5c - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:47:54 compute-0 nova_compute[186544]: 2025-11-22 07:47:54.124 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797659.1232574, 91985628-1682-4fc4-b267-d329bef908d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:47:54 compute-0 nova_compute[186544]: 2025-11-22 07:47:54.124 186548 INFO nova.compute.manager [-] [instance: 91985628-1682-4fc4-b267-d329bef908d3] VM Stopped (Lifecycle Event)
Nov 22 07:47:54 compute-0 nova_compute[186544]: 2025-11-22 07:47:54.165 186548 DEBUG nova.compute.manager [None req-2a5e8b2c-6ff0-4acf-8667-52e396ee478e - - - - - -] [instance: 91985628-1682-4fc4-b267-d329bef908d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:47:55 compute-0 nova_compute[186544]: 2025-11-22 07:47:55.146 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:55 compute-0 nova_compute[186544]: 2025-11-22 07:47:55.199 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:57 compute-0 podman[219189]: 2025-11-22 07:47:57.412175755 +0000 UTC m=+0.064487889 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:47:57 compute-0 podman[219190]: 2025-11-22 07:47:57.442536097 +0000 UTC m=+0.089614014 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.163 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.164 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.164 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.164 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.164 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.173 186548 INFO nova.compute.manager [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Terminating instance
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.180 186548 DEBUG nova.compute.manager [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:47:58 compute-0 kernel: tap28cefb09-6f (unregistering): left promiscuous mode
Nov 22 07:47:58 compute-0 NetworkManager[55036]: <info>  [1763797678.2050] device (tap28cefb09-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.218 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 ovn_controller[94843]: 2025-11-22T07:47:58Z|00170|binding|INFO|Releasing lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 from this chassis (sb_readonly=0)
Nov 22 07:47:58 compute-0 ovn_controller[94843]: 2025-11-22T07:47:58Z|00171|binding|INFO|Setting lport 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 down in Southbound
Nov 22 07:47:58 compute-0 ovn_controller[94843]: 2025-11-22T07:47:58Z|00172|binding|INFO|Releasing lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d from this chassis (sb_readonly=0)
Nov 22 07:47:58 compute-0 ovn_controller[94843]: 2025-11-22T07:47:58Z|00173|binding|INFO|Setting lport eb321ea2-ecc9-494b-a270-c3aac4f36e7d down in Southbound
Nov 22 07:47:58 compute-0 ovn_controller[94843]: 2025-11-22T07:47:58Z|00174|binding|INFO|Removing iface tap28cefb09-6f ovn-installed in OVS
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.221 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 ovn_controller[94843]: 2025-11-22T07:47:58Z|00175|binding|INFO|Releasing lport 8206cb6d-dd78-493d-a276-fccb0eeecc7d from this chassis (sb_readonly=0)
Nov 22 07:47:58 compute-0 ovn_controller[94843]: 2025-11-22T07:47:58Z|00176|binding|INFO|Releasing lport adbcca2d-be43-4042-953b-c108dbe75276 from this chassis (sb_readonly=0)
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.229 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:34:2c 10.100.0.9'], port_security=['fa:16:3e:90:34:2c 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-109341048', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b9c07170-ca6f-422e-8f1c-9dfd5cc943a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-109341048', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=28cefb09-6f44-4c5f-b924-c2e3ca0082e1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.231 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:b0:13 19.80.0.49'], port_security=['fa:16:3e:d2:b0:13 19.80.0.49'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['28cefb09-6f44-4c5f-b924-c2e3ca0082e1'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1622735635', 'neutron:cidrs': '19.80.0.49/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c410ae8d-536e-4819-b766-652bc78ac3e4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1622735635', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=83c2898f-e4b8-43d0-8099-6e9553385d03, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eb321ea2-ecc9-494b-a270-c3aac4f36e7d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.232 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 28cefb09-6f44-4c5f-b924-c2e3ca0082e1 in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 unbound from our chassis
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.233 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.241 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.250 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3d60122a-7c56-46fc-98a6-836f21e86d0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.262 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.276 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[dbd3fde2-7fac-41ef-814d-03759fe812eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.279 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0faf9061-e179-4166-abc9-72ca21947ebd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000023.scope: Deactivated successfully.
Nov 22 07:47:58 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000023.scope: Consumed 2.141s CPU time.
Nov 22 07:47:58 compute-0 systemd-machined[152872]: Machine qemu-20-instance-00000023 terminated.
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.304 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e54a2b-6c5e-49b6-89f5-1a4a480bcdbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.319 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[93b3fa17-5d11-49f8-8e15-3976053611ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc3f966e1-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:74:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 41, 'tx_packets': 7, 'rx_bytes': 2002, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 41, 'tx_packets': 7, 'rx_bytes': 2002, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435090, 'reachable_time': 27263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219248, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.334 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8157f7b7-d30e-4cec-a03a-f83efb6ba413]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc3f966e1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435100, 'tstamp': 435100}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219249, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc3f966e1-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435103, 'tstamp': 435103}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219249, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.336 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.337 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.342 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.342 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3f966e1-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.343 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.343 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc3f966e1-80, col_values=(('external_ids', {'iface-id': '8206cb6d-dd78-493d-a276-fccb0eeecc7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.344 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.345 103805 INFO neutron.agent.ovn.metadata.agent [-] Port eb321ea2-ecc9-494b-a270-c3aac4f36e7d in datapath c410ae8d-536e-4819-b766-652bc78ac3e4 unbound from our chassis
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.347 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c410ae8d-536e-4819-b766-652bc78ac3e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.347 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c1fec311-2b42-4aa2-935f-e331b7a97b31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.348 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 namespace which is not needed anymore
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.402 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.436 186548 INFO nova.virt.libvirt.driver [-] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Instance destroyed successfully.
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.437 186548 DEBUG nova.objects.instance [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lazy-loading 'resources' on Instance uuid b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.452 186548 DEBUG nova.virt.libvirt.vif [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:47:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1107734578',display_name='tempest-LiveMigrationTest-server-1107734578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1107734578',id=35,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:47:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-qyzvalf0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:47:51Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=b9c07170-ca6f-422e-8f1c-9dfd5cc943a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.453 186548 DEBUG nova.network.os_vif_util [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converting VIF {"id": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "address": "fa:16:3e:90:34:2c", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28cefb09-6f", "ovs_interfaceid": "28cefb09-6f44-4c5f-b924-c2e3ca0082e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.454 186548 DEBUG nova.network.os_vif_util [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.454 186548 DEBUG os_vif [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.456 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.457 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28cefb09-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.461 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.463 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.466 186548 INFO os_vif [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:34:2c,bridge_name='br-int',has_traffic_filtering=True,id=28cefb09-6f44-4c5f-b924-c2e3ca0082e1,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap28cefb09-6f')
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.467 186548 INFO nova.virt.libvirt.driver [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Deleting instance files /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4_del
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.468 186548 INFO nova.virt.libvirt.driver [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Deletion of /var/lib/nova/instances/b9c07170-ca6f-422e-8f1c-9dfd5cc943a4_del complete
Nov 22 07:47:58 compute-0 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[219174]: [NOTICE]   (219178) : haproxy version is 2.8.14-c23fe91
Nov 22 07:47:58 compute-0 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[219174]: [NOTICE]   (219178) : path to executable is /usr/sbin/haproxy
Nov 22 07:47:58 compute-0 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[219174]: [WARNING]  (219178) : Exiting Master process...
Nov 22 07:47:58 compute-0 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[219174]: [ALERT]    (219178) : Current worker (219180) exited with code 143 (Terminated)
Nov 22 07:47:58 compute-0 neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4[219174]: [WARNING]  (219178) : All workers exited. Exiting... (0)
Nov 22 07:47:58 compute-0 systemd[1]: libpod-d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a.scope: Deactivated successfully.
Nov 22 07:47:58 compute-0 podman[219279]: 2025-11-22 07:47:58.503818132 +0000 UTC m=+0.069325087 container died d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:47:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a-userdata-shm.mount: Deactivated successfully.
Nov 22 07:47:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-78c228fd5ba499cd36b0d4e205aebe5cfa62ac04084935734104fdb81f3a4bf2-merged.mount: Deactivated successfully.
Nov 22 07:47:58 compute-0 podman[219279]: 2025-11-22 07:47:58.561071803 +0000 UTC m=+0.126578758 container cleanup d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:47:58 compute-0 systemd[1]: libpod-conmon-d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a.scope: Deactivated successfully.
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.599 186548 INFO nova.compute.manager [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.599 186548 DEBUG oslo.service.loopingcall [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.600 186548 DEBUG nova.compute.manager [-] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.600 186548 DEBUG nova.network.neutron [-] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.816 186548 DEBUG nova.compute.manager [req-89b4451c-47bb-42d4-b9ce-01919dadb50a req-146d6c6f-7999-4d0d-a0a5-fdee2218f51f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.816 186548 DEBUG oslo_concurrency.lockutils [req-89b4451c-47bb-42d4-b9ce-01919dadb50a req-146d6c6f-7999-4d0d-a0a5-fdee2218f51f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.816 186548 DEBUG oslo_concurrency.lockutils [req-89b4451c-47bb-42d4-b9ce-01919dadb50a req-146d6c6f-7999-4d0d-a0a5-fdee2218f51f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.817 186548 DEBUG oslo_concurrency.lockutils [req-89b4451c-47bb-42d4-b9ce-01919dadb50a req-146d6c6f-7999-4d0d-a0a5-fdee2218f51f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.817 186548 DEBUG nova.compute.manager [req-89b4451c-47bb-42d4-b9ce-01919dadb50a req-146d6c6f-7999-4d0d-a0a5-fdee2218f51f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.817 186548 DEBUG nova.compute.manager [req-89b4451c-47bb-42d4-b9ce-01919dadb50a req-146d6c6f-7999-4d0d-a0a5-fdee2218f51f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-unplugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:47:58 compute-0 podman[219311]: 2025-11-22 07:47:58.8650781 +0000 UTC m=+0.281965749 container remove d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.870 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8262c6bd-9f44-4b1e-aa07-0a6599b62116]: (4, ('Sat Nov 22 07:47:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 (d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a)\nd46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a\nSat Nov 22 07:47:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 (d46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a)\nd46c2d7f48c015880d2964738a7c2238f1060e130f3018c4dd0380f42085b52a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.872 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cdeb4d7b-9689-4464-a179-03533fd9cb97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.873 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc410ae8d-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.875 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 kernel: tapc410ae8d-50: left promiscuous mode
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.877 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.880 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1a3db095-d495-485c-93da-e0eaa9854b4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 nova_compute[186544]: 2025-11-22 07:47:58.890 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.898 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1562d499-41ae-4442-8f17-e9d6e4c78dae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.899 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[85aad088-b1ec-4df6-b826-a0a2eba16b03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.914 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bd7d5244-3ae2-4da9-a777-cf6e809da547]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440779, 'reachable_time': 25670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219326, 'error': None, 'target': 'ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:47:58 compute-0 systemd[1]: run-netns-ovnmeta\x2dc410ae8d\x2d536e\x2d4819\x2db766\x2d652bc78ac3e4.mount: Deactivated successfully.
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.918 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c410ae8d-536e-4819-b766-652bc78ac3e4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:47:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:47:58.918 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[cdab19c1-a3c7-40ec-b8b9-c8e5a90fd966]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:00 compute-0 nova_compute[186544]: 2025-11-22 07:48:00.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:01 compute-0 nova_compute[186544]: 2025-11-22 07:48:01.187 186548 DEBUG nova.compute.manager [req-f7a2ff5f-dd93-44d5-961c-7495bc6e2fc2 req-b110d0ea-6a4a-4f88-a4d5-970abaa2d275 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:48:01 compute-0 nova_compute[186544]: 2025-11-22 07:48:01.188 186548 DEBUG oslo_concurrency.lockutils [req-f7a2ff5f-dd93-44d5-961c-7495bc6e2fc2 req-b110d0ea-6a4a-4f88-a4d5-970abaa2d275 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:01 compute-0 nova_compute[186544]: 2025-11-22 07:48:01.188 186548 DEBUG oslo_concurrency.lockutils [req-f7a2ff5f-dd93-44d5-961c-7495bc6e2fc2 req-b110d0ea-6a4a-4f88-a4d5-970abaa2d275 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:01 compute-0 nova_compute[186544]: 2025-11-22 07:48:01.188 186548 DEBUG oslo_concurrency.lockutils [req-f7a2ff5f-dd93-44d5-961c-7495bc6e2fc2 req-b110d0ea-6a4a-4f88-a4d5-970abaa2d275 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:01 compute-0 nova_compute[186544]: 2025-11-22 07:48:01.188 186548 DEBUG nova.compute.manager [req-f7a2ff5f-dd93-44d5-961c-7495bc6e2fc2 req-b110d0ea-6a4a-4f88-a4d5-970abaa2d275 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] No waiting events found dispatching network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:48:01 compute-0 nova_compute[186544]: 2025-11-22 07:48:01.188 186548 WARNING nova.compute.manager [req-f7a2ff5f-dd93-44d5-961c-7495bc6e2fc2 req-b110d0ea-6a4a-4f88-a4d5-970abaa2d275 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Received unexpected event network-vif-plugged-28cefb09-6f44-4c5f-b924-c2e3ca0082e1 for instance with vm_state active and task_state deleting.
Nov 22 07:48:01 compute-0 podman[219327]: 2025-11-22 07:48:01.394065274 +0000 UTC m=+0.047132824 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:48:03 compute-0 nova_compute[186544]: 2025-11-22 07:48:03.461 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:03 compute-0 nova_compute[186544]: 2025-11-22 07:48:03.636 186548 DEBUG nova.network.neutron [-] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:48:03 compute-0 nova_compute[186544]: 2025-11-22 07:48:03.734 186548 INFO nova.compute.manager [-] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Took 5.13 seconds to deallocate network for instance.
Nov 22 07:48:03 compute-0 nova_compute[186544]: 2025-11-22 07:48:03.886 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:03 compute-0 nova_compute[186544]: 2025-11-22 07:48:03.886 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:03 compute-0 nova_compute[186544]: 2025-11-22 07:48:03.891 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:03 compute-0 nova_compute[186544]: 2025-11-22 07:48:03.970 186548 INFO nova.scheduler.client.report [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Deleted allocations for instance b9c07170-ca6f-422e-8f1c-9dfd5cc943a4
Nov 22 07:48:04 compute-0 nova_compute[186544]: 2025-11-22 07:48:04.104 186548 DEBUG oslo_concurrency.lockutils [None req-80d88e34-a0f2-4f3d-8077-4338c48e0242 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "b9c07170-ca6f-422e-8f1c-9dfd5cc943a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:04 compute-0 podman[219351]: 2025-11-22 07:48:04.406213207 +0000 UTC m=+0.053154372 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:48:05 compute-0 nova_compute[186544]: 2025-11-22 07:48:05.203 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:08 compute-0 podman[219371]: 2025-11-22 07:48:08.403181614 +0000 UTC m=+0.053566092 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 22 07:48:08 compute-0 nova_compute[186544]: 2025-11-22 07:48:08.465 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:10 compute-0 nova_compute[186544]: 2025-11-22 07:48:10.206 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.715 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.715 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.716 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.716 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.716 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.724 186548 INFO nova.compute.manager [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Terminating instance
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.731 186548 DEBUG nova.compute.manager [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:48:11 compute-0 kernel: tap973cdfc2-4a (unregistering): left promiscuous mode
Nov 22 07:48:11 compute-0 NetworkManager[55036]: <info>  [1763797691.7555] device (tap973cdfc2-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:48:11 compute-0 ovn_controller[94843]: 2025-11-22T07:48:11Z|00177|binding|INFO|Releasing lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f from this chassis (sb_readonly=0)
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.761 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:11 compute-0 ovn_controller[94843]: 2025-11-22T07:48:11Z|00178|binding|INFO|Setting lport 973cdfc2-4ad8-4f41-b383-4b64b1b5433f down in Southbound
Nov 22 07:48:11 compute-0 ovn_controller[94843]: 2025-11-22T07:48:11Z|00179|binding|INFO|Removing iface tap973cdfc2-4a ovn-installed in OVS
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.764 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:11.773 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:9b:98 10.100.0.14'], port_security=['fa:16:3e:02:9b:98 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '09e8ccfc-6ae9-4a06-ae76-7e059f50ac44', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd48bda61691e4f778b6d30c0dc773a30', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'd99796e5-fe06-409c-adb5-ca5cc291d6f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45f3847d-7d6e-44b5-a83a-dde97f76bd11, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=973cdfc2-4ad8-4f41-b383-4b64b1b5433f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:48:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:11.774 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 973cdfc2-4ad8-4f41-b383-4b64b1b5433f in datapath c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 unbound from our chassis
Nov 22 07:48:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:11.775 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:48:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:11.776 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[86d4d0f8-bb97-4ac5-81b6-e575442352dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:11.776 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 namespace which is not needed anymore
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.783 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:11 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Nov 22 07:48:11 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001c.scope: Consumed 4.945s CPU time.
Nov 22 07:48:11 compute-0 systemd-machined[152872]: Machine qemu-16-instance-0000001c terminated.
Nov 22 07:48:11 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[218164]: [NOTICE]   (218168) : haproxy version is 2.8.14-c23fe91
Nov 22 07:48:11 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[218164]: [NOTICE]   (218168) : path to executable is /usr/sbin/haproxy
Nov 22 07:48:11 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[218164]: [WARNING]  (218168) : Exiting Master process...
Nov 22 07:48:11 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[218164]: [WARNING]  (218168) : Exiting Master process...
Nov 22 07:48:11 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[218164]: [ALERT]    (218168) : Current worker (218170) exited with code 143 (Terminated)
Nov 22 07:48:11 compute-0 neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70[218164]: [WARNING]  (218168) : All workers exited. Exiting... (0)
Nov 22 07:48:11 compute-0 systemd[1]: libpod-657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529.scope: Deactivated successfully.
Nov 22 07:48:11 compute-0 podman[219418]: 2025-11-22 07:48:11.906032753 +0000 UTC m=+0.043382252 container died 657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:48:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529-userdata-shm.mount: Deactivated successfully.
Nov 22 07:48:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-39d45c0eb52508004f068b426dba2cdef6d1aeb39842a07373632a882fae392d-merged.mount: Deactivated successfully.
Nov 22 07:48:11 compute-0 podman[219418]: 2025-11-22 07:48:11.941408619 +0000 UTC m=+0.078758118 container cleanup 657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.954 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.960 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:11 compute-0 systemd[1]: libpod-conmon-657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529.scope: Deactivated successfully.
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.996 186548 INFO nova.virt.libvirt.driver [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Instance destroyed successfully.
Nov 22 07:48:11 compute-0 nova_compute[186544]: 2025-11-22 07:48:11.997 186548 DEBUG nova.objects.instance [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lazy-loading 'resources' on Instance uuid 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.017 186548 DEBUG nova.virt.libvirt.vif [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:46:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-810629940',display_name='tempest-LiveMigrationTest-server-810629940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-810629940',id=28,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:46:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d48bda61691e4f778b6d30c0dc773a30',ramdisk_id='',reservation_id='r-4wr2aiye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2093743563',owner_user_name='tempest-LiveMigrationTest-2093743563-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:46:55Z,user_data=None,user_id='8a738b980aad493b9a21da7d5a5ccf8a',uuid=09e8ccfc-6ae9-4a06-ae76-7e059f50ac44,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.017 186548 DEBUG nova.network.os_vif_util [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converting VIF {"id": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "address": "fa:16:3e:02:9b:98", "network": {"id": "c3f966e1-8cff-4ca0-9b4f-a318c31b0a70", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1427311200-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d48bda61691e4f778b6d30c0dc773a30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap973cdfc2-4a", "ovs_interfaceid": "973cdfc2-4ad8-4f41-b383-4b64b1b5433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.018 186548 DEBUG nova.network.os_vif_util [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.018 186548 DEBUG os_vif [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:48:12 compute-0 podman[219448]: 2025-11-22 07:48:12.018861024 +0000 UTC m=+0.053500700 container remove 657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.019 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.020 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap973cdfc2-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.021 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.023 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.026 186548 INFO os_vif [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:9b:98,bridge_name='br-int',has_traffic_filtering=True,id=973cdfc2-4ad8-4f41-b383-4b64b1b5433f,network=Network(c3f966e1-8cff-4ca0-9b4f-a318c31b0a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap973cdfc2-4a')
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.026 186548 INFO nova.virt.libvirt.driver [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Deleting instance files /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44_del
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.027 186548 INFO nova.virt.libvirt.driver [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Deletion of /var/lib/nova/instances/09e8ccfc-6ae9-4a06-ae76-7e059f50ac44_del complete
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.027 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e8fa84-21b3-4a19-9729-0bba3b1f21d7]: (4, ('Sat Nov 22 07:48:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 (657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529)\n657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529\nSat Nov 22 07:48:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 (657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529)\n657eea662f247d9903d75b941ff648515a2081de35ddde9896ec86930b0fe529\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.029 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[32049d6e-5a1f-44e9-9ec5-a7428e368f23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.030 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3f966e1-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.032 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:12 compute-0 kernel: tapc3f966e1-80: left promiscuous mode
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.044 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.047 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dc1c9561-5de2-4113-8126-0ecf85c7a53a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.060 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7c86694a-cfea-444a-a368-cf06d59f9415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.062 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[570c2660-2d4d-4c7b-af20-6d37b491ae72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.076 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2b0de660-2436-43d7-8199-7a3196e98a89]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435084, 'reachable_time': 15478, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219478, 'error': None, 'target': 'ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.079 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c3f966e1-8cff-4ca0-9b4f-a318c31b0a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:48:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:12.079 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[66899b26-ca98-4f5d-b3e6-168a3d92cc6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:12 compute-0 systemd[1]: run-netns-ovnmeta\x2dc3f966e1\x2d8cff\x2d4ca0\x2d9b4f\x2da318c31b0a70.mount: Deactivated successfully.
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.141 186548 INFO nova.compute.manager [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.142 186548 DEBUG oslo.service.loopingcall [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.142 186548 DEBUG nova.compute.manager [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.143 186548 DEBUG nova.network.neutron [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.205 186548 DEBUG nova.compute.manager [req-00a456fd-352e-4f47-a0da-63bdc620aea3 req-119eecc4-e24c-4e57-b168-cd1809e8e89c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.205 186548 DEBUG oslo_concurrency.lockutils [req-00a456fd-352e-4f47-a0da-63bdc620aea3 req-119eecc4-e24c-4e57-b168-cd1809e8e89c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.206 186548 DEBUG oslo_concurrency.lockutils [req-00a456fd-352e-4f47-a0da-63bdc620aea3 req-119eecc4-e24c-4e57-b168-cd1809e8e89c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.206 186548 DEBUG oslo_concurrency.lockutils [req-00a456fd-352e-4f47-a0da-63bdc620aea3 req-119eecc4-e24c-4e57-b168-cd1809e8e89c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.206 186548 DEBUG nova.compute.manager [req-00a456fd-352e-4f47-a0da-63bdc620aea3 req-119eecc4-e24c-4e57-b168-cd1809e8e89c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:48:12 compute-0 nova_compute[186544]: 2025-11-22 07:48:12.207 186548 DEBUG nova.compute.manager [req-00a456fd-352e-4f47-a0da-63bdc620aea3 req-119eecc4-e24c-4e57-b168-cd1809e8e89c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-unplugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.350 186548 DEBUG nova.network.neutron [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.405 186548 INFO nova.compute.manager [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Took 1.26 seconds to deallocate network for instance.
Nov 22 07:48:13 compute-0 podman[219479]: 2025-11-22 07:48:13.407178279 +0000 UTC m=+0.058738427 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.435 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797678.434965, b9c07170-ca6f-422e-8f1c-9dfd5cc943a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.436 186548 INFO nova.compute.manager [-] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] VM Stopped (Lifecycle Event)
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.462 186548 DEBUG nova.compute.manager [None req-4fa5f084-d8a3-40b0-8460-11b3d84123bb - - - - - -] [instance: b9c07170-ca6f-422e-8f1c-9dfd5cc943a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.493 186548 DEBUG nova.compute.manager [req-d85cb868-2b95-413c-8c6c-c3cfde63fd8a req-8bbdfbc8-224f-493c-b858-2d26f0afd44f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-deleted-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.498 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.498 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.566 186548 DEBUG nova.compute.provider_tree [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.583 186548 DEBUG nova.scheduler.client.report [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.604 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.654 186548 INFO nova.scheduler.client.report [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Deleted allocations for instance 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44
Nov 22 07:48:13 compute-0 nova_compute[186544]: 2025-11-22 07:48:13.735 186548 DEBUG oslo_concurrency.lockutils [None req-35af312e-1cc0-43cc-b55a-e3676e18be27 8a738b980aad493b9a21da7d5a5ccf8a d48bda61691e4f778b6d30c0dc773a30 - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:14 compute-0 nova_compute[186544]: 2025-11-22 07:48:14.333 186548 DEBUG nova.compute.manager [req-c304eb9a-16cf-4597-804c-2283f869e93d req-c1e7eba5-1a10-4867-a52b-135a631e7070 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:48:14 compute-0 nova_compute[186544]: 2025-11-22 07:48:14.334 186548 DEBUG oslo_concurrency.lockutils [req-c304eb9a-16cf-4597-804c-2283f869e93d req-c1e7eba5-1a10-4867-a52b-135a631e7070 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:14 compute-0 nova_compute[186544]: 2025-11-22 07:48:14.334 186548 DEBUG oslo_concurrency.lockutils [req-c304eb9a-16cf-4597-804c-2283f869e93d req-c1e7eba5-1a10-4867-a52b-135a631e7070 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:14 compute-0 nova_compute[186544]: 2025-11-22 07:48:14.334 186548 DEBUG oslo_concurrency.lockutils [req-c304eb9a-16cf-4597-804c-2283f869e93d req-c1e7eba5-1a10-4867-a52b-135a631e7070 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "09e8ccfc-6ae9-4a06-ae76-7e059f50ac44-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:14 compute-0 nova_compute[186544]: 2025-11-22 07:48:14.334 186548 DEBUG nova.compute.manager [req-c304eb9a-16cf-4597-804c-2283f869e93d req-c1e7eba5-1a10-4867-a52b-135a631e7070 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] No waiting events found dispatching network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:48:14 compute-0 nova_compute[186544]: 2025-11-22 07:48:14.335 186548 WARNING nova.compute.manager [req-c304eb9a-16cf-4597-804c-2283f869e93d req-c1e7eba5-1a10-4867-a52b-135a631e7070 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Received unexpected event network-vif-plugged-973cdfc2-4ad8-4f41-b383-4b64b1b5433f for instance with vm_state deleted and task_state None.
Nov 22 07:48:15 compute-0 nova_compute[186544]: 2025-11-22 07:48:15.208 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:17 compute-0 nova_compute[186544]: 2025-11-22 07:48:17.021 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:17 compute-0 podman[219503]: 2025-11-22 07:48:17.408260329 +0000 UTC m=+0.057297373 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter)
Nov 22 07:48:17 compute-0 nova_compute[186544]: 2025-11-22 07:48:17.761 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:20 compute-0 nova_compute[186544]: 2025-11-22 07:48:20.210 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:22 compute-0 nova_compute[186544]: 2025-11-22 07:48:22.023 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:25 compute-0 nova_compute[186544]: 2025-11-22 07:48:25.211 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:26 compute-0 nova_compute[186544]: 2025-11-22 07:48:26.996 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797691.9954214, 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:48:26 compute-0 nova_compute[186544]: 2025-11-22 07:48:26.996 186548 INFO nova.compute.manager [-] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] VM Stopped (Lifecycle Event)
Nov 22 07:48:27 compute-0 nova_compute[186544]: 2025-11-22 07:48:27.011 186548 DEBUG nova.compute.manager [None req-e0841956-e7b1-4bb9-b008-f93320ec2ecd - - - - - -] [instance: 09e8ccfc-6ae9-4a06-ae76-7e059f50ac44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:27 compute-0 nova_compute[186544]: 2025-11-22 07:48:27.047 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:28 compute-0 podman[219525]: 2025-11-22 07:48:28.400897191 +0000 UTC m=+0.053548721 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 07:48:28 compute-0 podman[219526]: 2025-11-22 07:48:28.449247394 +0000 UTC m=+0.099954917 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 07:48:30 compute-0 nova_compute[186544]: 2025-11-22 07:48:30.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:32 compute-0 nova_compute[186544]: 2025-11-22 07:48:32.077 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:32 compute-0 podman[219571]: 2025-11-22 07:48:32.42118729 +0000 UTC m=+0.060808789 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:48:32 compute-0 nova_compute[186544]: 2025-11-22 07:48:32.722 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "5225b730-46c6-4fb4-8ada-9f9ea046e722" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:32 compute-0 nova_compute[186544]: 2025-11-22 07:48:32.723 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "5225b730-46c6-4fb4-8ada-9f9ea046e722" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:32 compute-0 nova_compute[186544]: 2025-11-22 07:48:32.792 186548 DEBUG nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.016 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.017 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.024 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.025 186548 INFO nova.compute.claims [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.377 186548 DEBUG nova.compute.provider_tree [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.398 186548 DEBUG nova.scheduler.client.report [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.433 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.433 186548 DEBUG nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.438 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.438 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.438 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.623 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.624 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5748MB free_disk=73.42370986938477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.624 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.624 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.875 186548 DEBUG nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.876 186548 DEBUG nova.network.neutron [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.906 186548 INFO nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.908 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 5225b730-46c6-4fb4-8ada-9f9ea046e722 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.909 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.909 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.938 186548 DEBUG nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.976 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:48:33 compute-0 nova_compute[186544]: 2025-11-22 07:48:33.995 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.040 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.040 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.120 186548 DEBUG nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.122 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.122 186548 INFO nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Creating image(s)
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.123 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "/var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.123 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "/var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.124 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "/var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.140 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.204 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.205 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.205 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.217 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.270 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.271 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.371 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk 1073741824" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.372 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.373 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.434 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.435 186548 DEBUG nova.virt.disk.api [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Checking if we can resize image /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.435 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.473 186548 DEBUG nova.network.neutron [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.474 186548 DEBUG nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.501 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.502 186548 DEBUG nova.virt.disk.api [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Cannot resize image /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.503 186548 DEBUG nova.objects.instance [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lazy-loading 'migration_context' on Instance uuid 5225b730-46c6-4fb4-8ada-9f9ea046e722 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.515 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.516 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Ensure instance console log exists: /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.517 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.517 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.517 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.519 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.525 186548 WARNING nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.530 186548 DEBUG nova.virt.libvirt.host [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.531 186548 DEBUG nova.virt.libvirt.host [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.534 186548 DEBUG nova.virt.libvirt.host [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.535 186548 DEBUG nova.virt.libvirt.host [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.537 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.538 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.538 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.539 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.539 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.539 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.539 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.540 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.540 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.540 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.540 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.541 186548 DEBUG nova.virt.hardware [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.544 186548 DEBUG nova.objects.instance [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lazy-loading 'pci_devices' on Instance uuid 5225b730-46c6-4fb4-8ada-9f9ea046e722 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.557 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <uuid>5225b730-46c6-4fb4-8ada-9f9ea046e722</uuid>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <name>instance-00000029</name>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerDiagnosticsTest-server-1416418599</nova:name>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:48:34</nova:creationTime>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:48:34 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:48:34 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:48:34 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:48:34 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:48:34 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:48:34 compute-0 nova_compute[186544]:         <nova:user uuid="57d60f953b0b471ba18aa65d08ed9bb2">tempest-ServerDiagnosticsTest-1162958798-project-member</nova:user>
Nov 22 07:48:34 compute-0 nova_compute[186544]:         <nova:project uuid="ccefaac5c3ca44b8847d197a5412050b">tempest-ServerDiagnosticsTest-1162958798</nova:project>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <system>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <entry name="serial">5225b730-46c6-4fb4-8ada-9f9ea046e722</entry>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <entry name="uuid">5225b730-46c6-4fb4-8ada-9f9ea046e722</entry>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     </system>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <os>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   </os>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <features>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   </features>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.config"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/console.log" append="off"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <video>
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     </video>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:48:34 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:48:34 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:48:34 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:48:34 compute-0 nova_compute[186544]: </domain>
Nov 22 07:48:34 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.838 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.839 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.840 186548 INFO nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Using config drive
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.992 186548 INFO nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Creating config drive at /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.config
Nov 22 07:48:34 compute-0 nova_compute[186544]: 2025-11-22 07:48:34.997 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgf8i4ydc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.123 186548 DEBUG oslo_concurrency.processutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgf8i4ydc" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:35 compute-0 systemd-machined[152872]: New machine qemu-21-instance-00000029.
Nov 22 07:48:35 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000029.
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.250 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:35 compute-0 podman[219622]: 2025-11-22 07:48:35.273079132 +0000 UTC m=+0.083509233 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.813 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797715.8126266, 5225b730-46c6-4fb4-8ada-9f9ea046e722 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.815 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] VM Resumed (Lifecycle Event)
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.818 186548 DEBUG nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.818 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.823 186548 INFO nova.virt.libvirt.driver [-] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Instance spawned successfully.
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.823 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.838 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.844 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.848 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.848 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.849 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.849 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.849 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.850 186548 DEBUG nova.virt.libvirt.driver [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.881 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.882 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797715.8141978, 5225b730-46c6-4fb4-8ada-9f9ea046e722 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.882 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] VM Started (Lifecycle Event)
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.902 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.905 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.924 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.984 186548 INFO nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Took 1.86 seconds to spawn the instance on the hypervisor.
Nov 22 07:48:35 compute-0 nova_compute[186544]: 2025-11-22 07:48:35.985 186548 DEBUG nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:36 compute-0 nova_compute[186544]: 2025-11-22 07:48:36.087 186548 INFO nova.compute.manager [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Took 3.24 seconds to build instance.
Nov 22 07:48:36 compute-0 nova_compute[186544]: 2025-11-22 07:48:36.103 186548 DEBUG oslo_concurrency.lockutils [None req-5427ce23-894f-45a6-bb13-911619d75d09 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "5225b730-46c6-4fb4-8ada-9f9ea046e722" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.592 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000029', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ccefaac5c3ca44b8847d197a5412050b', 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'hostId': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.594 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.596 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.610 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/cpu volume: 730000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '991abc0b-aae5-4c6a-91f3-d7b16e3848d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 730000000, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'timestamp': '2025-11-22T07:48:36.596677', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a70c1268-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.30202342, 'message_signature': 'fc0bd9b380fcd292b544bc697fdb11d186b98331a83ef13fc536a090b226fa3e'}]}, 'timestamp': '2025-11-22 07:48:36.610955', '_unique_id': '878390b0730e4e2f98f11b80691f50df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.612 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.613 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.613 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.640 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.read.bytes volume: 2860544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.641 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd68ad4df-65f9-4513-92b8-39b9e51fd06b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2860544, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.613765', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a710ba7a-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': '162fa0bf62a6cb00742357c74774b211ecd913f56eebcdfbe30c305e60e028cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.613765', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a710ceca-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': '84b3d98acf698418af025f7370b21bc8f13370388883650982fc97a6e8e16682'}]}, 'timestamp': '2025-11-22 07:48:36.641910', '_unique_id': '3e45a7c565564e918e9845d3eadc96a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.643 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.643 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.643 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiagnosticsTest-server-1416418599>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiagnosticsTest-server-1416418599>]
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.644 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.644 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.644 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '620d9b30-fd63-4e28-943f-a846640b29a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.644074', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a7112dd4-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': '8d68cdd662d3711d18945ed1dda31126dd9f678b90246e93e170b1780e7e5602'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.644074', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a7114d28-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': '64e25b9ad28dcb9684ea80b4383cf847a2b00cc2e8d5e3770844bd33ff830030'}]}, 'timestamp': '2025-11-22 07:48:36.645132', '_unique_id': 'd876fdf45fb444ad9c25faf628fb27d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.645 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.646 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.646 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.read.requests volume: 105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.647 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c14d781a-de41-4155-a707-35d6ff03184f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 105, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.646911', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a7119e72-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': '7da67600ea752045beab7c76a99532a044cfd6b34391b029d5258eb3f71d4452'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': 
None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.646911', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a711ab74-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': 'f53bfc8cc3f19303a3a85f9c1a20fc33d9993a0e866ea2b733aa520fb2ca08a2'}]}, 'timestamp': '2025-11-22 07:48:36.647520', '_unique_id': '122858863ee6406e8853580e5b60135c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.648 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.649 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.649 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.649 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.649 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiagnosticsTest-server-1416418599>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiagnosticsTest-server-1416418599>]
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.650 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.read.latency volume: 202811493 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.650 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.read.latency volume: 27135825 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a26abe8-01c2-4680-9ebb-7aa7cdbb373e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 202811493, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.650127', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a7121dfc-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': '02ac6a7804314ec1c6485d15544f1ed3c2760e1c6d657eee7f507bb249da2913'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27135825, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': 
None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.650127', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a7123116-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': 'ea760b6003c86b71fb32466f2e4a70c16edabf5b70aa9b36b8aae57d8393e253'}]}, 'timestamp': '2025-11-22 07:48:36.650938', '_unique_id': '4804158ec3164b7fbcfa76bb99faa24e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.652 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.661 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.662 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e85165c-7402-442e-bdb5-244bdea3d2ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.652706', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a713ea10-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.34451597, 'message_signature': '1199c0ff6673e34d365b2647ecd2dee4df294e088a6800e2c003bf7f898887a0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.652706', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a713f64a-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.34451597, 'message_signature': '4d331a159de870373e544aaa7244647318e306060b8972fcf57f39658115469c'}]}, 'timestamp': '2025-11-22 07:48:36.662543', '_unique_id': 'ef8754baf00b42738b61e5681df6eaf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.664 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.664 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.664 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiagnosticsTest-server-1416418599>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiagnosticsTest-server-1416418599>]
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.664 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.665 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.665 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bc6a775-24a1-4563-9157-c13b3c687b3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.665143', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a7146936-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.34451597, 'message_signature': 'fe6492d56d22c7eca2ec2a8ec90a5117b4f67d6516f189ba0262cd4f8a7cfbb3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.665143', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a7147624-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.34451597, 'message_signature': '8f431ed9d72e11e3cd4cd59c45c6f8d188e828869fba262ed2a85be32f0379a7'}]}, 'timestamp': '2025-11-22 07:48:36.665826', '_unique_id': 'cef4e8dea6284d4e8fa2583543cf4e29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.667 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.667 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.667 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '318651f7-244a-4bac-8e8f-b796c4acb09f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.667418', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a714bfb2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': '6a224780441e518c92dd6b308a0f901135dffed4186a0ca2ed37fd8413e6d0d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': 
None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.667418', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a714ca20-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': 'f87a82401a0d5f493975948595ff00f2e7a1e289e841cc022c37b977dd67697f'}]}, 'timestamp': '2025-11-22 07:48:36.667955', '_unique_id': 'e79db069ef65451b9840390ec75de098'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.669 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e14a3dbd-f2af-4753-af68-8f76f2b8fd77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.669827', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a7151eda-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.34451597, 'message_signature': '7ed13018780af8ce29cd819d0a13ae5e54b0554ba49206789852edeaf5c593a2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': 
'5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.669827', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a7152bd2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.34451597, 'message_signature': '67c731e7ab9c7710dec322ac205690c98796e39435796442adb64743ad68061b'}]}, 'timestamp': '2025-11-22 07:48:36.670461', '_unique_id': 'd048d040170a445e85925ee892237f19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.672 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.672 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbd7d0ac-5f7c-45b7-9dde-d9096401d300', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722-vda', 'timestamp': '2025-11-22T07:48:36.672514', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a71586c2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': '03c298935ae7243fbb68798c89889136936210f7a942dbbd67144e4329d533aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '57d60f953b0b471ba18aa65d08ed9bb2', 'user_name': None, 'project_id': 'ccefaac5c3ca44b8847d197a5412050b', 'project_name': None, 'resource_id': 
'5225b730-46c6-4fb4-8ada-9f9ea046e722-sda', 'timestamp': '2025-11-22T07:48:36.672514', 'resource_metadata': {'display_name': 'tempest-ServerDiagnosticsTest-server-1416418599', 'name': 'instance-00000029', 'instance_id': '5225b730-46c6-4fb4-8ada-9f9ea046e722', 'instance_type': 'm1.nano', 'host': '63542d5ba318c4196f371bfe792b57a2109112c59c4fe8870a045f76', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a7159068-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4457.305560007, 'message_signature': 'f64db8314d043215d270cab57f53c1bef02256912b2b33005d1a4346f0920555'}]}, 'timestamp': '2025-11-22 07:48:36.673030', '_unique_id': '4b6103997b2d4cc9a60c742f86d41633'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.674 12 DEBUG ceilometer.compute.pollsters [-] 5225b730-46c6-4fb4-8ada-9f9ea046e722/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.674 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 5225b730-46c6-4fb4-8ada-9f9ea046e722: ceilometer.compute.pollsters.NoVolumeException
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.675 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:48:36.675 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiagnosticsTest-server-1416418599>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiagnosticsTest-server-1416418599>]
Nov 22 07:48:37 compute-0 nova_compute[186544]: 2025-11-22 07:48:37.041 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:37 compute-0 nova_compute[186544]: 2025-11-22 07:48:37.041 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:48:37 compute-0 nova_compute[186544]: 2025-11-22 07:48:37.068 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:48:37 compute-0 nova_compute[186544]: 2025-11-22 07:48:37.069 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:37 compute-0 nova_compute[186544]: 2025-11-22 07:48:37.069 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:37 compute-0 nova_compute[186544]: 2025-11-22 07:48:37.079 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:37 compute-0 nova_compute[186544]: 2025-11-22 07:48:37.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:37 compute-0 nova_compute[186544]: 2025-11-22 07:48:37.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:48:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:37.317 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:37.317 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:37.317 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.157 186548 DEBUG nova.compute.manager [None req-70b8d195-62d6-4a4f-bcbc-b58723272178 d56774ad36f44e3ebf1574f41befaf35 1e9d8188c9ef41a9a96c7c201143d68a - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.160 186548 INFO nova.compute.manager [None req-70b8d195-62d6-4a4f-bcbc-b58723272178 d56774ad36f44e3ebf1574f41befaf35 1e9d8188c9ef41a9a96c7c201143d68a - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Retrieving diagnostics
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.481 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "5225b730-46c6-4fb4-8ada-9f9ea046e722" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.481 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "5225b730-46c6-4fb4-8ada-9f9ea046e722" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.481 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "5225b730-46c6-4fb4-8ada-9f9ea046e722-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.482 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "5225b730-46c6-4fb4-8ada-9f9ea046e722-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.482 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "5225b730-46c6-4fb4-8ada-9f9ea046e722-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.489 186548 INFO nova.compute.manager [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Terminating instance
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.497 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "refresh_cache-5225b730-46c6-4fb4-8ada-9f9ea046e722" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.497 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquired lock "refresh_cache-5225b730-46c6-4fb4-8ada-9f9ea046e722" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.497 186548 DEBUG nova.network.neutron [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:48:38 compute-0 nova_compute[186544]: 2025-11-22 07:48:38.676 186548 DEBUG nova.network.neutron [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:48:39 compute-0 podman[219658]: 2025-11-22 07:48:39.414744001 +0000 UTC m=+0.061069575 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.580 186548 DEBUG nova.network.neutron [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.605 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Releasing lock "refresh_cache-5225b730-46c6-4fb4-8ada-9f9ea046e722" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.606 186548 DEBUG nova.compute.manager [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:48:39 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000029.scope: Deactivated successfully.
Nov 22 07:48:39 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000029.scope: Consumed 4.273s CPU time.
Nov 22 07:48:39 compute-0 systemd-machined[152872]: Machine qemu-21-instance-00000029 terminated.
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.844 186548 INFO nova.virt.libvirt.driver [-] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Instance destroyed successfully.
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.844 186548 DEBUG nova.objects.instance [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lazy-loading 'resources' on Instance uuid 5225b730-46c6-4fb4-8ada-9f9ea046e722 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.857 186548 INFO nova.virt.libvirt.driver [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Deleting instance files /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722_del
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.858 186548 INFO nova.virt.libvirt.driver [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Deletion of /var/lib/nova/instances/5225b730-46c6-4fb4-8ada-9f9ea046e722_del complete
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.940 186548 INFO nova.compute.manager [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.942 186548 DEBUG oslo.service.loopingcall [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.942 186548 DEBUG nova.compute.manager [-] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:48:39 compute-0 nova_compute[186544]: 2025-11-22 07:48:39.942 186548 DEBUG nova.network.neutron [-] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.175 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.176 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.176 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.252 186548 DEBUG nova.network.neutron [-] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.253 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.266 186548 DEBUG nova.network.neutron [-] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.278 186548 INFO nova.compute.manager [-] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Took 0.34 seconds to deallocate network for instance.
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.353 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.353 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.426 186548 DEBUG nova.compute.provider_tree [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.441 186548 DEBUG nova.scheduler.client.report [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.463 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.500 186548 INFO nova.scheduler.client.report [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Deleted allocations for instance 5225b730-46c6-4fb4-8ada-9f9ea046e722
Nov 22 07:48:40 compute-0 nova_compute[186544]: 2025-11-22 07:48:40.584 186548 DEBUG oslo_concurrency.lockutils [None req-fee42ba7-9895-46ef-899e-869ee36939cf 57d60f953b0b471ba18aa65d08ed9bb2 ccefaac5c3ca44b8847d197a5412050b - - default default] Lock "5225b730-46c6-4fb4-8ada-9f9ea046e722" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:42 compute-0 nova_compute[186544]: 2025-11-22 07:48:42.081 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:43 compute-0 nova_compute[186544]: 2025-11-22 07:48:43.188 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:43 compute-0 nova_compute[186544]: 2025-11-22 07:48:43.189 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:43.939 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:48:43 compute-0 nova_compute[186544]: 2025-11-22 07:48:43.940 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:43.940 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:48:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:43.941 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:44 compute-0 podman[219687]: 2025-11-22 07:48:44.404954129 +0000 UTC m=+0.051724627 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 07:48:45 compute-0 nova_compute[186544]: 2025-11-22 07:48:45.253 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:46 compute-0 nova_compute[186544]: 2025-11-22 07:48:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:48:47 compute-0 nova_compute[186544]: 2025-11-22 07:48:47.082 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:48 compute-0 podman[219711]: 2025-11-22 07:48:48.635165294 +0000 UTC m=+0.073785446 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.255 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.543 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "b6ae104e-3174-402f-856b-f4c9156f02c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.543 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.567 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.657 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.658 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.665 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.665 186548 INFO nova.compute.claims [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.801 186548 DEBUG nova.compute.provider_tree [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.829 186548 DEBUG nova.scheduler.client.report [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.861 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.862 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.971 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.971 186548 DEBUG nova.network.neutron [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:48:50 compute-0 nova_compute[186544]: 2025-11-22 07:48:50.995 186548 INFO nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.018 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.174 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.175 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.176 186548 INFO nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Creating image(s)
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.176 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "/var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.177 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "/var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.177 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "/var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.191 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.256 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.257 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.258 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.273 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.328 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.329 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.394 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk 1073741824" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.395 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.396 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.452 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.453 186548 DEBUG nova.virt.disk.api [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Checking if we can resize image /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.454 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.492 186548 DEBUG nova.policy [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '360aa32f5cfd48598c025d765dc21aa5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f7ba3637e7e04fff9deaded2ed375c3a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.511 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.512 186548 DEBUG nova.virt.disk.api [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Cannot resize image /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.512 186548 DEBUG nova.objects.instance [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lazy-loading 'migration_context' on Instance uuid b6ae104e-3174-402f-856b-f4c9156f02c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.531 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.532 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Ensure instance console log exists: /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.532 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.532 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:51 compute-0 nova_compute[186544]: 2025-11-22 07:48:51.533 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:52 compute-0 nova_compute[186544]: 2025-11-22 07:48:52.086 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:52 compute-0 nova_compute[186544]: 2025-11-22 07:48:52.483 186548 DEBUG nova.network.neutron [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Successfully created port: e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:48:52 compute-0 ovn_controller[94843]: 2025-11-22T07:48:52Z|00180|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.629 186548 DEBUG nova.network.neutron [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Successfully updated port: e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.647 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.647 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquired lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.647 186548 DEBUG nova.network.neutron [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.793 186548 DEBUG nova.compute.manager [req-1c24d9c1-3ab3-421f-91c1-f5b7d92b2ce3 req-ac37db19-84d5-454f-a2e9-748def2ec527 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Received event network-changed-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.793 186548 DEBUG nova.compute.manager [req-1c24d9c1-3ab3-421f-91c1-f5b7d92b2ce3 req-ac37db19-84d5-454f-a2e9-748def2ec527 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Refreshing instance network info cache due to event network-changed-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.793 186548 DEBUG oslo_concurrency.lockutils [req-1c24d9c1-3ab3-421f-91c1-f5b7d92b2ce3 req-ac37db19-84d5-454f-a2e9-748def2ec527 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.843 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797719.8426974, 5225b730-46c6-4fb4-8ada-9f9ea046e722 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.844 186548 INFO nova.compute.manager [-] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] VM Stopped (Lifecycle Event)
Nov 22 07:48:54 compute-0 nova_compute[186544]: 2025-11-22 07:48:54.877 186548 DEBUG nova.compute.manager [None req-243ec406-cb6b-4d8a-9368-1bae6519b2a4 - - - - - -] [instance: 5225b730-46c6-4fb4-8ada-9f9ea046e722] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:55 compute-0 nova_compute[186544]: 2025-11-22 07:48:55.181 186548 DEBUG nova.network.neutron [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:48:55 compute-0 nova_compute[186544]: 2025-11-22 07:48:55.257 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.436 186548 DEBUG nova.network.neutron [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Updating instance_info_cache with network_info: [{"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.487 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Releasing lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.488 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Instance network_info: |[{"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.489 186548 DEBUG oslo_concurrency.lockutils [req-1c24d9c1-3ab3-421f-91c1-f5b7d92b2ce3 req-ac37db19-84d5-454f-a2e9-748def2ec527 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.489 186548 DEBUG nova.network.neutron [req-1c24d9c1-3ab3-421f-91c1-f5b7d92b2ce3 req-ac37db19-84d5-454f-a2e9-748def2ec527 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Refreshing network info cache for port e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.492 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Start _get_guest_xml network_info=[{"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.497 186548 WARNING nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.503 186548 DEBUG nova.virt.libvirt.host [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.504 186548 DEBUG nova.virt.libvirt.host [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.507 186548 DEBUG nova.virt.libvirt.host [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.507 186548 DEBUG nova.virt.libvirt.host [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.508 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.508 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.509 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.509 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.509 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.510 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.510 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.510 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.510 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.511 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.511 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.511 186548 DEBUG nova.virt.hardware [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.514 186548 DEBUG nova.virt.libvirt.vif [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:48:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1307647210',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1307647210',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-130764721',id=42,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7ba3637e7e04fff9deaded2ed375c3a',ramdisk_id='',reservation_id='r-8dqd3qgz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeT
estJSON-1416697517',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1416697517-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:48:51Z,user_data=None,user_id='360aa32f5cfd48598c025d765dc21aa5',uuid=b6ae104e-3174-402f-856b-f4c9156f02c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.515 186548 DEBUG nova.network.os_vif_util [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Converting VIF {"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.516 186548 DEBUG nova.network.os_vif_util [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:2c:71,bridge_name='br-int',has_traffic_filtering=True,id=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb,network=Network(dff9a519-6da1-447f-b8be-904df7d55a6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape95ab2f5-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.516 186548 DEBUG nova.objects.instance [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lazy-loading 'pci_devices' on Instance uuid b6ae104e-3174-402f-856b-f4c9156f02c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.529 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <uuid>b6ae104e-3174-402f-856b-f4c9156f02c9</uuid>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <name>instance-0000002a</name>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-1307647210</nova:name>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:48:56</nova:creationTime>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:48:56 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:48:56 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:48:56 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:48:56 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:48:56 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:48:56 compute-0 nova_compute[186544]:         <nova:user uuid="360aa32f5cfd48598c025d765dc21aa5">tempest-FloatingIPsAssociationNegativeTestJSON-1416697517-project-member</nova:user>
Nov 22 07:48:56 compute-0 nova_compute[186544]:         <nova:project uuid="f7ba3637e7e04fff9deaded2ed375c3a">tempest-FloatingIPsAssociationNegativeTestJSON-1416697517</nova:project>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:48:56 compute-0 nova_compute[186544]:         <nova:port uuid="e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb">
Nov 22 07:48:56 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <system>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <entry name="serial">b6ae104e-3174-402f-856b-f4c9156f02c9</entry>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <entry name="uuid">b6ae104e-3174-402f-856b-f4c9156f02c9</entry>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </system>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <os>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   </os>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <features>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   </features>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk.config"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d0:2c:71"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <target dev="tape95ab2f5-6f"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/console.log" append="off"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <video>
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </video>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:48:56 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:48:56 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:48:56 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:48:56 compute-0 nova_compute[186544]: </domain>
Nov 22 07:48:56 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.531 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Preparing to wait for external event network-vif-plugged-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.531 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.531 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.532 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.532 186548 DEBUG nova.virt.libvirt.vif [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:48:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1307647210',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1307647210',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-130764721',id=42,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7ba3637e7e04fff9deaded2ed375c3a',ramdisk_id='',reservation_id='r-8dqd3qgz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociatio
nNegativeTestJSON-1416697517',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1416697517-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:48:51Z,user_data=None,user_id='360aa32f5cfd48598c025d765dc21aa5',uuid=b6ae104e-3174-402f-856b-f4c9156f02c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.533 186548 DEBUG nova.network.os_vif_util [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Converting VIF {"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.533 186548 DEBUG nova.network.os_vif_util [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:2c:71,bridge_name='br-int',has_traffic_filtering=True,id=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb,network=Network(dff9a519-6da1-447f-b8be-904df7d55a6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape95ab2f5-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.534 186548 DEBUG os_vif [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:2c:71,bridge_name='br-int',has_traffic_filtering=True,id=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb,network=Network(dff9a519-6da1-447f-b8be-904df7d55a6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape95ab2f5-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.534 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.535 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.535 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.538 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.539 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape95ab2f5-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.539 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape95ab2f5-6f, col_values=(('external_ids', {'iface-id': 'e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:2c:71', 'vm-uuid': 'b6ae104e-3174-402f-856b-f4c9156f02c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.541 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:56 compute-0 NetworkManager[55036]: <info>  [1763797736.5421] manager: (tape95ab2f5-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.544 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.548 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.549 186548 INFO os_vif [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:2c:71,bridge_name='br-int',has_traffic_filtering=True,id=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb,network=Network(dff9a519-6da1-447f-b8be-904df7d55a6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape95ab2f5-6f')
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.758 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.759 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.759 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] No VIF found with MAC fa:16:3e:d0:2c:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:48:56 compute-0 nova_compute[186544]: 2025-11-22 07:48:56.759 186548 INFO nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Using config drive
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.395 186548 INFO nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Creating config drive at /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk.config
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.400 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpze3ew2cb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.527 186548 DEBUG oslo_concurrency.processutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpze3ew2cb" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:48:57 compute-0 kernel: tape95ab2f5-6f: entered promiscuous mode
Nov 22 07:48:57 compute-0 NetworkManager[55036]: <info>  [1763797737.5914] manager: (tape95ab2f5-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 22 07:48:57 compute-0 ovn_controller[94843]: 2025-11-22T07:48:57Z|00181|binding|INFO|Claiming lport e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb for this chassis.
Nov 22 07:48:57 compute-0 ovn_controller[94843]: 2025-11-22T07:48:57Z|00182|binding|INFO|e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb: Claiming fa:16:3e:d0:2c:71 10.100.0.14
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.591 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.598 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:57 compute-0 systemd-udevd[219766]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:48:57 compute-0 systemd-machined[152872]: New machine qemu-22-instance-0000002a.
Nov 22 07:48:57 compute-0 NetworkManager[55036]: <info>  [1763797737.6391] device (tape95ab2f5-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:48:57 compute-0 NetworkManager[55036]: <info>  [1763797737.6408] device (tape95ab2f5-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.653 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:57 compute-0 ovn_controller[94843]: 2025-11-22T07:48:57Z|00183|binding|INFO|Setting lport e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb ovn-installed in OVS
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.659 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:57 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000002a.
Nov 22 07:48:57 compute-0 ovn_controller[94843]: 2025-11-22T07:48:57Z|00184|binding|INFO|Setting lport e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb up in Southbound
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.700 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:2c:71 10.100.0.14'], port_security=['fa:16:3e:d0:2c:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b6ae104e-3174-402f-856b-f4c9156f02c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dff9a519-6da1-447f-b8be-904df7d55a6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7ba3637e7e04fff9deaded2ed375c3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2184d4ea-1da1-4bba-8650-39bb1b3ade24', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1f01d26-8c6c-4bfe-840a-5ebb96fa3baf, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.701 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb in datapath dff9a519-6da1-447f-b8be-904df7d55a6d bound to our chassis
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.702 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dff9a519-6da1-447f-b8be-904df7d55a6d
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.713 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5a80d824-a030-4bac-9d3c-2dd9a2be6973]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.715 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdff9a519-61 in ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.717 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdff9a519-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.717 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[823ea700-d4f1-4c23-baca-b5ab54b42dc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.718 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[93ab953b-6312-4100-b94e-7511a11e90d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.729 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[2db87d3e-1270-49cf-820a-8b5dea3dc508]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.744 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa1f090-de7a-404e-98c9-9a46573641e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.771 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d1dee0bc-64d1-4dee-9cd3-eadc2f4a3d64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 NetworkManager[55036]: <info>  [1763797737.7779] manager: (tapdff9a519-60): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Nov 22 07:48:57 compute-0 systemd-udevd[219769]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.777 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[12bbc0fc-7802-45b6-9085-50afa28f1e31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.811 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[db7efba1-b175-4e21-9dcc-38f05c1a789f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.814 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[11ff06a2-f7f5-44a1-af3b-677aafb26a56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 NetworkManager[55036]: <info>  [1763797737.8385] device (tapdff9a519-60): carrier: link connected
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.844 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[543b09cc-3e01-4a81-b7c5-b223c929685d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.859 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f672f9-c48f-4177-b478-e32eb85c76e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdff9a519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:11:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 447847, 'reachable_time': 35170, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219800, 'error': None, 'target': 'ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.873 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[715675f7-2b88-4b04-8b2a-9eea6df93221]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef3:1126'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 447847, 'tstamp': 447847}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219801, 'error': None, 'target': 'ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.889 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b1737007-515f-4d40-b837-ec03fd8599b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdff9a519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:11:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 447847, 'reachable_time': 35170, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219802, 'error': None, 'target': 'ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.923 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b720bc07-fad2-419d-bc72-04ba479a814c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.979 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[342fcab1-50f4-469c-bf54-6653d84a9e23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.981 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdff9a519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.981 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.981 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdff9a519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.983 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:57 compute-0 kernel: tapdff9a519-60: entered promiscuous mode
Nov 22 07:48:57 compute-0 NetworkManager[55036]: <info>  [1763797737.9842] manager: (tapdff9a519-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 22 07:48:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.986 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdff9a519-60, col_values=(('external_ids', {'iface-id': '6e81866f-95d5-405a-9bba-3f7ec7f5ffd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.985 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:57 compute-0 ovn_controller[94843]: 2025-11-22T07:48:57Z|00185|binding|INFO|Releasing lport 6e81866f-95d5-405a-9bba-3f7ec7f5ffd0 from this chassis (sb_readonly=0)
Nov 22 07:48:57 compute-0 nova_compute[186544]: 2025-11-22 07:48:57.998 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:57.999 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dff9a519-6da1-447f-b8be-904df7d55a6d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dff9a519-6da1-447f-b8be-904df7d55a6d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:58.000 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9ade9d1e-7b74-4a0f-9e2f-6cbf549bd2b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:58.001 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-dff9a519-6da1-447f-b8be-904df7d55a6d
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/dff9a519-6da1-447f-b8be-904df7d55a6d.pid.haproxy
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID dff9a519-6da1-447f-b8be-904df7d55a6d
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:48:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:48:58.002 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d', 'env', 'PROCESS_TAG=haproxy-dff9a519-6da1-447f-b8be-904df7d55a6d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dff9a519-6da1-447f-b8be-904df7d55a6d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.059 186548 DEBUG nova.compute.manager [req-91eafd88-7e40-49f8-bb2a-2f68469db06b req-65fe67ea-396f-4ee0-b4ed-6f0484452d3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Received event network-vif-plugged-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.060 186548 DEBUG oslo_concurrency.lockutils [req-91eafd88-7e40-49f8-bb2a-2f68469db06b req-65fe67ea-396f-4ee0-b4ed-6f0484452d3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.060 186548 DEBUG oslo_concurrency.lockutils [req-91eafd88-7e40-49f8-bb2a-2f68469db06b req-65fe67ea-396f-4ee0-b4ed-6f0484452d3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.061 186548 DEBUG oslo_concurrency.lockutils [req-91eafd88-7e40-49f8-bb2a-2f68469db06b req-65fe67ea-396f-4ee0-b4ed-6f0484452d3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.061 186548 DEBUG nova.compute.manager [req-91eafd88-7e40-49f8-bb2a-2f68469db06b req-65fe67ea-396f-4ee0-b4ed-6f0484452d3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Processing event network-vif-plugged-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.177 186548 DEBUG nova.network.neutron [req-1c24d9c1-3ab3-421f-91c1-f5b7d92b2ce3 req-ac37db19-84d5-454f-a2e9-748def2ec527 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Updated VIF entry in instance network info cache for port e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.178 186548 DEBUG nova.network.neutron [req-1c24d9c1-3ab3-421f-91c1-f5b7d92b2ce3 req-ac37db19-84d5-454f-a2e9-748def2ec527 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Updating instance_info_cache with network_info: [{"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.190 186548 DEBUG oslo_concurrency.lockutils [req-1c24d9c1-3ab3-421f-91c1-f5b7d92b2ce3 req-ac37db19-84d5-454f-a2e9-748def2ec527 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:48:58 compute-0 podman[219834]: 2025-11-22 07:48:58.451843347 +0000 UTC m=+0.105913335 container create b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:48:58 compute-0 podman[219834]: 2025-11-22 07:48:58.368822876 +0000 UTC m=+0.022892854 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:48:58 compute-0 systemd[1]: Started libpod-conmon-b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a.scope.
Nov 22 07:48:58 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:48:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc66f86825fd7b8d3a809884d7a117af9d0c5ca66894df817dfbb615ed7107b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:48:58 compute-0 podman[219834]: 2025-11-22 07:48:58.551832103 +0000 UTC m=+0.205902081 container init b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 07:48:58 compute-0 podman[219847]: 2025-11-22 07:48:58.552694005 +0000 UTC m=+0.066694400 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes 
Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 07:48:58 compute-0 podman[219834]: 2025-11-22 07:48:58.558859286 +0000 UTC m=+0.212929244 container start b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 07:48:58 compute-0 podman[219848]: 2025-11-22 07:48:58.575019704 +0000 UTC m=+0.086683292 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 07:48:58 compute-0 neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d[219868]: [NOTICE]   (219892) : New worker (219897) forked
Nov 22 07:48:58 compute-0 neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d[219868]: [NOTICE]   (219892) : Loading success.
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.719 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797738.7187397, b6ae104e-3174-402f-856b-f4c9156f02c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.720 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] VM Started (Lifecycle Event)
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.722 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.725 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.729 186548 INFO nova.virt.libvirt.driver [-] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Instance spawned successfully.
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.729 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.736 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.740 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.748 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.748 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.749 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.749 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.750 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.750 186548 DEBUG nova.virt.libvirt.driver [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.754 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.754 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797738.7196338, b6ae104e-3174-402f-856b-f4c9156f02c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.755 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] VM Paused (Lifecycle Event)
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.776 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.779 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797738.7249022, b6ae104e-3174-402f-856b-f4c9156f02c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.780 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] VM Resumed (Lifecycle Event)
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.793 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.796 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.810 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.905 186548 INFO nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Took 7.73 seconds to spawn the instance on the hypervisor.
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.905 186548 DEBUG nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:48:58 compute-0 nova_compute[186544]: 2025-11-22 07:48:58.995 186548 INFO nova.compute.manager [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Took 8.38 seconds to build instance.
Nov 22 07:48:59 compute-0 nova_compute[186544]: 2025-11-22 07:48:59.046 186548 DEBUG oslo_concurrency.lockutils [None req-cc3b0cf4-b42d-46f4-ae3c-ac4cd08be89a 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.503s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:00 compute-0 nova_compute[186544]: 2025-11-22 07:49:00.231 186548 DEBUG nova.compute.manager [req-d69ca6fd-7516-4df7-a77f-94991998502f req-80d8c5bd-fd2e-4b1f-b688-2b47c38d1c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Received event network-vif-plugged-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:00 compute-0 nova_compute[186544]: 2025-11-22 07:49:00.232 186548 DEBUG oslo_concurrency.lockutils [req-d69ca6fd-7516-4df7-a77f-94991998502f req-80d8c5bd-fd2e-4b1f-b688-2b47c38d1c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:00 compute-0 nova_compute[186544]: 2025-11-22 07:49:00.233 186548 DEBUG oslo_concurrency.lockutils [req-d69ca6fd-7516-4df7-a77f-94991998502f req-80d8c5bd-fd2e-4b1f-b688-2b47c38d1c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:00 compute-0 nova_compute[186544]: 2025-11-22 07:49:00.233 186548 DEBUG oslo_concurrency.lockutils [req-d69ca6fd-7516-4df7-a77f-94991998502f req-80d8c5bd-fd2e-4b1f-b688-2b47c38d1c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:00 compute-0 nova_compute[186544]: 2025-11-22 07:49:00.234 186548 DEBUG nova.compute.manager [req-d69ca6fd-7516-4df7-a77f-94991998502f req-80d8c5bd-fd2e-4b1f-b688-2b47c38d1c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] No waiting events found dispatching network-vif-plugged-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:49:00 compute-0 nova_compute[186544]: 2025-11-22 07:49:00.234 186548 WARNING nova.compute.manager [req-d69ca6fd-7516-4df7-a77f-94991998502f req-80d8c5bd-fd2e-4b1f-b688-2b47c38d1c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Received unexpected event network-vif-plugged-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb for instance with vm_state active and task_state None.
Nov 22 07:49:00 compute-0 nova_compute[186544]: 2025-11-22 07:49:00.259 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:01 compute-0 nova_compute[186544]: 2025-11-22 07:49:01.541 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:03 compute-0 podman[219913]: 2025-11-22 07:49:03.402303308 +0000 UTC m=+0.050081072 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:49:05 compute-0 nova_compute[186544]: 2025-11-22 07:49:05.260 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:05 compute-0 podman[219937]: 2025-11-22 07:49:05.395175554 +0000 UTC m=+0.044430743 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:49:06 compute-0 NetworkManager[55036]: <info>  [1763797746.0546] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Nov 22 07:49:06 compute-0 NetworkManager[55036]: <info>  [1763797746.0554] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.053 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:06 compute-0 ovn_controller[94843]: 2025-11-22T07:49:06Z|00186|binding|INFO|Releasing lport 6e81866f-95d5-405a-9bba-3f7ec7f5ffd0 from this chassis (sb_readonly=0)
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.081 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:06 compute-0 ovn_controller[94843]: 2025-11-22T07:49:06Z|00187|binding|INFO|Releasing lport 6e81866f-95d5-405a-9bba-3f7ec7f5ffd0 from this chassis (sb_readonly=0)
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.543 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.616 186548 DEBUG nova.compute.manager [req-f97a6423-5ee2-4c66-8ac4-a90da59d68b8 req-8581dbdc-9225-45d6-9a76-ff56d09c5b9d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Received event network-changed-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.617 186548 DEBUG nova.compute.manager [req-f97a6423-5ee2-4c66-8ac4-a90da59d68b8 req-8581dbdc-9225-45d6-9a76-ff56d09c5b9d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Refreshing instance network info cache due to event network-changed-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.617 186548 DEBUG oslo_concurrency.lockutils [req-f97a6423-5ee2-4c66-8ac4-a90da59d68b8 req-8581dbdc-9225-45d6-9a76-ff56d09c5b9d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.617 186548 DEBUG oslo_concurrency.lockutils [req-f97a6423-5ee2-4c66-8ac4-a90da59d68b8 req-8581dbdc-9225-45d6-9a76-ff56d09c5b9d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:49:06 compute-0 nova_compute[186544]: 2025-11-22 07:49:06.618 186548 DEBUG nova.network.neutron [req-f97a6423-5ee2-4c66-8ac4-a90da59d68b8 req-8581dbdc-9225-45d6-9a76-ff56d09c5b9d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Refreshing network info cache for port e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:49:08 compute-0 nova_compute[186544]: 2025-11-22 07:49:08.443 186548 DEBUG nova.network.neutron [req-f97a6423-5ee2-4c66-8ac4-a90da59d68b8 req-8581dbdc-9225-45d6-9a76-ff56d09c5b9d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Updated VIF entry in instance network info cache for port e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:49:08 compute-0 nova_compute[186544]: 2025-11-22 07:49:08.444 186548 DEBUG nova.network.neutron [req-f97a6423-5ee2-4c66-8ac4-a90da59d68b8 req-8581dbdc-9225-45d6-9a76-ff56d09c5b9d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Updating instance_info_cache with network_info: [{"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:49:08 compute-0 nova_compute[186544]: 2025-11-22 07:49:08.462 186548 DEBUG oslo_concurrency.lockutils [req-f97a6423-5ee2-4c66-8ac4-a90da59d68b8 req-8581dbdc-9225-45d6-9a76-ff56d09c5b9d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:49:10 compute-0 nova_compute[186544]: 2025-11-22 07:49:10.261 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:10 compute-0 podman[219957]: 2025-11-22 07:49:10.412218991 +0000 UTC m=+0.057990696 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 07:49:11 compute-0 nova_compute[186544]: 2025-11-22 07:49:11.545 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:11 compute-0 nova_compute[186544]: 2025-11-22 07:49:11.804 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:14 compute-0 ovn_controller[94843]: 2025-11-22T07:49:14Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:2c:71 10.100.0.14
Nov 22 07:49:14 compute-0 ovn_controller[94843]: 2025-11-22T07:49:14Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:2c:71 10.100.0.14
Nov 22 07:49:14 compute-0 nova_compute[186544]: 2025-11-22 07:49:14.646 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:15 compute-0 nova_compute[186544]: 2025-11-22 07:49:15.263 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:15 compute-0 podman[219996]: 2025-11-22 07:49:15.430414258 +0000 UTC m=+0.073087587 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 07:49:16 compute-0 nova_compute[186544]: 2025-11-22 07:49:16.548 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:18 compute-0 nova_compute[186544]: 2025-11-22 07:49:18.109 186548 DEBUG nova.compute.manager [req-e10e2436-0e2b-431f-95ba-795a6451846e req-9f21ba66-2634-4c8c-9fcc-cbb8d457bcc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Received event network-changed-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:18 compute-0 nova_compute[186544]: 2025-11-22 07:49:18.110 186548 DEBUG nova.compute.manager [req-e10e2436-0e2b-431f-95ba-795a6451846e req-9f21ba66-2634-4c8c-9fcc-cbb8d457bcc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Refreshing instance network info cache due to event network-changed-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:49:18 compute-0 nova_compute[186544]: 2025-11-22 07:49:18.110 186548 DEBUG oslo_concurrency.lockutils [req-e10e2436-0e2b-431f-95ba-795a6451846e req-9f21ba66-2634-4c8c-9fcc-cbb8d457bcc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:49:18 compute-0 nova_compute[186544]: 2025-11-22 07:49:18.110 186548 DEBUG oslo_concurrency.lockutils [req-e10e2436-0e2b-431f-95ba-795a6451846e req-9f21ba66-2634-4c8c-9fcc-cbb8d457bcc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:49:18 compute-0 nova_compute[186544]: 2025-11-22 07:49:18.111 186548 DEBUG nova.network.neutron [req-e10e2436-0e2b-431f-95ba-795a6451846e req-9f21ba66-2634-4c8c-9fcc-cbb8d457bcc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Refreshing network info cache for port e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:49:18 compute-0 nova_compute[186544]: 2025-11-22 07:49:18.606 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:18 compute-0 nova_compute[186544]: 2025-11-22 07:49:18.898 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:19 compute-0 podman[220020]: 2025-11-22 07:49:19.409832984 +0000 UTC m=+0.058588830 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 22 07:49:19 compute-0 nova_compute[186544]: 2025-11-22 07:49:19.621 186548 DEBUG nova.network.neutron [req-e10e2436-0e2b-431f-95ba-795a6451846e req-9f21ba66-2634-4c8c-9fcc-cbb8d457bcc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Updated VIF entry in instance network info cache for port e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:49:19 compute-0 nova_compute[186544]: 2025-11-22 07:49:19.621 186548 DEBUG nova.network.neutron [req-e10e2436-0e2b-431f-95ba-795a6451846e req-9f21ba66-2634-4c8c-9fcc-cbb8d457bcc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Updating instance_info_cache with network_info: [{"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:49:19 compute-0 nova_compute[186544]: 2025-11-22 07:49:19.636 186548 DEBUG oslo_concurrency.lockutils [req-e10e2436-0e2b-431f-95ba-795a6451846e req-9f21ba66-2634-4c8c-9fcc-cbb8d457bcc2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b6ae104e-3174-402f-856b-f4c9156f02c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:49:20 compute-0 nova_compute[186544]: 2025-11-22 07:49:20.265 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:21 compute-0 nova_compute[186544]: 2025-11-22 07:49:21.592 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.135 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "b6ae104e-3174-402f-856b-f4c9156f02c9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.136 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.136 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.137 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.138 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.147 186548 INFO nova.compute.manager [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Terminating instance
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.156 186548 DEBUG nova.compute.manager [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:49:22 compute-0 kernel: tape95ab2f5-6f (unregistering): left promiscuous mode
Nov 22 07:49:22 compute-0 NetworkManager[55036]: <info>  [1763797762.1893] device (tape95ab2f5-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:49:22 compute-0 ovn_controller[94843]: 2025-11-22T07:49:22Z|00188|binding|INFO|Releasing lport e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb from this chassis (sb_readonly=0)
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.198 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 ovn_controller[94843]: 2025-11-22T07:49:22Z|00189|binding|INFO|Setting lport e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb down in Southbound
Nov 22 07:49:22 compute-0 ovn_controller[94843]: 2025-11-22T07:49:22Z|00190|binding|INFO|Removing iface tape95ab2f5-6f ovn-installed in OVS
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.215 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Nov 22 07:49:22 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002a.scope: Consumed 15.484s CPU time.
Nov 22 07:49:22 compute-0 systemd-machined[152872]: Machine qemu-22-instance-0000002a terminated.
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.276 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:2c:71 10.100.0.14'], port_security=['fa:16:3e:d0:2c:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b6ae104e-3174-402f-856b-f4c9156f02c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dff9a519-6da1-447f-b8be-904df7d55a6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f7ba3637e7e04fff9deaded2ed375c3a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2184d4ea-1da1-4bba-8650-39bb1b3ade24', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1f01d26-8c6c-4bfe-840a-5ebb96fa3baf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.278 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb in datapath dff9a519-6da1-447f-b8be-904df7d55a6d unbound from our chassis
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.280 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dff9a519-6da1-447f-b8be-904df7d55a6d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.281 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb9daf8-12f3-444c-b731-333590d99955]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.282 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d namespace which is not needed anymore
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.412 186548 INFO nova.virt.libvirt.driver [-] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Instance destroyed successfully.
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.413 186548 DEBUG nova.objects.instance [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lazy-loading 'resources' on Instance uuid b6ae104e-3174-402f-856b-f4c9156f02c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.427 186548 DEBUG nova.virt.libvirt.vif [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:48:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-1307647210',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-1307647210',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-130764721',id=42,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:48:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f7ba3637e7e04fff9deaded2ed375c3a',ramdisk_id='',reservation_id='r-8dqd3qgz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-1416697517',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1416697517-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:48:58Z,user_data=None,user_id='360aa32f5cfd48598c025d765dc21aa5',uuid=b6ae104e-3174-402f-856b-f4c9156f02c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.427 186548 DEBUG nova.network.os_vif_util [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Converting VIF {"id": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "address": "fa:16:3e:d0:2c:71", "network": {"id": "dff9a519-6da1-447f-b8be-904df7d55a6d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1145359980-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f7ba3637e7e04fff9deaded2ed375c3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape95ab2f5-6f", "ovs_interfaceid": "e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.428 186548 DEBUG nova.network.os_vif_util [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d0:2c:71,bridge_name='br-int',has_traffic_filtering=True,id=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb,network=Network(dff9a519-6da1-447f-b8be-904df7d55a6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape95ab2f5-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.428 186548 DEBUG os_vif [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:2c:71,bridge_name='br-int',has_traffic_filtering=True,id=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb,network=Network(dff9a519-6da1-447f-b8be-904df7d55a6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape95ab2f5-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.430 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape95ab2f5-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.431 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.432 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.434 186548 INFO os_vif [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:2c:71,bridge_name='br-int',has_traffic_filtering=True,id=e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb,network=Network(dff9a519-6da1-447f-b8be-904df7d55a6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape95ab2f5-6f')
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.435 186548 INFO nova.virt.libvirt.driver [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Deleting instance files /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9_del
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.436 186548 INFO nova.virt.libvirt.driver [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Deletion of /var/lib/nova/instances/b6ae104e-3174-402f-856b-f4c9156f02c9_del complete
Nov 22 07:49:22 compute-0 neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d[219868]: [NOTICE]   (219892) : haproxy version is 2.8.14-c23fe91
Nov 22 07:49:22 compute-0 neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d[219868]: [NOTICE]   (219892) : path to executable is /usr/sbin/haproxy
Nov 22 07:49:22 compute-0 neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d[219868]: [WARNING]  (219892) : Exiting Master process...
Nov 22 07:49:22 compute-0 neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d[219868]: [ALERT]    (219892) : Current worker (219897) exited with code 143 (Terminated)
Nov 22 07:49:22 compute-0 neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d[219868]: [WARNING]  (219892) : All workers exited. Exiting... (0)
Nov 22 07:49:22 compute-0 systemd[1]: libpod-b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a.scope: Deactivated successfully.
Nov 22 07:49:22 compute-0 podman[220067]: 2025-11-22 07:49:22.531461342 +0000 UTC m=+0.165575550 container died b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.580 186548 INFO nova.compute.manager [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.580 186548 DEBUG oslo.service.loopingcall [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.580 186548 DEBUG nova.compute.manager [-] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.581 186548 DEBUG nova.network.neutron [-] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:49:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a-userdata-shm.mount: Deactivated successfully.
Nov 22 07:49:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-bc66f86825fd7b8d3a809884d7a117af9d0c5ca66894df817dfbb615ed7107b2-merged.mount: Deactivated successfully.
Nov 22 07:49:22 compute-0 podman[220067]: 2025-11-22 07:49:22.631762817 +0000 UTC m=+0.265877015 container cleanup b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 07:49:22 compute-0 systemd[1]: libpod-conmon-b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a.scope: Deactivated successfully.
Nov 22 07:49:22 compute-0 podman[220112]: 2025-11-22 07:49:22.779425466 +0000 UTC m=+0.120933863 container remove b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.786 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d102defa-a139-4ae4-9f4a-d063a21b0493]: (4, ('Sat Nov 22 07:49:22 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d (b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a)\nb82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a\nSat Nov 22 07:49:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d (b82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a)\nb82f90c05392cbae3422cab88aea92e81f9a50dfd1129063457f2956f04c0d5a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.788 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[05394332-6dc4-4a25-9f2f-b05897b31955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.789 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdff9a519-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:22 compute-0 kernel: tapdff9a519-60: left promiscuous mode
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.792 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 nova_compute[186544]: 2025-11-22 07:49:22.804 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.807 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e76d9565-269b-48d1-b9b3-e81e53829358]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.821 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ed3ace54-3186-470c-8510-05b6e61f8ff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.823 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[368d22fb-db78-467d-b89f-ee4e7f5484f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.837 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e02da5ea-8f6b-4600-b8ec-a55756787479]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 447840, 'reachable_time': 31302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220127, 'error': None, 'target': 'ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.840 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dff9a519-6da1-447f-b8be-904df7d55a6d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:49:22 compute-0 systemd[1]: run-netns-ovnmeta\x2ddff9a519\x2d6da1\x2d447f\x2db8be\x2d904df7d55a6d.mount: Deactivated successfully.
Nov 22 07:49:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:22.840 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[60bf560f-ac4b-44ae-92e2-32c0e88ba918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.367 186548 DEBUG nova.network.neutron [-] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.432 186548 INFO nova.compute.manager [-] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Took 0.85 seconds to deallocate network for instance.
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.442 186548 DEBUG nova.compute.manager [req-76cdf990-e7a3-46e5-beaa-54ebd4ac55ce req-4c6362af-898e-4953-843a-44d3f2b73a4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Received event network-vif-deleted-e95ab2f5-6f0d-4673-839c-40b2a2f2e8bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.504 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.505 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.558 186548 DEBUG nova.compute.provider_tree [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.570 186548 DEBUG nova.scheduler.client.report [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.609 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.663 186548 INFO nova.scheduler.client.report [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Deleted allocations for instance b6ae104e-3174-402f-856b-f4c9156f02c9
Nov 22 07:49:23 compute-0 nova_compute[186544]: 2025-11-22 07:49:23.750 186548 DEBUG oslo_concurrency.lockutils [None req-3df5138f-6996-4207-8fe9-9426d8592c16 360aa32f5cfd48598c025d765dc21aa5 f7ba3637e7e04fff9deaded2ed375c3a - - default default] Lock "b6ae104e-3174-402f-856b-f4c9156f02c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:24 compute-0 nova_compute[186544]: 2025-11-22 07:49:24.692 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:24 compute-0 nova_compute[186544]: 2025-11-22 07:49:24.693 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:24 compute-0 nova_compute[186544]: 2025-11-22 07:49:24.712 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.045 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.045 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.058 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.059 186548 INFO nova.compute.claims [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.267 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.504 186548 DEBUG nova.compute.provider_tree [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.515 186548 DEBUG nova.scheduler.client.report [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.557 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.558 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.660 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.660 186548 DEBUG nova.network.neutron [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.681 186548 INFO nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.699 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.813 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.814 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.814 186548 INFO nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Creating image(s)
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.815 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "/var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.815 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "/var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.816 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "/var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.829 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.883 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.884 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.885 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.895 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.950 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.951 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:49:25 compute-0 nova_compute[186544]: 2025-11-22 07:49:25.968 186548 DEBUG nova.policy [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '528e45fb2759463fbdd0d05562f8e46d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1bd2bb14962c40aaa9806867906278a1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.058 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk 1073741824" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.059 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.059 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.120 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.121 186548 DEBUG nova.virt.disk.api [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Checking if we can resize image /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.122 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.180 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.181 186548 DEBUG nova.virt.disk.api [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Cannot resize image /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.181 186548 DEBUG nova.objects.instance [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lazy-loading 'migration_context' on Instance uuid 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.196 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.197 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Ensure instance console log exists: /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.198 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.199 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:26 compute-0 nova_compute[186544]: 2025-11-22 07:49:26.200 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:27 compute-0 nova_compute[186544]: 2025-11-22 07:49:27.433 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:27 compute-0 nova_compute[186544]: 2025-11-22 07:49:27.854 186548 DEBUG nova.network.neutron [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Successfully created port: 132698f3-1aa3-4976-ad25-4fbb0b51ad82 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:49:29 compute-0 podman[220144]: 2025-11-22 07:49:29.417248835 +0000 UTC m=+0.068209818 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:49:29 compute-0 podman[220145]: 2025-11-22 07:49:29.44268795 +0000 UTC m=+0.091367637 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.743 186548 DEBUG nova.network.neutron [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Successfully updated port: 132698f3-1aa3-4976-ad25-4fbb0b51ad82 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.759 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.759 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquired lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.759 186548 DEBUG nova.network.neutron [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.846 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.865 186548 DEBUG nova.compute.manager [req-7e9cc6b4-c210-4db6-b2ab-f683bea36076 req-21605b4c-620a-4eb1-9098-5db663ed6e15 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received event network-changed-132698f3-1aa3-4976-ad25-4fbb0b51ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.866 186548 DEBUG nova.compute.manager [req-7e9cc6b4-c210-4db6-b2ab-f683bea36076 req-21605b4c-620a-4eb1-9098-5db663ed6e15 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Refreshing instance network info cache due to event network-changed-132698f3-1aa3-4976-ad25-4fbb0b51ad82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.866 186548 DEBUG oslo_concurrency.lockutils [req-7e9cc6b4-c210-4db6-b2ab-f683bea36076 req-21605b4c-620a-4eb1-9098-5db663ed6e15 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.981 186548 DEBUG nova.network.neutron [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:49:29 compute-0 nova_compute[186544]: 2025-11-22 07:49:29.996 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.271 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.779 186548 DEBUG nova.network.neutron [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Updating instance_info_cache with network_info: [{"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.798 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Releasing lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.799 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Instance network_info: |[{"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.799 186548 DEBUG oslo_concurrency.lockutils [req-7e9cc6b4-c210-4db6-b2ab-f683bea36076 req-21605b4c-620a-4eb1-9098-5db663ed6e15 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.799 186548 DEBUG nova.network.neutron [req-7e9cc6b4-c210-4db6-b2ab-f683bea36076 req-21605b4c-620a-4eb1-9098-5db663ed6e15 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Refreshing network info cache for port 132698f3-1aa3-4976-ad25-4fbb0b51ad82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.802 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Start _get_guest_xml network_info=[{"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.807 186548 WARNING nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.812 186548 DEBUG nova.virt.libvirt.host [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.813 186548 DEBUG nova.virt.libvirt.host [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.820 186548 DEBUG nova.virt.libvirt.host [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.821 186548 DEBUG nova.virt.libvirt.host [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.823 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.823 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.824 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.824 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.824 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.825 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.825 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.825 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.825 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.826 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.826 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.826 186548 DEBUG nova.virt.hardware [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.830 186548 DEBUG nova.virt.libvirt.vif [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:49:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-385299850',display_name='tempest-ServersTestJSON-server-385299850',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-385299850',id=46,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMM0br1VYfJ44NY8G6ffvXdzpmhyFAMBzK66p3+PqqXOf00g6ey50tdiit5TiOiIcIq4iMWZGTTBBybRnd7+oBuKJXEHCv8BebLvAd6NHT0AY0MjHBvgEGK5J283CXGsZA==',key_name='tempest-keypair-301448170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bd2bb14962c40aaa9806867906278a1',ramdisk_id='',reservation_id='r-m1yb3qwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-569601487',owner_user_name='tempest-ServersTestJSON-569601487-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:49:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='528e45fb2759463fbdd0d05562f8e46d',uuid=6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.830 186548 DEBUG nova.network.os_vif_util [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Converting VIF {"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.831 186548 DEBUG nova.network.os_vif_util [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:e5:62,bridge_name='br-int',has_traffic_filtering=True,id=132698f3-1aa3-4976-ad25-4fbb0b51ad82,network=Network(a1462998-8f32-4668-b387-592c9f613719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap132698f3-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.832 186548 DEBUG nova.objects.instance [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.843 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <uuid>6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f</uuid>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <name>instance-0000002e</name>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersTestJSON-server-385299850</nova:name>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:49:30</nova:creationTime>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:49:30 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:49:30 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:49:30 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:49:30 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:49:30 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:49:30 compute-0 nova_compute[186544]:         <nova:user uuid="528e45fb2759463fbdd0d05562f8e46d">tempest-ServersTestJSON-569601487-project-member</nova:user>
Nov 22 07:49:30 compute-0 nova_compute[186544]:         <nova:project uuid="1bd2bb14962c40aaa9806867906278a1">tempest-ServersTestJSON-569601487</nova:project>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:49:30 compute-0 nova_compute[186544]:         <nova:port uuid="132698f3-1aa3-4976-ad25-4fbb0b51ad82">
Nov 22 07:49:30 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <system>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <entry name="serial">6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f</entry>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <entry name="uuid">6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f</entry>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </system>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <os>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   </os>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <features>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   </features>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk.config"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:95:e5:62"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <target dev="tap132698f3-1a"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/console.log" append="off"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <video>
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </video>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:49:30 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:49:30 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:49:30 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:49:30 compute-0 nova_compute[186544]: </domain>
Nov 22 07:49:30 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.845 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Preparing to wait for external event network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.845 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.845 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.845 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.846 186548 DEBUG nova.virt.libvirt.vif [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:49:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-385299850',display_name='tempest-ServersTestJSON-server-385299850',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-385299850',id=46,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMM0br1VYfJ44NY8G6ffvXdzpmhyFAMBzK66p3+PqqXOf00g6ey50tdiit5TiOiIcIq4iMWZGTTBBybRnd7+oBuKJXEHCv8BebLvAd6NHT0AY0MjHBvgEGK5J283CXGsZA==',key_name='tempest-keypair-301448170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bd2bb14962c40aaa9806867906278a1',ramdisk_id='',reservation_id='r-m1yb3qwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-569601487',owner_user_name='tempest-ServersTestJSON-569601487-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:49:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='528e45fb2759463fbdd0d05562f8e46d',uuid=6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.846 186548 DEBUG nova.network.os_vif_util [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Converting VIF {"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.847 186548 DEBUG nova.network.os_vif_util [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:e5:62,bridge_name='br-int',has_traffic_filtering=True,id=132698f3-1aa3-4976-ad25-4fbb0b51ad82,network=Network(a1462998-8f32-4668-b387-592c9f613719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap132698f3-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.848 186548 DEBUG os_vif [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:e5:62,bridge_name='br-int',has_traffic_filtering=True,id=132698f3-1aa3-4976-ad25-4fbb0b51ad82,network=Network(a1462998-8f32-4668-b387-592c9f613719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap132698f3-1a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.848 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.849 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.849 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.851 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.852 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap132698f3-1a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.852 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap132698f3-1a, col_values=(('external_ids', {'iface-id': '132698f3-1aa3-4976-ad25-4fbb0b51ad82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:e5:62', 'vm-uuid': '6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.853 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:30 compute-0 NetworkManager[55036]: <info>  [1763797770.8551] manager: (tap132698f3-1a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.856 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.859 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.860 186548 INFO os_vif [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:e5:62,bridge_name='br-int',has_traffic_filtering=True,id=132698f3-1aa3-4976-ad25-4fbb0b51ad82,network=Network(a1462998-8f32-4668-b387-592c9f613719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap132698f3-1a')
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.927 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.928 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.928 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] No VIF found with MAC fa:16:3e:95:e5:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:49:30 compute-0 nova_compute[186544]: 2025-11-22 07:49:30.928 186548 INFO nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Using config drive
Nov 22 07:49:31 compute-0 nova_compute[186544]: 2025-11-22 07:49:31.654 186548 INFO nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Creating config drive at /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk.config
Nov 22 07:49:31 compute-0 nova_compute[186544]: 2025-11-22 07:49:31.660 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxeuctdst execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:49:31 compute-0 nova_compute[186544]: 2025-11-22 07:49:31.786 186548 DEBUG oslo_concurrency.processutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxeuctdst" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:49:31 compute-0 kernel: tap132698f3-1a: entered promiscuous mode
Nov 22 07:49:31 compute-0 NetworkManager[55036]: <info>  [1763797771.8557] manager: (tap132698f3-1a): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 22 07:49:31 compute-0 systemd-udevd[220204]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:49:31 compute-0 ovn_controller[94843]: 2025-11-22T07:49:31Z|00191|binding|INFO|Claiming lport 132698f3-1aa3-4976-ad25-4fbb0b51ad82 for this chassis.
Nov 22 07:49:31 compute-0 ovn_controller[94843]: 2025-11-22T07:49:31Z|00192|binding|INFO|132698f3-1aa3-4976-ad25-4fbb0b51ad82: Claiming fa:16:3e:95:e5:62 10.100.0.13
Nov 22 07:49:31 compute-0 nova_compute[186544]: 2025-11-22 07:49:31.886 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:31 compute-0 NetworkManager[55036]: <info>  [1763797771.9005] device (tap132698f3-1a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:49:31 compute-0 NetworkManager[55036]: <info>  [1763797771.9019] device (tap132698f3-1a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:49:31 compute-0 systemd-machined[152872]: New machine qemu-23-instance-0000002e.
Nov 22 07:49:31 compute-0 nova_compute[186544]: 2025-11-22 07:49:31.941 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:31 compute-0 ovn_controller[94843]: 2025-11-22T07:49:31Z|00193|binding|INFO|Setting lport 132698f3-1aa3-4976-ad25-4fbb0b51ad82 ovn-installed in OVS
Nov 22 07:49:31 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000002e.
Nov 22 07:49:31 compute-0 nova_compute[186544]: 2025-11-22 07:49:31.947 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:32 compute-0 ovn_controller[94843]: 2025-11-22T07:49:32Z|00194|binding|INFO|Setting lport 132698f3-1aa3-4976-ad25-4fbb0b51ad82 up in Southbound
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.154 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:e5:62 10.100.0.13'], port_security=['fa:16:3e:95:e5:62 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1462998-8f32-4668-b387-592c9f613719', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bd2bb14962c40aaa9806867906278a1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8ec6d3f1-75fe-4954-ad35-0382d7fe25a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=684c8eb0-802c-4c2c-bc51-c95af76d4976, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=132698f3-1aa3-4976-ad25-4fbb0b51ad82) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.156 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 132698f3-1aa3-4976-ad25-4fbb0b51ad82 in datapath a1462998-8f32-4668-b387-592c9f613719 bound to our chassis
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.157 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1462998-8f32-4668-b387-592c9f613719
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.168 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[972e61ed-271f-46a6-a487-7ece61be1750]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.168 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1462998-81 in ovnmeta-a1462998-8f32-4668-b387-592c9f613719 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.170 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1462998-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.170 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[543ecbae-f9ea-4f54-908e-b1297b6efbaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.171 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a6a332-4f2d-4d12-b5ea-bb88ae776bf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.183 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[cf4b339c-d73c-4dda-9cbc-e0686f053202]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.206 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d6e9f4ad-bc5e-41ff-a25d-9455285e3630]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.235 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4b2a41-2848-473f-b5ae-65be88aca7ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 NetworkManager[55036]: <info>  [1763797772.2414] manager: (tapa1462998-80): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.240 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d37f859a-9b88-4817-a79d-6427b25793ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 systemd-udevd[220207]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.282 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8f033faf-f66e-41d3-b258-b3da8582d692]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.285 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[854c7277-0022-46aa-affd-0b03f679d68a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.305 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797772.3045282, 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.305 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] VM Started (Lifecycle Event)
Nov 22 07:49:32 compute-0 NetworkManager[55036]: <info>  [1763797772.3070] device (tapa1462998-80): carrier: link connected
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.312 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8020c6-ee4c-429f-a196-1679716d7b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.331 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.329 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0b462f-d318-41ca-82ae-2d434213c065]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1462998-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:f1:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451294, 'reachable_time': 37759, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220247, 'error': None, 'target': 'ovnmeta-a1462998-8f32-4668-b387-592c9f613719', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.336 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797772.3047373, 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.336 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] VM Paused (Lifecycle Event)
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.345 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0fe8542a-0dfe-4bb1-b812-7167a58ca93d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef6:f15c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451294, 'tstamp': 451294}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220248, 'error': None, 'target': 'ovnmeta-a1462998-8f32-4668-b387-592c9f613719', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.354 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.357 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.365 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6e6453c3-ee66-4470-8a17-fdaf1a4226a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1462998-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:f1:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451294, 'reachable_time': 37759, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220249, 'error': None, 'target': 'ovnmeta-a1462998-8f32-4668-b387-592c9f613719', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.378 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.401 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a2fa70e0-0c3a-494f-8a6f-7f0011144c6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.469 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[161187a4-6ce4-4139-9c50-850c1c88c80f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.471 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1462998-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.472 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.472 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1462998-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:32 compute-0 NetworkManager[55036]: <info>  [1763797772.4756] manager: (tapa1462998-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Nov 22 07:49:32 compute-0 kernel: tapa1462998-80: entered promiscuous mode
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.477 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.478 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1462998-80, col_values=(('external_ids', {'iface-id': '4431cd61-5895-4a5a-84b0-99030ae4b583'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.479 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:32 compute-0 ovn_controller[94843]: 2025-11-22T07:49:32Z|00195|binding|INFO|Releasing lport 4431cd61-5895-4a5a-84b0-99030ae4b583 from this chassis (sb_readonly=1)
Nov 22 07:49:32 compute-0 nova_compute[186544]: 2025-11-22 07:49:32.492 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.494 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1462998-8f32-4668-b387-592c9f613719.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1462998-8f32-4668-b387-592c9f613719.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.494 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf0876b-90fe-4782-b6b6-3b4d637278c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.496 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-a1462998-8f32-4668-b387-592c9f613719
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/a1462998-8f32-4668-b387-592c9f613719.pid.haproxy
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID a1462998-8f32-4668-b387-592c9f613719
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:49:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:32.497 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1462998-8f32-4668-b387-592c9f613719', 'env', 'PROCESS_TAG=haproxy-a1462998-8f32-4668-b387-592c9f613719', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1462998-8f32-4668-b387-592c9f613719.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:49:32 compute-0 podman[220281]: 2025-11-22 07:49:32.934534454 +0000 UTC m=+0.087013009 container create d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 07:49:32 compute-0 podman[220281]: 2025-11-22 07:49:32.871593677 +0000 UTC m=+0.024072252 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:49:32 compute-0 systemd[1]: Started libpod-conmon-d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a.scope.
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.008 186548 DEBUG nova.network.neutron [req-7e9cc6b4-c210-4db6-b2ab-f683bea36076 req-21605b4c-620a-4eb1-9098-5db663ed6e15 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Updated VIF entry in instance network info cache for port 132698f3-1aa3-4976-ad25-4fbb0b51ad82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.009 186548 DEBUG nova.network.neutron [req-7e9cc6b4-c210-4db6-b2ab-f683bea36076 req-21605b4c-620a-4eb1-9098-5db663ed6e15 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Updating instance_info_cache with network_info: [{"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:49:33 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:49:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c649f57e585e11faed74fb323ff4b19113745548171e4372902d18d4c56f85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.028 186548 DEBUG oslo_concurrency.lockutils [req-7e9cc6b4-c210-4db6-b2ab-f683bea36076 req-21605b4c-620a-4eb1-9098-5db663ed6e15 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:49:33 compute-0 podman[220281]: 2025-11-22 07:49:33.04056709 +0000 UTC m=+0.193045695 container init d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 07:49:33 compute-0 podman[220281]: 2025-11-22 07:49:33.045998574 +0000 UTC m=+0.198477129 container start d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.051 186548 DEBUG nova.compute.manager [req-f5f27b01-4215-42a1-b527-187c8864d386 req-9337caaf-bfed-4bd3-85b1-646009a133e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received event network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.051 186548 DEBUG oslo_concurrency.lockutils [req-f5f27b01-4215-42a1-b527-187c8864d386 req-9337caaf-bfed-4bd3-85b1-646009a133e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.052 186548 DEBUG oslo_concurrency.lockutils [req-f5f27b01-4215-42a1-b527-187c8864d386 req-9337caaf-bfed-4bd3-85b1-646009a133e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.052 186548 DEBUG oslo_concurrency.lockutils [req-f5f27b01-4215-42a1-b527-187c8864d386 req-9337caaf-bfed-4bd3-85b1-646009a133e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.052 186548 DEBUG nova.compute.manager [req-f5f27b01-4215-42a1-b527-187c8864d386 req-9337caaf-bfed-4bd3-85b1-646009a133e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Processing event network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.053 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.058 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797773.05804, 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.058 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] VM Resumed (Lifecycle Event)
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.060 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.064 186548 INFO nova.virt.libvirt.driver [-] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Instance spawned successfully.
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.064 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:49:33 compute-0 neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719[220296]: [NOTICE]   (220300) : New worker (220302) forked
Nov 22 07:49:33 compute-0 neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719[220296]: [NOTICE]   (220300) : Loading success.
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.076 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.084 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.088 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.089 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.089 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.090 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.090 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.091 186548 DEBUG nova.virt.libvirt.driver [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.134 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.177 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.193 186548 INFO nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Took 7.38 seconds to spawn the instance on the hypervisor.
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.194 186548 DEBUG nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.197 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.197 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.197 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.198 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.304 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.337 186548 INFO nova.compute.manager [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Took 8.36 seconds to build instance.
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.358 186548 DEBUG oslo_concurrency.lockutils [None req-81a10c16-8fd9-45ba-8915-9adafc4262c8 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.373 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.374 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.446 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.646 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.648 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5652MB free_disk=73.42284393310547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.648 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.648 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.719 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.720 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.720 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.774 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.790 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.814 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:49:33 compute-0 nova_compute[186544]: 2025-11-22 07:49:33.815 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:34 compute-0 podman[220318]: 2025-11-22 07:49:34.447233649 +0000 UTC m=+0.083604575 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:49:35 compute-0 nova_compute[186544]: 2025-11-22 07:49:35.149 186548 DEBUG nova.compute.manager [req-9f61237e-43ca-46b7-b22e-296b597c7113 req-82449f45-9b3d-4a7a-a5ef-76352245411e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received event network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:35 compute-0 nova_compute[186544]: 2025-11-22 07:49:35.149 186548 DEBUG oslo_concurrency.lockutils [req-9f61237e-43ca-46b7-b22e-296b597c7113 req-82449f45-9b3d-4a7a-a5ef-76352245411e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:35 compute-0 nova_compute[186544]: 2025-11-22 07:49:35.149 186548 DEBUG oslo_concurrency.lockutils [req-9f61237e-43ca-46b7-b22e-296b597c7113 req-82449f45-9b3d-4a7a-a5ef-76352245411e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:35 compute-0 nova_compute[186544]: 2025-11-22 07:49:35.149 186548 DEBUG oslo_concurrency.lockutils [req-9f61237e-43ca-46b7-b22e-296b597c7113 req-82449f45-9b3d-4a7a-a5ef-76352245411e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:35 compute-0 nova_compute[186544]: 2025-11-22 07:49:35.150 186548 DEBUG nova.compute.manager [req-9f61237e-43ca-46b7-b22e-296b597c7113 req-82449f45-9b3d-4a7a-a5ef-76352245411e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] No waiting events found dispatching network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:49:35 compute-0 nova_compute[186544]: 2025-11-22 07:49:35.150 186548 WARNING nova.compute.manager [req-9f61237e-43ca-46b7-b22e-296b597c7113 req-82449f45-9b3d-4a7a-a5ef-76352245411e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received unexpected event network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 for instance with vm_state active and task_state None.
Nov 22 07:49:35 compute-0 nova_compute[186544]: 2025-11-22 07:49:35.272 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:35 compute-0 nova_compute[186544]: 2025-11-22 07:49:35.854 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:36 compute-0 podman[220343]: 2025-11-22 07:49:36.402370909 +0000 UTC m=+0.048175606 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:49:36 compute-0 nova_compute[186544]: 2025-11-22 07:49:36.801 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:37 compute-0 NetworkManager[55036]: <info>  [1763797777.0629] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Nov 22 07:49:37 compute-0 NetworkManager[55036]: <info>  [1763797777.0641] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.062 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:37 compute-0 ovn_controller[94843]: 2025-11-22T07:49:37Z|00196|binding|INFO|Releasing lport 4431cd61-5895-4a5a-84b0-99030ae4b583 from this chassis (sb_readonly=0)
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.148 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:49:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:37.317 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:37.318 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:37.318 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.369 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.369 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.370 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.370 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.411 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797762.4095185, b6ae104e-3174-402f-856b-f4c9156f02c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.412 186548 INFO nova.compute.manager [-] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] VM Stopped (Lifecycle Event)
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.434 186548 DEBUG nova.compute.manager [None req-46227928-5b04-418c-87ab-e392557233c7 - - - - - -] [instance: b6ae104e-3174-402f-856b-f4c9156f02c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.850 186548 DEBUG nova.compute.manager [req-06830907-0a04-4cfe-9c7f-18096e0fecad req-fae73cfa-1cd9-4768-ba6b-63876a960a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received event network-changed-132698f3-1aa3-4976-ad25-4fbb0b51ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.851 186548 DEBUG nova.compute.manager [req-06830907-0a04-4cfe-9c7f-18096e0fecad req-fae73cfa-1cd9-4768-ba6b-63876a960a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Refreshing instance network info cache due to event network-changed-132698f3-1aa3-4976-ad25-4fbb0b51ad82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:49:37 compute-0 nova_compute[186544]: 2025-11-22 07:49:37.851 186548 DEBUG oslo_concurrency.lockutils [req-06830907-0a04-4cfe-9c7f-18096e0fecad req-fae73cfa-1cd9-4768-ba6b-63876a960a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:49:39 compute-0 nova_compute[186544]: 2025-11-22 07:49:39.437 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Updating instance_info_cache with network_info: [{"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:49:39 compute-0 nova_compute[186544]: 2025-11-22 07:49:39.468 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:49:39 compute-0 nova_compute[186544]: 2025-11-22 07:49:39.468 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:49:39 compute-0 nova_compute[186544]: 2025-11-22 07:49:39.468 186548 DEBUG oslo_concurrency.lockutils [req-06830907-0a04-4cfe-9c7f-18096e0fecad req-fae73cfa-1cd9-4768-ba6b-63876a960a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:49:39 compute-0 nova_compute[186544]: 2025-11-22 07:49:39.469 186548 DEBUG nova.network.neutron [req-06830907-0a04-4cfe-9c7f-18096e0fecad req-fae73cfa-1cd9-4768-ba6b-63876a960a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Refreshing network info cache for port 132698f3-1aa3-4976-ad25-4fbb0b51ad82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:49:39 compute-0 nova_compute[186544]: 2025-11-22 07:49:39.470 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:39 compute-0 nova_compute[186544]: 2025-11-22 07:49:39.470 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:39 compute-0 nova_compute[186544]: 2025-11-22 07:49:39.470 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:49:40 compute-0 nova_compute[186544]: 2025-11-22 07:49:40.273 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:40 compute-0 nova_compute[186544]: 2025-11-22 07:49:40.466 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:40 compute-0 nova_compute[186544]: 2025-11-22 07:49:40.856 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:41 compute-0 podman[220363]: 2025-11-22 07:49:41.416083295 +0000 UTC m=+0.060406147 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:49:42 compute-0 nova_compute[186544]: 2025-11-22 07:49:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:42 compute-0 nova_compute[186544]: 2025-11-22 07:49:42.474 186548 DEBUG nova.network.neutron [req-06830907-0a04-4cfe-9c7f-18096e0fecad req-fae73cfa-1cd9-4768-ba6b-63876a960a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Updated VIF entry in instance network info cache for port 132698f3-1aa3-4976-ad25-4fbb0b51ad82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:49:42 compute-0 nova_compute[186544]: 2025-11-22 07:49:42.475 186548 DEBUG nova.network.neutron [req-06830907-0a04-4cfe-9c7f-18096e0fecad req-fae73cfa-1cd9-4768-ba6b-63876a960a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Updating instance_info_cache with network_info: [{"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:49:42 compute-0 nova_compute[186544]: 2025-11-22 07:49:42.491 186548 DEBUG oslo_concurrency.lockutils [req-06830907-0a04-4cfe-9c7f-18096e0fecad req-fae73cfa-1cd9-4768-ba6b-63876a960a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:49:43 compute-0 nova_compute[186544]: 2025-11-22 07:49:43.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:44 compute-0 nova_compute[186544]: 2025-11-22 07:49:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:45 compute-0 nova_compute[186544]: 2025-11-22 07:49:45.275 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:45 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 07:49:45 compute-0 nova_compute[186544]: 2025-11-22 07:49:45.858 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:46 compute-0 podman[220406]: 2025-11-22 07:49:46.677447385 +0000 UTC m=+0.322749602 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:49:47 compute-0 ovn_controller[94843]: 2025-11-22T07:49:47Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:e5:62 10.100.0.13
Nov 22 07:49:47 compute-0 ovn_controller[94843]: 2025-11-22T07:49:47Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:e5:62 10.100.0.13
Nov 22 07:49:50 compute-0 nova_compute[186544]: 2025-11-22 07:49:50.278 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:50 compute-0 podman[220428]: 2025-11-22 07:49:50.415416048 +0000 UTC m=+0.058763185 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6)
Nov 22 07:49:50 compute-0 nova_compute[186544]: 2025-11-22 07:49:50.861 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:54 compute-0 nova_compute[186544]: 2025-11-22 07:49:54.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.279 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.759 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.760 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.760 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.760 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.760 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.766 186548 INFO nova.compute.manager [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Terminating instance
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.772 186548 DEBUG nova.compute.manager [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:49:55 compute-0 kernel: tap132698f3-1a (unregistering): left promiscuous mode
Nov 22 07:49:55 compute-0 NetworkManager[55036]: <info>  [1763797795.7955] device (tap132698f3-1a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.803 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:55 compute-0 ovn_controller[94843]: 2025-11-22T07:49:55Z|00197|binding|INFO|Releasing lport 132698f3-1aa3-4976-ad25-4fbb0b51ad82 from this chassis (sb_readonly=0)
Nov 22 07:49:55 compute-0 ovn_controller[94843]: 2025-11-22T07:49:55Z|00198|binding|INFO|Setting lport 132698f3-1aa3-4976-ad25-4fbb0b51ad82 down in Southbound
Nov 22 07:49:55 compute-0 ovn_controller[94843]: 2025-11-22T07:49:55Z|00199|binding|INFO|Removing iface tap132698f3-1a ovn-installed in OVS
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.805 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.819 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:55.825 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:e5:62 10.100.0.13'], port_security=['fa:16:3e:95:e5:62 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1462998-8f32-4668-b387-592c9f613719', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bd2bb14962c40aaa9806867906278a1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8ec6d3f1-75fe-4954-ad35-0382d7fe25a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=684c8eb0-802c-4c2c-bc51-c95af76d4976, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=132698f3-1aa3-4976-ad25-4fbb0b51ad82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:49:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:55.826 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 132698f3-1aa3-4976-ad25-4fbb0b51ad82 in datapath a1462998-8f32-4668-b387-592c9f613719 unbound from our chassis
Nov 22 07:49:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:55.827 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1462998-8f32-4668-b387-592c9f613719, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:49:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:55.828 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9e07da-dc41-4ea7-aa63-aa0ac7e98fa5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:55.829 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1462998-8f32-4668-b387-592c9f613719 namespace which is not needed anymore
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.835 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:55.835 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:49:55 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Nov 22 07:49:55 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002e.scope: Consumed 14.834s CPU time.
Nov 22 07:49:55 compute-0 systemd-machined[152872]: Machine qemu-23-instance-0000002e terminated.
Nov 22 07:49:55 compute-0 nova_compute[186544]: 2025-11-22 07:49:55.862 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:56 compute-0 neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719[220296]: [NOTICE]   (220300) : haproxy version is 2.8.14-c23fe91
Nov 22 07:49:56 compute-0 neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719[220296]: [NOTICE]   (220300) : path to executable is /usr/sbin/haproxy
Nov 22 07:49:56 compute-0 neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719[220296]: [WARNING]  (220300) : Exiting Master process...
Nov 22 07:49:56 compute-0 neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719[220296]: [WARNING]  (220300) : Exiting Master process...
Nov 22 07:49:56 compute-0 neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719[220296]: [ALERT]    (220300) : Current worker (220302) exited with code 143 (Terminated)
Nov 22 07:49:56 compute-0 neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719[220296]: [WARNING]  (220300) : All workers exited. Exiting... (0)
Nov 22 07:49:56 compute-0 systemd[1]: libpod-d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a.scope: Deactivated successfully.
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.035 186548 INFO nova.virt.libvirt.driver [-] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Instance destroyed successfully.
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.036 186548 DEBUG nova.objects.instance [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lazy-loading 'resources' on Instance uuid 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:49:56 compute-0 podman[220472]: 2025-11-22 07:49:56.038862789 +0000 UTC m=+0.126714975 container died d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.048 186548 DEBUG nova.virt.libvirt.vif [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:49:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-385299850',display_name='tempest-ServersTestJSON-server-385299850',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-385299850',id=46,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMM0br1VYfJ44NY8G6ffvXdzpmhyFAMBzK66p3+PqqXOf00g6ey50tdiit5TiOiIcIq4iMWZGTTBBybRnd7+oBuKJXEHCv8BebLvAd6NHT0AY0MjHBvgEGK5J283CXGsZA==',key_name='tempest-keypair-301448170',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:49:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1bd2bb14962c40aaa9806867906278a1',ramdisk_id='',reservation_id='r-m1yb3qwk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-569601487',owner_user_name='tempest-ServersTestJSON-569601487-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:49:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='528e45fb2759463fbdd0d05562f8e46d',uuid=6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.049 186548 DEBUG nova.network.os_vif_util [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Converting VIF {"id": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "address": "fa:16:3e:95:e5:62", "network": {"id": "a1462998-8f32-4668-b387-592c9f613719", "bridge": "br-int", "label": "tempest-ServersTestJSON-82173856-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bd2bb14962c40aaa9806867906278a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap132698f3-1a", "ovs_interfaceid": "132698f3-1aa3-4976-ad25-4fbb0b51ad82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.050 186548 DEBUG nova.network.os_vif_util [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:e5:62,bridge_name='br-int',has_traffic_filtering=True,id=132698f3-1aa3-4976-ad25-4fbb0b51ad82,network=Network(a1462998-8f32-4668-b387-592c9f613719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap132698f3-1a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.050 186548 DEBUG os_vif [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:e5:62,bridge_name='br-int',has_traffic_filtering=True,id=132698f3-1aa3-4976-ad25-4fbb0b51ad82,network=Network(a1462998-8f32-4668-b387-592c9f613719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap132698f3-1a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.052 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.052 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap132698f3-1a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.054 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.056 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.059 186548 INFO os_vif [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:e5:62,bridge_name='br-int',has_traffic_filtering=True,id=132698f3-1aa3-4976-ad25-4fbb0b51ad82,network=Network(a1462998-8f32-4668-b387-592c9f613719),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap132698f3-1a')
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.059 186548 INFO nova.virt.libvirt.driver [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Deleting instance files /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f_del
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.060 186548 INFO nova.virt.libvirt.driver [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Deletion of /var/lib/nova/instances/6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f_del complete
Nov 22 07:49:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-43c649f57e585e11faed74fb323ff4b19113745548171e4372902d18d4c56f85-merged.mount: Deactivated successfully.
Nov 22 07:49:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a-userdata-shm.mount: Deactivated successfully.
Nov 22 07:49:56 compute-0 podman[220472]: 2025-11-22 07:49:56.137307949 +0000 UTC m=+0.225160135 container cleanup d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.141 186548 INFO nova.compute.manager [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.141 186548 DEBUG oslo.service.loopingcall [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.141 186548 DEBUG nova.compute.manager [-] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.142 186548 DEBUG nova.network.neutron [-] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:49:56 compute-0 systemd[1]: libpod-conmon-d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a.scope: Deactivated successfully.
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.148 186548 DEBUG nova.compute.manager [req-69fe1f5f-3df5-497d-b344-9f69ab4c198a req-c937060e-7bdf-461a-bbee-eaafe95ede9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received event network-vif-unplugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.149 186548 DEBUG oslo_concurrency.lockutils [req-69fe1f5f-3df5-497d-b344-9f69ab4c198a req-c937060e-7bdf-461a-bbee-eaafe95ede9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.150 186548 DEBUG oslo_concurrency.lockutils [req-69fe1f5f-3df5-497d-b344-9f69ab4c198a req-c937060e-7bdf-461a-bbee-eaafe95ede9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.150 186548 DEBUG oslo_concurrency.lockutils [req-69fe1f5f-3df5-497d-b344-9f69ab4c198a req-c937060e-7bdf-461a-bbee-eaafe95ede9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.150 186548 DEBUG nova.compute.manager [req-69fe1f5f-3df5-497d-b344-9f69ab4c198a req-c937060e-7bdf-461a-bbee-eaafe95ede9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] No waiting events found dispatching network-vif-unplugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.150 186548 DEBUG nova.compute.manager [req-69fe1f5f-3df5-497d-b344-9f69ab4c198a req-c937060e-7bdf-461a-bbee-eaafe95ede9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received event network-vif-unplugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:49:56 compute-0 podman[220520]: 2025-11-22 07:49:56.596731559 +0000 UTC m=+0.437250796 container remove d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.602 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1c4713cf-16f8-47cf-92ea-591efb90057f]: (4, ('Sat Nov 22 07:49:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719 (d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a)\nd639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a\nSat Nov 22 07:49:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a1462998-8f32-4668-b387-592c9f613719 (d639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a)\nd639ddc1b22fa98fad249a756040bf9fe22a4d70e4676efbd0777a4348ea4d1a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.603 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9649a6-0566-49ac-806f-6c1b07ff853d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.604 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1462998-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.606 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:56 compute-0 kernel: tapa1462998-80: left promiscuous mode
Nov 22 07:49:56 compute-0 nova_compute[186544]: 2025-11-22 07:49:56.617 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.620 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c917a1ef-f6d3-4f83-98d8-d7f6d2c2e230]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.638 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[394e1e1a-74a3-4760-a9f4-55e7346c70a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.640 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[12ce4866-c46f-4096-be07-986191cf5598]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.655 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e5280cbb-4cc2-45f5-a774-2dcbbb368f75]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451286, 'reachable_time': 40284, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220535, 'error': None, 'target': 'ovnmeta-a1462998-8f32-4668-b387-592c9f613719', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.657 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1462998-8f32-4668-b387-592c9f613719 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.657 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[73a9ee3a-6cd6-48c0-8cd9-badd79c5f3d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:49:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:49:56.658 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:49:56 compute-0 systemd[1]: run-netns-ovnmeta\x2da1462998\x2d8f32\x2d4668\x2db387\x2d592c9f613719.mount: Deactivated successfully.
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.430 186548 DEBUG nova.network.neutron [-] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.457 186548 INFO nova.compute.manager [-] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Took 1.32 seconds to deallocate network for instance.
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.518 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.519 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.539 186548 DEBUG nova.compute.manager [req-00fd57ed-6f5e-4e47-a50f-a5cdfd577454 req-32a11ac5-2562-443c-b125-fd6f715fdbad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received event network-vif-deleted-132698f3-1aa3-4976-ad25-4fbb0b51ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.578 186548 DEBUG nova.compute.provider_tree [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.593 186548 DEBUG nova.scheduler.client.report [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.622 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.659 186548 INFO nova.scheduler.client.report [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Deleted allocations for instance 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f
Nov 22 07:49:57 compute-0 nova_compute[186544]: 2025-11-22 07:49:57.740 186548 DEBUG oslo_concurrency.lockutils [None req-5be4f0d3-7f0e-42b6-a6e8-d66d50cc72a4 528e45fb2759463fbdd0d05562f8e46d 1bd2bb14962c40aaa9806867906278a1 - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:58 compute-0 nova_compute[186544]: 2025-11-22 07:49:58.230 186548 DEBUG nova.compute.manager [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received event network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:49:58 compute-0 nova_compute[186544]: 2025-11-22 07:49:58.231 186548 DEBUG oslo_concurrency.lockutils [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:49:58 compute-0 nova_compute[186544]: 2025-11-22 07:49:58.231 186548 DEBUG oslo_concurrency.lockutils [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:49:58 compute-0 nova_compute[186544]: 2025-11-22 07:49:58.231 186548 DEBUG oslo_concurrency.lockutils [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:49:58 compute-0 nova_compute[186544]: 2025-11-22 07:49:58.231 186548 DEBUG nova.compute.manager [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] No waiting events found dispatching network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:49:58 compute-0 nova_compute[186544]: 2025-11-22 07:49:58.232 186548 WARNING nova.compute.manager [req-cf9e6252-8729-4f26-93bc-64800db1760b req-07df9238-b92a-4ba0-a56d-cc17b80c1215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Received unexpected event network-vif-plugged-132698f3-1aa3-4976-ad25-4fbb0b51ad82 for instance with vm_state deleted and task_state None.
Nov 22 07:50:00 compute-0 nova_compute[186544]: 2025-11-22 07:50:00.281 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:00 compute-0 podman[220536]: 2025-11-22 07:50:00.408071386 +0000 UTC m=+0.056150561 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 07:50:00 compute-0 podman[220537]: 2025-11-22 07:50:00.436462123 +0000 UTC m=+0.081437652 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 07:50:01 compute-0 nova_compute[186544]: 2025-11-22 07:50:01.054 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:04 compute-0 nova_compute[186544]: 2025-11-22 07:50:04.489 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:04 compute-0 nova_compute[186544]: 2025-11-22 07:50:04.595 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:04.660 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:05 compute-0 nova_compute[186544]: 2025-11-22 07:50:05.283 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:05 compute-0 podman[220582]: 2025-11-22 07:50:05.401038071 +0000 UTC m=+0.054170132 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 07:50:06 compute-0 nova_compute[186544]: 2025-11-22 07:50:06.057 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:07 compute-0 podman[220606]: 2025-11-22 07:50:07.398227602 +0000 UTC m=+0.048350678 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 07:50:10 compute-0 nova_compute[186544]: 2025-11-22 07:50:10.283 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:11 compute-0 nova_compute[186544]: 2025-11-22 07:50:11.034 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797796.0319805, 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:50:11 compute-0 nova_compute[186544]: 2025-11-22 07:50:11.034 186548 INFO nova.compute.manager [-] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] VM Stopped (Lifecycle Event)
Nov 22 07:50:11 compute-0 nova_compute[186544]: 2025-11-22 07:50:11.059 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:11 compute-0 nova_compute[186544]: 2025-11-22 07:50:11.066 186548 DEBUG nova.compute.manager [None req-3089f263-0890-4cb3-9d79-35846a3ba809 - - - - - -] [instance: 6dba2b6f-b30a-4cbb-b45a-50a7d6150f7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:12 compute-0 podman[220625]: 2025-11-22 07:50:12.402857585 +0000 UTC m=+0.051540078 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 22 07:50:15 compute-0 nova_compute[186544]: 2025-11-22 07:50:15.285 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:16 compute-0 nova_compute[186544]: 2025-11-22 07:50:16.061 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:17 compute-0 podman[220646]: 2025-11-22 07:50:17.397047121 +0000 UTC m=+0.049435175 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:50:20 compute-0 nova_compute[186544]: 2025-11-22 07:50:20.287 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:21 compute-0 nova_compute[186544]: 2025-11-22 07:50:21.066 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:21 compute-0 podman[220671]: 2025-11-22 07:50:21.40340279 +0000 UTC m=+0.055331780 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container)
Nov 22 07:50:25 compute-0 nova_compute[186544]: 2025-11-22 07:50:25.288 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.069 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.516 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.516 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.536 186548 DEBUG nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.621 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.622 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.631 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.632 186548 INFO nova.compute.claims [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.731 186548 DEBUG nova.scheduler.client.report [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.754 186548 DEBUG nova.scheduler.client.report [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.756 186548 DEBUG nova.compute.provider_tree [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.784 186548 DEBUG nova.scheduler.client.report [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.811 186548 DEBUG nova.scheduler.client.report [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.889 186548 DEBUG nova.compute.provider_tree [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.915 186548 DEBUG nova.scheduler.client.report [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.942 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.943 186548 DEBUG nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:50:26 compute-0 nova_compute[186544]: 2025-11-22 07:50:26.989 186548 DEBUG nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.002 186548 INFO nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.020 186548 DEBUG nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.110 186548 DEBUG nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.111 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.111 186548 INFO nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating image(s)
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.112 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.112 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.112 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.124 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.177 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.178 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.178 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.192 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.251 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.252 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.309 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.310 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.310 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.375 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.376 186548 DEBUG nova.virt.disk.api [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Checking if we can resize image /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.376 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.434 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.435 186548 DEBUG nova.virt.disk.api [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Cannot resize image /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.436 186548 DEBUG nova.objects.instance [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lazy-loading 'migration_context' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.448 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.448 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Ensure instance console log exists: /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.449 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.449 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.449 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.450 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.454 186548 WARNING nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.459 186548 DEBUG nova.virt.libvirt.host [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.459 186548 DEBUG nova.virt.libvirt.host [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.462 186548 DEBUG nova.virt.libvirt.host [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.462 186548 DEBUG nova.virt.libvirt.host [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.463 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.463 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.464 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.464 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.464 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.464 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.464 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.465 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.465 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.465 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.465 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.465 186548 DEBUG nova.virt.hardware [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.469 186548 DEBUG nova.objects.instance [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lazy-loading 'pci_devices' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.479 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <uuid>fccb17bd-32d9-4428-8812-0fbb6f93afa4</uuid>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <name>instance-00000030</name>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1801097299</nova:name>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:50:27</nova:creationTime>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:50:27 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:50:27 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:50:27 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:50:27 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:50:27 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:50:27 compute-0 nova_compute[186544]:         <nova:user uuid="a2e51707e7f64c0793f0a8feeb6c40e6">tempest-UnshelveToHostMultiNodesTest-1261470077-project-member</nova:user>
Nov 22 07:50:27 compute-0 nova_compute[186544]:         <nova:project uuid="d31cb5bd32934c45b774fafa62a8eb01">tempest-UnshelveToHostMultiNodesTest-1261470077</nova:project>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <system>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <entry name="serial">fccb17bd-32d9-4428-8812-0fbb6f93afa4</entry>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <entry name="uuid">fccb17bd-32d9-4428-8812-0fbb6f93afa4</entry>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     </system>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <os>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   </os>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <features>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   </features>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/console.log" append="off"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <video>
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     </video>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:50:27 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:50:27 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:50:27 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:50:27 compute-0 nova_compute[186544]: </domain>
Nov 22 07:50:27 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.529 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.530 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.530 186548 INFO nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Using config drive
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.718 186548 INFO nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating config drive at /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.724 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmu9d8pss execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:27 compute-0 nova_compute[186544]: 2025-11-22 07:50:27.849 186548 DEBUG oslo_concurrency.processutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmu9d8pss" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:27 compute-0 systemd-machined[152872]: New machine qemu-24-instance-00000030.
Nov 22 07:50:27 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000030.
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.076 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797829.0754135, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.077 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Resumed (Lifecycle Event)
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.080 186548 DEBUG nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.081 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.084 186548 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance spawned successfully.
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.085 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.099 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.103 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.111 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.111 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.112 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.112 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.112 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.113 186548 DEBUG nova.virt.libvirt.driver [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.120 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.121 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797829.0764024, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.121 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Started (Lifecycle Event)
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.151 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.155 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.182 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.218 186548 INFO nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Took 2.11 seconds to spawn the instance on the hypervisor.
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.219 186548 DEBUG nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.296 186548 INFO nova.compute.manager [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Took 2.71 seconds to build instance.
Nov 22 07:50:29 compute-0 nova_compute[186544]: 2025-11-22 07:50:29.311 186548 DEBUG oslo_concurrency.lockutils [None req-0d0f69c0-887e-43ac-966d-d932269cfbb6 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:30 compute-0 nova_compute[186544]: 2025-11-22 07:50:30.289 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:31 compute-0 nova_compute[186544]: 2025-11-22 07:50:31.071 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:31 compute-0 podman[220736]: 2025-11-22 07:50:31.420392364 +0000 UTC m=+0.063315896 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 07:50:31 compute-0 podman[220737]: 2025-11-22 07:50:31.456792109 +0000 UTC m=+0.099033914 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 07:50:32 compute-0 nova_compute[186544]: 2025-11-22 07:50:32.579 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:32 compute-0 nova_compute[186544]: 2025-11-22 07:50:32.579 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:32 compute-0 nova_compute[186544]: 2025-11-22 07:50:32.580 186548 INFO nova.compute.manager [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Shelving
Nov 22 07:50:32 compute-0 nova_compute[186544]: 2025-11-22 07:50:32.618 186548 DEBUG nova.virt.libvirt.driver [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.576 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.577 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.598 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.679 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.680 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.685 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.685 186548 INFO nova.compute.claims [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.827 186548 DEBUG nova.compute.provider_tree [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.842 186548 DEBUG nova.scheduler.client.report [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.877 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.878 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.947 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.948 186548 DEBUG nova.network.neutron [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.965 186548 INFO nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:50:33 compute-0 nova_compute[186544]: 2025-11-22 07:50:33.982 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.135 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.137 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.137 186548 INFO nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Creating image(s)
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.138 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "/var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.138 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "/var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.138 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "/var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.151 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.210 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.212 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.212 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.224 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.286 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.287 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.335 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.337 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.337 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.396 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.398 186548 DEBUG nova.virt.disk.api [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Checking if we can resize image /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.398 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.459 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.460 186548 DEBUG nova.virt.disk.api [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Cannot resize image /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.461 186548 DEBUG nova.objects.instance [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lazy-loading 'migration_context' on Instance uuid 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.469 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.470 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Ensure instance console log exists: /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.470 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.471 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.471 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:34 compute-0 nova_compute[186544]: 2025-11-22 07:50:34.717 186548 DEBUG nova.policy [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6671df69e863418882e60d5614674bf6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cea7e35789854e6685cdb211e396cd1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.180 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.181 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.181 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.181 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.238 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.291 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.299 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.299 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.358 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.496 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.498 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5627MB free_disk=73.42275619506836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.498 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.498 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.711 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance fccb17bd-32d9-4428-8812-0fbb6f93afa4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.712 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.712 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.712 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.818 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.846 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.901 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:50:35 compute-0 nova_compute[186544]: 2025-11-22 07:50:35.902 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:36 compute-0 nova_compute[186544]: 2025-11-22 07:50:36.073 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:36 compute-0 nova_compute[186544]: 2025-11-22 07:50:36.134 186548 DEBUG nova.network.neutron [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Successfully created port: 5c6a12d4-4526-4d44-8729-c065528b7250 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:50:36 compute-0 podman[220804]: 2025-11-22 07:50:36.400437233 +0000 UTC m=+0.048567474 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.593 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000030', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'hostId': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.619 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.619 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0468d789-238c-4d2f-ad30-4752f05f9976', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.594058', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee93fccc-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': 'e4f7d3cdbc333aa0ae69f210cf6e6298ff0a17ef33d22e9dac64ebf24a77ddc3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.594058', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee940a28-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': 'bff6ebe53e66d2d41d8f7d40d3658e9e135cb34b0a2d1f882987caf14e4b2ef8'}]}, 'timestamp': '2025-11-22 07:50:36.620175', '_unique_id': '53463f4177ef4d52b7931ec5361c2ae1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.621 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.622 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.638 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/cpu volume: 7270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e6eddce-e584-4de7-a5c6-bc1beeecdfbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7270000000, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'timestamp': '2025-11-22T07:50:36.622558', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ee96ee64-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.330061271, 'message_signature': '4fd18335ce6c156614817790b926f5e58f3592538a8092fa5decd5d61b223008'}]}, 'timestamp': '2025-11-22 07:50:36.639228', '_unique_id': 'e115e623825a4c3597dcb943090a456f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.644 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.653 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.654 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fccbcad4-6c85-499a-8037-910633d332c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.644653', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee994678-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.336432928, 'message_signature': 'c1b943475ad417b76128e34e0b483e7eb3645aa2f26f2c3f0e1476fca5a9e394'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.644653', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee99565e-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.336432928, 'message_signature': '0710a4222308802f97719e70e5204b946a6040e1319d09fc23dbcdeda58a9c6b'}]}, 'timestamp': '2025-11-22 07:50:36.654951', '_unique_id': 'a44d3663450043f583c5d067934308ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.657 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.657 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.657 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcd0ae13-9d70-43ad-87ac-4fa45680ad73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.657450', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee99c7ec-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': '26d69e2236e5cf5fba467706a1ede6425fdb192166765489dead924cb4eddb3e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.657450', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee99d502-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': 'd1f0273f96b9804e021b6321fe6c23b47a66bc37a55b0221c5324cc20a8452fd'}]}, 'timestamp': '2025-11-22 07:50:36.658135', '_unique_id': 'f1a57cae1eeb4fc48c8ab78807d7d950'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.660 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.661 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.661 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1801097299>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1801097299>]
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.661 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.661 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.661 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1801097299>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1801097299>]
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.661 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.662 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.662 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.662 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.662 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.663 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42c74053-c958-4f43-ad90-4287fa9016b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.662571', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee9a9262-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.336432928, 'message_signature': '4fb0aa7f7943729a4436a8731af1d329a7d96a2622e40e167ea0a6728f47f4a2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 
'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.662571', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee9aa0b8-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.336432928, 'message_signature': 'b5b3719fc085fbfc30a53291da65c1288524aec10554921ce8e78bc6cf02b482'}]}, 'timestamp': '2025-11-22 07:50:36.663381', '_unique_id': 'a33927a0a1d748a78597f7c805a0229b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.666 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.666 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.666 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1801097299>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1801097299>]
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.666 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.666 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.666 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.667 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a19b4ad3-be19-481a-a0e4-29bacd6eb913', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.666943', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee9b3a64-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.336432928, 'message_signature': 'b5632742ea0986f493b0e7874e90314494396792b1ff63271d6d9c9b3cae67db'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 
'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.666943', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee9b4824-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.336432928, 'message_signature': 'a5fa95020e59d0276bc9ffbcf4fea6b889d7f03ae3a042bade1997c4206e22fc'}]}, 'timestamp': '2025-11-22 07:50:36.667643', '_unique_id': 'c26261892cbc489cb4437bf6385295d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.669 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.669 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1801097299>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-1801097299>]
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.670 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.670 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e158dc19-217e-4498-a751-5c8de352d3c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.670080', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee9bb5e8-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': 'bc634ec27e811d7f8f4d94ef54ea5f015a1fff74b1ed342aa540dd6857fb5b21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 
'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.670080', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee9bc2f4-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': '7dd25dd17f0ae7e9cd67cc9223caf59ffbcf58d6b153467b8b2d53e0b85e8881'}]}, 'timestamp': '2025-11-22 07:50:36.670787', '_unique_id': '1c6e3c7ee64440fab8421ab130f9e2e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.672 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '709577b8-2018-46b2-9140-b0daf2b44d78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.672750', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee9c1b96-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': '07d58b7f811cd143c1aa0c99ef6a25c4c6a8956e98966c0de00fb177333cd675'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 
'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.672750', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee9c271c-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': '3f5ce2b4084e399c26c1c0b7d48bb758d68e34b271a4e9aa7c45a25641438584'}]}, 'timestamp': '2025-11-22 07:50:36.673369', '_unique_id': '6652f2a58c214aaabb8ccf6ced2a5c62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.675 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.read.latency volume: 741304458 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.675 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.read.latency volume: 2059951 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1b81eba-6d08-4abd-9135-04ae316249d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 741304458, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.675326', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee9c8284-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': '4144da51b594ee66e1568de0b4a6eb0d38f04a56d143ec7c84a14547c9401c8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2059951, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 
'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.675326', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee9c8f90-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': '1357a0467a6e3cec6525f8da3c5f735e679e84821a3c5199259847b9ee45c496'}]}, 'timestamp': '2025-11-22 07:50:36.676024', '_unique_id': '8a233dfd05b64244b7f57f3aac97058a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.678 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.678 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a998b61e-b291-480c-9a9e-b0949394e6fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-vda', 'timestamp': '2025-11-22T07:50:36.678002', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ee9ceab2-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': '010e5e8b26178dd8d2f83c437fd805fec48f063b384108a6ab9c4bb679114907'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a2e51707e7f64c0793f0a8feeb6c40e6', 'user_name': None, 'project_id': 'd31cb5bd32934c45b774fafa62a8eb01', 
'project_name': None, 'resource_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4-sda', 'timestamp': '2025-11-22T07:50:36.678002', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-1801097299', 'name': 'instance-00000030', 'instance_id': 'fccb17bd-32d9-4428-8812-0fbb6f93afa4', 'instance_type': 'm1.nano', 'host': '21176f1ddf9a27a7abb626f40479decffc0683d0440a51b63acd4901', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ee9cf886-c777-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4577.285846475, 'message_signature': 'd677592fa3bc031e044a8b832e48f9d160e854cd75e1577afaff7be71847a1e9'}]}, 'timestamp': '2025-11-22 07:50:36.678715', '_unique_id': '2a88c33fff6a46dc81bd8cc24eeb96f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.680 12 DEBUG ceilometer.compute.pollsters [-] fccb17bd-32d9-4428-8812-0fbb6f93afa4/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.680 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance fccb17bd-32d9-4428-8812-0fbb6f93afa4: ceilometer.compute.pollsters.NoVolumeException
Nov 22 07:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:50:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 07:50:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:37.318 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:37.319 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:37.319 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:37 compute-0 nova_compute[186544]: 2025-11-22 07:50:37.903 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:37 compute-0 nova_compute[186544]: 2025-11-22 07:50:37.903 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:50:37 compute-0 nova_compute[186544]: 2025-11-22 07:50:37.903 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:50:37 compute-0 nova_compute[186544]: 2025-11-22 07:50:37.926 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 07:50:37 compute-0 nova_compute[186544]: 2025-11-22 07:50:37.926 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:50:37 compute-0 nova_compute[186544]: 2025-11-22 07:50:37.926 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:50:37 compute-0 nova_compute[186544]: 2025-11-22 07:50:37.926 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:50:37 compute-0 nova_compute[186544]: 2025-11-22 07:50:37.927 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.147 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:50:38 compute-0 podman[220828]: 2025-11-22 07:50:38.397695757 +0000 UTC m=+0.045136341 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.496 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.515 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.515 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.516 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.748 186548 DEBUG nova.network.neutron [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Successfully updated port: 5c6a12d4-4526-4d44-8729-c065528b7250 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.762 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.763 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquired lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.763 186548 DEBUG nova.network.neutron [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.771 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:38 compute-0 nova_compute[186544]: 2025-11-22 07:50:38.907 186548 DEBUG nova.network.neutron [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:50:39 compute-0 nova_compute[186544]: 2025-11-22 07:50:39.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:39 compute-0 nova_compute[186544]: 2025-11-22 07:50:39.997 186548 DEBUG nova.network.neutron [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Updating instance_info_cache with network_info: [{"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.033 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Releasing lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.033 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Instance network_info: |[{"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.036 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Start _get_guest_xml network_info=[{"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.040 186548 WARNING nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.044 186548 DEBUG nova.virt.libvirt.host [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.045 186548 DEBUG nova.virt.libvirt.host [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.054 186548 DEBUG nova.virt.libvirt.host [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.055 186548 DEBUG nova.virt.libvirt.host [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.056 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.056 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.057 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.057 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.057 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.058 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.058 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.058 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.058 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.059 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.059 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.059 186548 DEBUG nova.virt.hardware [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.062 186548 DEBUG nova.virt.libvirt.vif [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:50:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1166339039',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1166339039',id=50,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cea7e35789854e6685cdb211e396cd1b',ramdisk_id='',reservation_id='r-b79p1kq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-2058079139',owner_user_name='tempest-AttachInterfacesV270Test-2058079139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:50:34Z,user_data=None,user_id='6671df69e863418882e60d5614674bf6',uuid=1b9bb28d-9dd5-417c-ae84-1aaf78c81b52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.063 186548 DEBUG nova.network.os_vif_util [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converting VIF {"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.063 186548 DEBUG nova.network.os_vif_util [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:65:12,bridge_name='br-int',has_traffic_filtering=True,id=5c6a12d4-4526-4d44-8729-c065528b7250,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6a12d4-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.064 186548 DEBUG nova.objects.instance [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.092 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <uuid>1b9bb28d-9dd5-417c-ae84-1aaf78c81b52</uuid>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <name>instance-00000032</name>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <nova:name>tempest-AttachInterfacesV270Test-server-1166339039</nova:name>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:50:40</nova:creationTime>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:50:40 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:50:40 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:50:40 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:50:40 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:50:40 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:50:40 compute-0 nova_compute[186544]:         <nova:user uuid="6671df69e863418882e60d5614674bf6">tempest-AttachInterfacesV270Test-2058079139-project-member</nova:user>
Nov 22 07:50:40 compute-0 nova_compute[186544]:         <nova:project uuid="cea7e35789854e6685cdb211e396cd1b">tempest-AttachInterfacesV270Test-2058079139</nova:project>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:50:40 compute-0 nova_compute[186544]:         <nova:port uuid="5c6a12d4-4526-4d44-8729-c065528b7250">
Nov 22 07:50:40 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <system>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <entry name="serial">1b9bb28d-9dd5-417c-ae84-1aaf78c81b52</entry>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <entry name="uuid">1b9bb28d-9dd5-417c-ae84-1aaf78c81b52</entry>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </system>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <os>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   </os>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <features>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   </features>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk.config"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:73:65:12"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <target dev="tap5c6a12d4-45"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/console.log" append="off"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <video>
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </video>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:50:40 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:50:40 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:50:40 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:50:40 compute-0 nova_compute[186544]: </domain>
Nov 22 07:50:40 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.097 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Preparing to wait for external event network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.097 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.098 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.098 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.099 186548 DEBUG nova.virt.libvirt.vif [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:50:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1166339039',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1166339039',id=50,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cea7e35789854e6685cdb211e396cd1b',ramdisk_id='',reservation_id='r-b79p1kq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-2058079139',owner_user_name='tempest-AttachInterfacesV270Test-2058079139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:50:34Z,user_data=None,user_id='6671df69e863418882e60d5614674bf6',uuid=1b9bb28d-9dd5-417c-ae84-1aaf78c81b52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.099 186548 DEBUG nova.network.os_vif_util [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converting VIF {"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.100 186548 DEBUG nova.network.os_vif_util [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:65:12,bridge_name='br-int',has_traffic_filtering=True,id=5c6a12d4-4526-4d44-8729-c065528b7250,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6a12d4-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.100 186548 DEBUG os_vif [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:65:12,bridge_name='br-int',has_traffic_filtering=True,id=5c6a12d4-4526-4d44-8729-c065528b7250,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6a12d4-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.100 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.101 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.101 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.104 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.104 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c6a12d4-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.104 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5c6a12d4-45, col_values=(('external_ids', {'iface-id': '5c6a12d4-4526-4d44-8729-c065528b7250', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:73:65:12', 'vm-uuid': '1b9bb28d-9dd5-417c-ae84-1aaf78c81b52'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.106 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:40 compute-0 NetworkManager[55036]: <info>  [1763797840.1071] manager: (tap5c6a12d4-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.108 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.113 186548 INFO os_vif [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:65:12,bridge_name='br-int',has_traffic_filtering=True,id=5c6a12d4-4526-4d44-8729-c065528b7250,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6a12d4-45')
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.118 186548 DEBUG nova.compute.manager [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-changed-5c6a12d4-4526-4d44-8729-c065528b7250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.118 186548 DEBUG nova.compute.manager [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Refreshing instance network info cache due to event network-changed-5c6a12d4-4526-4d44-8729-c065528b7250. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.119 186548 DEBUG oslo_concurrency.lockutils [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.119 186548 DEBUG oslo_concurrency.lockutils [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.119 186548 DEBUG nova.network.neutron [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Refreshing network info cache for port 5c6a12d4-4526-4d44-8729-c065528b7250 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.167 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.167 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.168 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] No VIF found with MAC fa:16:3e:73:65:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.168 186548 INFO nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Using config drive
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.292 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.921 186548 INFO nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Creating config drive at /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk.config
Nov 22 07:50:40 compute-0 nova_compute[186544]: 2025-11-22 07:50:40.927 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6njgdo66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.051 186548 DEBUG oslo_concurrency.processutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6njgdo66" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:41 compute-0 kernel: tap5c6a12d4-45: entered promiscuous mode
Nov 22 07:50:41 compute-0 NetworkManager[55036]: <info>  [1763797841.0993] manager: (tap5c6a12d4-45): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Nov 22 07:50:41 compute-0 ovn_controller[94843]: 2025-11-22T07:50:41Z|00200|binding|INFO|Claiming lport 5c6a12d4-4526-4d44-8729-c065528b7250 for this chassis.
Nov 22 07:50:41 compute-0 ovn_controller[94843]: 2025-11-22T07:50:41Z|00201|binding|INFO|5c6a12d4-4526-4d44-8729-c065528b7250: Claiming fa:16:3e:73:65:12 10.100.0.10
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.101 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.106 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.116 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:65:12 10.100.0.10'], port_security=['fa:16:3e:73:65:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1b9bb28d-9dd5-417c-ae84-1aaf78c81b52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cea7e35789854e6685cdb211e396cd1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3871b414-a0eb-4bb1-b2ee-37966ffc5476', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5c66103-8017-42dc-8f0e-11b4184f098c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=5c6a12d4-4526-4d44-8729-c065528b7250) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.117 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 5c6a12d4-4526-4d44-8729-c065528b7250 in datapath fe07f841-e2b4-4f9f-87cb-969d1b6e9370 bound to our chassis
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.118 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe07f841-e2b4-4f9f-87cb-969d1b6e9370
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.130 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f13c8cc8-f014-4258-b844-bf6b9e959ff9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.131 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe07f841-e1 in ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:50:41 compute-0 systemd-udevd[220867]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.133 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe07f841-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.133 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[821a62f3-1f14-4dac-8e0f-0f68253c7597]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.134 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a49a5c-b6ae-4714-aa00-263ccc83f61f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 systemd-machined[152872]: New machine qemu-25-instance-00000032.
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.145 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[830db6c7-394a-4347-bc80-d01011cc18d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 NetworkManager[55036]: <info>  [1763797841.1484] device (tap5c6a12d4-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:50:41 compute-0 NetworkManager[55036]: <info>  [1763797841.1495] device (tap5c6a12d4-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:50:41 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000032.
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.159 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1617ef-1fc5-4ed5-b158-fe53161d996a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.158 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:41 compute-0 ovn_controller[94843]: 2025-11-22T07:50:41Z|00202|binding|INFO|Setting lport 5c6a12d4-4526-4d44-8729-c065528b7250 ovn-installed in OVS
Nov 22 07:50:41 compute-0 ovn_controller[94843]: 2025-11-22T07:50:41Z|00203|binding|INFO|Setting lport 5c6a12d4-4526-4d44-8729-c065528b7250 up in Southbound
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.162 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.187 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[de49bf0a-c1fb-4efe-8481-4c04ce7108c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 NetworkManager[55036]: <info>  [1763797841.1939] manager: (tapfe07f841-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.194 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5861746a-c83f-4111-b276-0c8282bb8b88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.227 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f3401660-cc9b-447e-b655-90f97ed038d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.231 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4578a97e-2357-42a1-85d0-66fae6186329]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 NetworkManager[55036]: <info>  [1763797841.2536] device (tapfe07f841-e0): carrier: link connected
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.257 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[aa194846-381c-4acd-9680-c44053d9a888]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.272 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bb6ab9-fa57-4bbf-80b2-165c9644d0a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe07f841-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:5f:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458189, 'reachable_time': 26428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220909, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.284 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc10540-9d35-4498-9f3e-f4955894b784]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed9:5f61'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 458189, 'tstamp': 458189}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220910, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.299 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d87612-f55c-4116-8a49-320553f902d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe07f841-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:5f:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458189, 'reachable_time': 26428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220911, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.323 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[91ac80de-0aae-44b4-a37d-ac1663b750a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.375 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[17b168ec-89e2-40e6-b317-0947e275a6ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.376 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe07f841-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.376 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.377 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe07f841-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:41 compute-0 kernel: tapfe07f841-e0: entered promiscuous mode
Nov 22 07:50:41 compute-0 NetworkManager[55036]: <info>  [1763797841.3807] manager: (tapfe07f841-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.383 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe07f841-e0, col_values=(('external_ids', {'iface-id': '49228a2f-9df1-41f6-89e3-3e4aec690329'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:41 compute-0 ovn_controller[94843]: 2025-11-22T07:50:41Z|00204|binding|INFO|Releasing lport 49228a2f-9df1-41f6-89e3-3e4aec690329 from this chassis (sb_readonly=0)
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.386 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe07f841-e2b4-4f9f-87cb-969d1b6e9370.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe07f841-e2b4-4f9f-87cb-969d1b6e9370.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.387 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0fc83e-4ed0-4347-a24e-4ec594b0e750]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.387 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-fe07f841-e2b4-4f9f-87cb-969d1b6e9370
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/fe07f841-e2b4-4f9f-87cb-969d1b6e9370.pid.haproxy
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID fe07f841-e2b4-4f9f-87cb-969d1b6e9370
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:50:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:41.388 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'env', 'PROCESS_TAG=haproxy-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe07f841-e2b4-4f9f-87cb-969d1b6e9370.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.394 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.740 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797841.740111, 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.740 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] VM Started (Lifecycle Event)
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.760 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.764 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797841.7402177, 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.764 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] VM Paused (Lifecycle Event)
Nov 22 07:50:41 compute-0 podman[220957]: 2025-11-22 07:50:41.788993741 +0000 UTC m=+0.076578673 container create a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.794 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.797 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.813 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:50:41 compute-0 systemd[1]: Started libpod-conmon-a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e.scope.
Nov 22 07:50:41 compute-0 podman[220957]: 2025-11-22 07:50:41.740630732 +0000 UTC m=+0.028215694 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.838 186548 DEBUG nova.compute.manager [req-ca14398d-990f-49a1-9118-1cc3f4b593a0 req-5d9c1dff-8225-4f3a-88d7-e77a0c3c04a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.838 186548 DEBUG oslo_concurrency.lockutils [req-ca14398d-990f-49a1-9118-1cc3f4b593a0 req-5d9c1dff-8225-4f3a-88d7-e77a0c3c04a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.839 186548 DEBUG oslo_concurrency.lockutils [req-ca14398d-990f-49a1-9118-1cc3f4b593a0 req-5d9c1dff-8225-4f3a-88d7-e77a0c3c04a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.839 186548 DEBUG oslo_concurrency.lockutils [req-ca14398d-990f-49a1-9118-1cc3f4b593a0 req-5d9c1dff-8225-4f3a-88d7-e77a0c3c04a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.839 186548 DEBUG nova.compute.manager [req-ca14398d-990f-49a1-9118-1cc3f4b593a0 req-5d9c1dff-8225-4f3a-88d7-e77a0c3c04a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Processing event network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.840 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:50:41 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.850 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797841.8440247, 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.850 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] VM Resumed (Lifecycle Event)
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.852 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:50:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9e44c8c1aaad343a270113d7f57de8469d31f67f0e876c15088dd9abf5b7a28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.867 186548 INFO nova.virt.libvirt.driver [-] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Instance spawned successfully.
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.868 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:50:41 compute-0 podman[220957]: 2025-11-22 07:50:41.869158681 +0000 UTC m=+0.156743653 container init a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.871 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:41 compute-0 podman[220957]: 2025-11-22 07:50:41.874437691 +0000 UTC m=+0.162022633 container start a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.875 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:50:41 compute-0 neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370[220974]: [NOTICE]   (220978) : New worker (220980) forked
Nov 22 07:50:41 compute-0 neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370[220974]: [NOTICE]   (220978) : Loading success.
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.909 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.916 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.916 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.917 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.917 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.918 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:41 compute-0 nova_compute[186544]: 2025-11-22 07:50:41.918 186548 DEBUG nova.virt.libvirt.driver [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.007 186548 INFO nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Took 7.87 seconds to spawn the instance on the hypervisor.
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.008 186548 DEBUG nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.091 186548 INFO nova.compute.manager [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Took 8.44 seconds to build instance.
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.117 186548 DEBUG oslo_concurrency.lockutils [None req-e49dec8a-7935-47ab-a47b-9cb5bb19a2de 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.364 186548 DEBUG nova.network.neutron [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Updated VIF entry in instance network info cache for port 5c6a12d4-4526-4d44-8729-c065528b7250. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.365 186548 DEBUG nova.network.neutron [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Updating instance_info_cache with network_info: [{"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.379 186548 DEBUG oslo_concurrency.lockutils [req-f355f480-594b-46a5-8911-6a090eed5812 req-265f416e-a648-47d0-b3cd-96297467d923 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:50:42 compute-0 nova_compute[186544]: 2025-11-22 07:50:42.665 186548 DEBUG nova.virt.libvirt.driver [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 07:50:43 compute-0 nova_compute[186544]: 2025-11-22 07:50:43.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:43 compute-0 podman[220989]: 2025-11-22 07:50:43.407133367 +0000 UTC m=+0.056461568 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 07:50:43 compute-0 nova_compute[186544]: 2025-11-22 07:50:43.969 186548 DEBUG nova.compute.manager [req-d66699ee-dbd9-4f8b-b36f-e88d26a78687 req-b8cc8dc6-b07b-4d0d-84ca-0e24e62f1da8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:43 compute-0 nova_compute[186544]: 2025-11-22 07:50:43.969 186548 DEBUG oslo_concurrency.lockutils [req-d66699ee-dbd9-4f8b-b36f-e88d26a78687 req-b8cc8dc6-b07b-4d0d-84ca-0e24e62f1da8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:43 compute-0 nova_compute[186544]: 2025-11-22 07:50:43.969 186548 DEBUG oslo_concurrency.lockutils [req-d66699ee-dbd9-4f8b-b36f-e88d26a78687 req-b8cc8dc6-b07b-4d0d-84ca-0e24e62f1da8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:43 compute-0 nova_compute[186544]: 2025-11-22 07:50:43.969 186548 DEBUG oslo_concurrency.lockutils [req-d66699ee-dbd9-4f8b-b36f-e88d26a78687 req-b8cc8dc6-b07b-4d0d-84ca-0e24e62f1da8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:43 compute-0 nova_compute[186544]: 2025-11-22 07:50:43.970 186548 DEBUG nova.compute.manager [req-d66699ee-dbd9-4f8b-b36f-e88d26a78687 req-b8cc8dc6-b07b-4d0d-84ca-0e24e62f1da8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] No waiting events found dispatching network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:50:43 compute-0 nova_compute[186544]: 2025-11-22 07:50:43.970 186548 WARNING nova.compute.manager [req-d66699ee-dbd9-4f8b-b36f-e88d26a78687 req-b8cc8dc6-b07b-4d0d-84ca-0e24e62f1da8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received unexpected event network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 for instance with vm_state active and task_state None.
Nov 22 07:50:44 compute-0 nova_compute[186544]: 2025-11-22 07:50:44.035 186548 DEBUG oslo_concurrency.lockutils [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "interface-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:44 compute-0 nova_compute[186544]: 2025-11-22 07:50:44.035 186548 DEBUG oslo_concurrency.lockutils [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "interface-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:44 compute-0 nova_compute[186544]: 2025-11-22 07:50:44.036 186548 DEBUG nova.objects.instance [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lazy-loading 'flavor' on Instance uuid 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:44 compute-0 nova_compute[186544]: 2025-11-22 07:50:44.069 186548 DEBUG nova.objects.instance [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lazy-loading 'pci_requests' on Instance uuid 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:44 compute-0 nova_compute[186544]: 2025-11-22 07:50:44.086 186548 DEBUG nova.network.neutron [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:50:44 compute-0 nova_compute[186544]: 2025-11-22 07:50:44.663 186548 DEBUG nova.policy [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6671df69e863418882e60d5614674bf6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cea7e35789854e6685cdb211e396cd1b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:50:45 compute-0 nova_compute[186544]: 2025-11-22 07:50:45.107 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:45 compute-0 nova_compute[186544]: 2025-11-22 07:50:45.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:50:45 compute-0 nova_compute[186544]: 2025-11-22 07:50:45.296 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:45 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000030.scope: Deactivated successfully.
Nov 22 07:50:45 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000030.scope: Consumed 14.597s CPU time.
Nov 22 07:50:45 compute-0 systemd-machined[152872]: Machine qemu-24-instance-00000030 terminated.
Nov 22 07:50:45 compute-0 nova_compute[186544]: 2025-11-22 07:50:45.827 186548 INFO nova.virt.libvirt.driver [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance shutdown successfully after 13 seconds.
Nov 22 07:50:45 compute-0 nova_compute[186544]: 2025-11-22 07:50:45.834 186548 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance destroyed successfully.
Nov 22 07:50:45 compute-0 nova_compute[186544]: 2025-11-22 07:50:45.834 186548 DEBUG nova.objects.instance [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lazy-loading 'numa_topology' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:46 compute-0 nova_compute[186544]: 2025-11-22 07:50:46.107 186548 INFO nova.virt.libvirt.driver [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Beginning cold snapshot process
Nov 22 07:50:46 compute-0 nova_compute[186544]: 2025-11-22 07:50:46.334 186548 DEBUG nova.privsep.utils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:50:46 compute-0 nova_compute[186544]: 2025-11-22 07:50:46.336 186548 DEBUG oslo_concurrency.processutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk /var/lib/nova/instances/snapshots/tmpd2hh8zry/0712e70b702348e6be919dc70996c03c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:46 compute-0 nova_compute[186544]: 2025-11-22 07:50:46.958 186548 DEBUG oslo_concurrency.processutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk /var/lib/nova/instances/snapshots/tmpd2hh8zry/0712e70b702348e6be919dc70996c03c" returned: 0 in 0.622s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:46 compute-0 nova_compute[186544]: 2025-11-22 07:50:46.959 186548 INFO nova.virt.libvirt.driver [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Snapshot extracted, beginning image upload
Nov 22 07:50:47 compute-0 nova_compute[186544]: 2025-11-22 07:50:47.582 186548 DEBUG nova.network.neutron [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Successfully created port: ef3ec563-b83e-4960-ae19-13e1326de63e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:50:48 compute-0 podman[221023]: 2025-11-22 07:50:48.395140761 +0000 UTC m=+0.048087262 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.290 186548 DEBUG nova.network.neutron [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Successfully updated port: ef3ec563-b83e-4960-ae19-13e1326de63e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.308 186548 DEBUG oslo_concurrency.lockutils [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.308 186548 DEBUG oslo_concurrency.lockutils [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquired lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.308 186548 DEBUG nova.network.neutron [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.448 186548 DEBUG nova.compute.manager [req-9b60bca6-e247-455c-ba19-d5589631a78d req-b1a1b072-edfb-4192-99a8-7a68ef1e9cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-changed-ef3ec563-b83e-4960-ae19-13e1326de63e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.449 186548 DEBUG nova.compute.manager [req-9b60bca6-e247-455c-ba19-d5589631a78d req-b1a1b072-edfb-4192-99a8-7a68ef1e9cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Refreshing instance network info cache due to event network-changed-ef3ec563-b83e-4960-ae19-13e1326de63e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.449 186548 DEBUG oslo_concurrency.lockutils [req-9b60bca6-e247-455c-ba19-d5589631a78d req-b1a1b072-edfb-4192-99a8-7a68ef1e9cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.512 186548 INFO nova.virt.libvirt.driver [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Snapshot image upload complete
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.513 186548 DEBUG nova.compute.manager [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.518 186548 WARNING nova.network.neutron [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] fe07f841-e2b4-4f9f-87cb-969d1b6e9370 already exists in list: networks containing: ['fe07f841-e2b4-4f9f-87cb-969d1b6e9370']. ignoring it
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.583 186548 INFO nova.compute.manager [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Shelve offloading
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.592 186548 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance destroyed successfully.
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.593 186548 DEBUG nova.compute.manager [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.595 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.595 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquired lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:50:49 compute-0 nova_compute[186544]: 2025-11-22 07:50:49.595 186548 DEBUG nova.network.neutron [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.263 186548 DEBUG nova.network.neutron [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.297 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.543 186548 DEBUG nova.network.neutron [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.562 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Releasing lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.574 186548 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance destroyed successfully.
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.575 186548 DEBUG nova.objects.instance [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lazy-loading 'resources' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.619 186548 INFO nova.virt.libvirt.driver [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Deleting instance files /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4_del
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.630 186548 INFO nova.virt.libvirt.driver [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Deletion of /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4_del complete
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.841 186548 INFO nova.scheduler.client.report [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Deleted allocations for instance fccb17bd-32d9-4428-8812-0fbb6f93afa4
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.918 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.918 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.969 186548 DEBUG nova.compute.provider_tree [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:50:50 compute-0 nova_compute[186544]: 2025-11-22 07:50:50.984 186548 DEBUG nova.scheduler.client.report [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:50:51 compute-0 nova_compute[186544]: 2025-11-22 07:50:51.012 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:51 compute-0 nova_compute[186544]: 2025-11-22 07:50:51.091 186548 DEBUG oslo_concurrency.lockutils [None req-1a26471d-4676-4603-968a-1fc6477f6dc2 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 18.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:52 compute-0 podman[221047]: 2025-11-22 07:50:52.398880908 +0000 UTC m=+0.050852942 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter)
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.058 186548 DEBUG nova.network.neutron [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Updating instance_info_cache with network_info: [{"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.161 186548 DEBUG oslo_concurrency.lockutils [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Releasing lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.162 186548 DEBUG oslo_concurrency.lockutils [req-9b60bca6-e247-455c-ba19-d5589631a78d req-b1a1b072-edfb-4192-99a8-7a68ef1e9cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.162 186548 DEBUG nova.network.neutron [req-9b60bca6-e247-455c-ba19-d5589631a78d req-b1a1b072-edfb-4192-99a8-7a68ef1e9cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Refreshing network info cache for port ef3ec563-b83e-4960-ae19-13e1326de63e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.165 186548 DEBUG nova.virt.libvirt.vif [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:50:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1166339039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1166339039',id=50,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:50:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cea7e35789854e6685cdb211e396cd1b',ramdisk_id='',reservation_id='r-b79p1kq8',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-AttachInterfacesV270Test-2058079139',owner_user_name='tempest-AttachInterfacesV270Test-2058079139-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:50:42Z,user_data=None,user_id='6671df69e863418882e60d5614674bf6',uuid=1b9bb28d-9dd5-417c-ae84-1aaf78c81b52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.165 186548 DEBUG nova.network.os_vif_util [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converting VIF {"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.166 186548 DEBUG nova.network.os_vif_util [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d1:aa,bridge_name='br-int',has_traffic_filtering=True,id=ef3ec563-b83e-4960-ae19-13e1326de63e,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef3ec563-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.166 186548 DEBUG os_vif [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d1:aa,bridge_name='br-int',has_traffic_filtering=True,id=ef3ec563-b83e-4960-ae19-13e1326de63e,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef3ec563-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.167 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.167 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.170 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.170 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef3ec563-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.170 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapef3ec563-b8, col_values=(('external_ids', {'iface-id': 'ef3ec563-b83e-4960-ae19-13e1326de63e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:d1:aa', 'vm-uuid': '1b9bb28d-9dd5-417c-ae84-1aaf78c81b52'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.172 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:53 compute-0 NetworkManager[55036]: <info>  [1763797853.1731] manager: (tapef3ec563-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.177 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.180 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.180 186548 INFO os_vif [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d1:aa,bridge_name='br-int',has_traffic_filtering=True,id=ef3ec563-b83e-4960-ae19-13e1326de63e,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef3ec563-b8')
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.181 186548 DEBUG nova.virt.libvirt.vif [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:50:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1166339039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1166339039',id=50,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:50:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cea7e35789854e6685cdb211e396cd1b',ramdisk_id='',reservation_id='r-b79p1kq8',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-AttachInterfacesV270Test-2058079139',owner_user_name='tempest-AttachInterfacesV270Test-2058079139-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:50:42Z,user_data=None,user_id='6671df69e863418882e60d5614674bf6',uuid=1b9bb28d-9dd5-417c-ae84-1aaf78c81b52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.181 186548 DEBUG nova.network.os_vif_util [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converting VIF {"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.182 186548 DEBUG nova.network.os_vif_util [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d1:aa,bridge_name='br-int',has_traffic_filtering=True,id=ef3ec563-b83e-4960-ae19-13e1326de63e,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef3ec563-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.184 186548 DEBUG nova.virt.libvirt.guest [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] attach device xml: <interface type="ethernet">
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <mac address="fa:16:3e:f8:d1:aa"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <model type="virtio"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <mtu size="1442"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <target dev="tapef3ec563-b8"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]: </interface>
Nov 22 07:50:53 compute-0 nova_compute[186544]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 22 07:50:53 compute-0 kernel: tapef3ec563-b8: entered promiscuous mode
Nov 22 07:50:53 compute-0 ovn_controller[94843]: 2025-11-22T07:50:53Z|00205|binding|INFO|Claiming lport ef3ec563-b83e-4960-ae19-13e1326de63e for this chassis.
Nov 22 07:50:53 compute-0 ovn_controller[94843]: 2025-11-22T07:50:53Z|00206|binding|INFO|ef3ec563-b83e-4960-ae19-13e1326de63e: Claiming fa:16:3e:f8:d1:aa 10.100.0.8
Nov 22 07:50:53 compute-0 NetworkManager[55036]: <info>  [1763797853.2008] manager: (tapef3ec563-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:53 compute-0 ovn_controller[94843]: 2025-11-22T07:50:53Z|00207|binding|INFO|Setting lport ef3ec563-b83e-4960-ae19-13e1326de63e ovn-installed in OVS
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.214 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.215 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:53 compute-0 systemd-udevd[221074]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:50:53 compute-0 NetworkManager[55036]: <info>  [1763797853.2456] device (tapef3ec563-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:50:53 compute-0 NetworkManager[55036]: <info>  [1763797853.2464] device (tapef3ec563-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:50:53 compute-0 ovn_controller[94843]: 2025-11-22T07:50:53Z|00208|binding|INFO|Setting lport ef3ec563-b83e-4960-ae19-13e1326de63e up in Southbound
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.263 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:d1:aa 10.100.0.8'], port_security=['fa:16:3e:f8:d1:aa 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1b9bb28d-9dd5-417c-ae84-1aaf78c81b52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cea7e35789854e6685cdb211e396cd1b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3871b414-a0eb-4bb1-b2ee-37966ffc5476', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5c66103-8017-42dc-8f0e-11b4184f098c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=ef3ec563-b83e-4960-ae19-13e1326de63e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.265 103805 INFO neutron.agent.ovn.metadata.agent [-] Port ef3ec563-b83e-4960-ae19-13e1326de63e in datapath fe07f841-e2b4-4f9f-87cb-969d1b6e9370 bound to our chassis
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.267 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe07f841-e2b4-4f9f-87cb-969d1b6e9370
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.275 186548 DEBUG nova.virt.libvirt.driver [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.276 186548 DEBUG nova.virt.libvirt.driver [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.276 186548 DEBUG nova.virt.libvirt.driver [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] No VIF found with MAC fa:16:3e:73:65:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.277 186548 DEBUG nova.virt.libvirt.driver [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] No VIF found with MAC fa:16:3e:f8:d1:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.280 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[10c22eb5-d84b-4f5a-a41e-2d144a2d1cdc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.308 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[238d120b-e2b2-4f69-9778-541b4da85c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.311 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3a9a3e-e12f-42ab-b142-798bc9f30120]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.338 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a24c1541-88f5-424a-8381-737013631c96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.341 186548 DEBUG nova.virt.libvirt.guest [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <nova:name>tempest-AttachInterfacesV270Test-server-1166339039</nova:name>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <nova:creationTime>2025-11-22 07:50:53</nova:creationTime>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <nova:flavor name="m1.nano">
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:memory>128</nova:memory>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:disk>1</nova:disk>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:swap>0</nova:swap>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:vcpus>1</nova:vcpus>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   </nova:flavor>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <nova:owner>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:user uuid="6671df69e863418882e60d5614674bf6">tempest-AttachInterfacesV270Test-2058079139-project-member</nova:user>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:project uuid="cea7e35789854e6685cdb211e396cd1b">tempest-AttachInterfacesV270Test-2058079139</nova:project>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   </nova:owner>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   <nova:ports>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:port uuid="5c6a12d4-4526-4d44-8729-c065528b7250">
Nov 22 07:50:53 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     <nova:port uuid="ef3ec563-b83e-4960-ae19-13e1326de63e">
Nov 22 07:50:53 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 07:50:53 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 07:50:53 compute-0 nova_compute[186544]:   </nova:ports>
Nov 22 07:50:53 compute-0 nova_compute[186544]: </nova:instance>
Nov 22 07:50:53 compute-0 nova_compute[186544]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.354 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[63ad08a1-e69d-48fb-aa25-d63737401459]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe07f841-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:5f:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458189, 'reachable_time': 26428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221082, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.367 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[af00d4b0-ae6a-4289-9b21-56020f9a5f57]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfe07f841-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 458198, 'tstamp': 458198}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221083, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfe07f841-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 458200, 'tstamp': 458200}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221083, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.369 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe07f841-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.371 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.372 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe07f841-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.372 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.373 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe07f841-e0, col_values=(('external_ids', {'iface-id': '49228a2f-9df1-41f6-89e3-3e4aec690329'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:53.373 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.374 186548 DEBUG oslo_concurrency.lockutils [None req-f4cef7ff-90e9-47ad-aeea-4b6d2536e167 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "interface-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.860 186548 DEBUG nova.compute.manager [req-37b48bf8-9b80-4f4b-8233-a3ddbc3e2460 req-47433eea-99ca-4c25-abb4-6b75169ba241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.861 186548 DEBUG oslo_concurrency.lockutils [req-37b48bf8-9b80-4f4b-8233-a3ddbc3e2460 req-47433eea-99ca-4c25-abb4-6b75169ba241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.861 186548 DEBUG oslo_concurrency.lockutils [req-37b48bf8-9b80-4f4b-8233-a3ddbc3e2460 req-47433eea-99ca-4c25-abb4-6b75169ba241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.861 186548 DEBUG oslo_concurrency.lockutils [req-37b48bf8-9b80-4f4b-8233-a3ddbc3e2460 req-47433eea-99ca-4c25-abb4-6b75169ba241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.861 186548 DEBUG nova.compute.manager [req-37b48bf8-9b80-4f4b-8233-a3ddbc3e2460 req-47433eea-99ca-4c25-abb4-6b75169ba241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] No waiting events found dispatching network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:50:53 compute-0 nova_compute[186544]: 2025-11-22 07:50:53.862 186548 WARNING nova.compute.manager [req-37b48bf8-9b80-4f4b-8233-a3ddbc3e2460 req-47433eea-99ca-4c25-abb4-6b75169ba241 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received unexpected event network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e for instance with vm_state active and task_state None.
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.034 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.035 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.035 186548 INFO nova.compute.manager [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Unshelving
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.146 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.147 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.151 186548 DEBUG nova.objects.instance [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'pci_requests' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.170 186548 DEBUG nova.objects.instance [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'numa_topology' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.182 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.182 186548 INFO nova.compute.claims [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.299 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.419 186548 DEBUG nova.compute.provider_tree [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.452 186548 DEBUG nova.scheduler.client.report [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.505 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.527 186548 DEBUG nova.network.neutron [req-9b60bca6-e247-455c-ba19-d5589631a78d req-b1a1b072-edfb-4192-99a8-7a68ef1e9cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Updated VIF entry in instance network info cache for port ef3ec563-b83e-4960-ae19-13e1326de63e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.528 186548 DEBUG nova.network.neutron [req-9b60bca6-e247-455c-ba19-d5589631a78d req-b1a1b072-edfb-4192-99a8-7a68ef1e9cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Updating instance_info_cache with network_info: [{"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.544 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.545 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.545 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.545 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.546 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.553 186548 INFO nova.compute.manager [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Terminating instance
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.559 186548 DEBUG nova.compute.manager [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.563 186548 DEBUG oslo_concurrency.lockutils [req-9b60bca6-e247-455c-ba19-d5589631a78d req-b1a1b072-edfb-4192-99a8-7a68ef1e9cd7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:50:55 compute-0 kernel: tap5c6a12d4-45 (unregistering): left promiscuous mode
Nov 22 07:50:55 compute-0 NetworkManager[55036]: <info>  [1763797855.5843] device (tap5c6a12d4-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:50:55 compute-0 ovn_controller[94843]: 2025-11-22T07:50:55Z|00209|binding|INFO|Releasing lport 5c6a12d4-4526-4d44-8729-c065528b7250 from this chassis (sb_readonly=0)
Nov 22 07:50:55 compute-0 ovn_controller[94843]: 2025-11-22T07:50:55Z|00210|binding|INFO|Setting lport 5c6a12d4-4526-4d44-8729-c065528b7250 down in Southbound
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.593 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 ovn_controller[94843]: 2025-11-22T07:50:55Z|00211|binding|INFO|Removing iface tap5c6a12d4-45 ovn-installed in OVS
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.598 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 kernel: tapef3ec563-b8 (unregistering): left promiscuous mode
Nov 22 07:50:55 compute-0 NetworkManager[55036]: <info>  [1763797855.6107] device (tapef3ec563-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.613 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.614 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquired lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.614 186548 DEBUG nova.network.neutron [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.615 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.616 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:65:12 10.100.0.10'], port_security=['fa:16:3e:73:65:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '1b9bb28d-9dd5-417c-ae84-1aaf78c81b52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cea7e35789854e6685cdb211e396cd1b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3871b414-a0eb-4bb1-b2ee-37966ffc5476', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5c66103-8017-42dc-8f0e-11b4184f098c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=5c6a12d4-4526-4d44-8729-c065528b7250) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.617 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 5c6a12d4-4526-4d44-8729-c065528b7250 in datapath fe07f841-e2b4-4f9f-87cb-969d1b6e9370 unbound from our chassis
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.618 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe07f841-e2b4-4f9f-87cb-969d1b6e9370
Nov 22 07:50:55 compute-0 ovn_controller[94843]: 2025-11-22T07:50:55Z|00212|binding|INFO|Releasing lport ef3ec563-b83e-4960-ae19-13e1326de63e from this chassis (sb_readonly=0)
Nov 22 07:50:55 compute-0 ovn_controller[94843]: 2025-11-22T07:50:55Z|00213|binding|INFO|Setting lport ef3ec563-b83e-4960-ae19-13e1326de63e down in Southbound
Nov 22 07:50:55 compute-0 ovn_controller[94843]: 2025-11-22T07:50:55Z|00214|binding|INFO|Removing iface tapef3ec563-b8 ovn-installed in OVS
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.622 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.632 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[695af3ed-24ec-41f5-b53f-529b5c089dc5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.633 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.661 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[214a420b-ea68-4722-88fe-7073efe001d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.664 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[27ffc229-3145-42f6-b279-4dd0aeadaeb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:55 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000032.scope: Deactivated successfully.
Nov 22 07:50:55 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000032.scope: Consumed 13.534s CPU time.
Nov 22 07:50:55 compute-0 systemd-machined[152872]: Machine qemu-25-instance-00000032 terminated.
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.697 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:d1:aa 10.100.0.8'], port_security=['fa:16:3e:f8:d1:aa 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1b9bb28d-9dd5-417c-ae84-1aaf78c81b52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cea7e35789854e6685cdb211e396cd1b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3871b414-a0eb-4bb1-b2ee-37966ffc5476', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5c66103-8017-42dc-8f0e-11b4184f098c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=ef3ec563-b83e-4960-ae19-13e1326de63e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.696 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[87d1c453-2cb1-48de-88d1-066314ff5248]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.719 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[aab771da-f63d-4ed2-84a2-b05e2cb1b0ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe07f841-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:5f:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458189, 'reachable_time': 26428, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221113, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.739 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3804dc3b-6e64-426f-aff8-c7f70c9d582d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfe07f841-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 458198, 'tstamp': 458198}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221114, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfe07f841-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 458200, 'tstamp': 458200}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221114, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.741 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe07f841-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.743 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.752 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.753 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe07f841-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.754 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.754 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe07f841-e0, col_values=(('external_ids', {'iface-id': '49228a2f-9df1-41f6-89e3-3e4aec690329'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.755 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.757 103805 INFO neutron.agent.ovn.metadata.agent [-] Port ef3ec563-b83e-4960-ae19-13e1326de63e in datapath fe07f841-e2b4-4f9f-87cb-969d1b6e9370 unbound from our chassis
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.758 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe07f841-e2b4-4f9f-87cb-969d1b6e9370, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.759 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[30751165-1fd4-41f2-9306-8f27f7010249]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:55.759 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370 namespace which is not needed anymore
Nov 22 07:50:55 compute-0 NetworkManager[55036]: <info>  [1763797855.7784] manager: (tap5c6a12d4-45): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Nov 22 07:50:55 compute-0 NetworkManager[55036]: <info>  [1763797855.7919] manager: (tapef3ec563-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.841 186548 INFO nova.virt.libvirt.driver [-] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Instance destroyed successfully.
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.843 186548 DEBUG nova.objects.instance [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lazy-loading 'resources' on Instance uuid 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.854 186548 DEBUG nova.virt.libvirt.vif [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:50:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1166339039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1166339039',id=50,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:50:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cea7e35789854e6685cdb211e396cd1b',ramdisk_id='',reservation_id='r-b79p1kq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_proje
ct_name='tempest-AttachInterfacesV270Test-2058079139',owner_user_name='tempest-AttachInterfacesV270Test-2058079139-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:50:42Z,user_data=None,user_id='6671df69e863418882e60d5614674bf6',uuid=1b9bb28d-9dd5-417c-ae84-1aaf78c81b52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.855 186548 DEBUG nova.network.os_vif_util [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converting VIF {"id": "5c6a12d4-4526-4d44-8729-c065528b7250", "address": "fa:16:3e:73:65:12", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c6a12d4-45", "ovs_interfaceid": "5c6a12d4-4526-4d44-8729-c065528b7250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.856 186548 DEBUG nova.network.os_vif_util [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:12,bridge_name='br-int',has_traffic_filtering=True,id=5c6a12d4-4526-4d44-8729-c065528b7250,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6a12d4-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.856 186548 DEBUG os_vif [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:12,bridge_name='br-int',has_traffic_filtering=True,id=5c6a12d4-4526-4d44-8729-c065528b7250,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6a12d4-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.860 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.860 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c6a12d4-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.862 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.864 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.867 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.870 186548 INFO os_vif [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:12,bridge_name='br-int',has_traffic_filtering=True,id=5c6a12d4-4526-4d44-8729-c065528b7250,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c6a12d4-45')
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.871 186548 DEBUG nova.virt.libvirt.vif [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:50:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1166339039',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1166339039',id=50,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:50:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cea7e35789854e6685cdb211e396cd1b',ramdisk_id='',reservation_id='r-b79p1kq8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_proje
ct_name='tempest-AttachInterfacesV270Test-2058079139',owner_user_name='tempest-AttachInterfacesV270Test-2058079139-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:50:42Z,user_data=None,user_id='6671df69e863418882e60d5614674bf6',uuid=1b9bb28d-9dd5-417c-ae84-1aaf78c81b52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.872 186548 DEBUG nova.network.os_vif_util [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converting VIF {"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.872 186548 DEBUG nova.network.os_vif_util [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d1:aa,bridge_name='br-int',has_traffic_filtering=True,id=ef3ec563-b83e-4960-ae19-13e1326de63e,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef3ec563-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.873 186548 DEBUG os_vif [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d1:aa,bridge_name='br-int',has_traffic_filtering=True,id=ef3ec563-b83e-4960-ae19-13e1326de63e,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef3ec563-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.874 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.875 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef3ec563-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.876 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.878 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.881 186548 INFO os_vif [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:d1:aa,bridge_name='br-int',has_traffic_filtering=True,id=ef3ec563-b83e-4960-ae19-13e1326de63e,network=Network(fe07f841-e2b4-4f9f-87cb-969d1b6e9370),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef3ec563-b8')
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.881 186548 INFO nova.virt.libvirt.driver [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Deleting instance files /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52_del
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.882 186548 INFO nova.virt.libvirt.driver [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Deletion of /var/lib/nova/instances/1b9bb28d-9dd5-417c-ae84-1aaf78c81b52_del complete
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.889 186548 DEBUG nova.network.neutron [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:50:55 compute-0 neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370[220974]: [NOTICE]   (220978) : haproxy version is 2.8.14-c23fe91
Nov 22 07:50:55 compute-0 neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370[220974]: [NOTICE]   (220978) : path to executable is /usr/sbin/haproxy
Nov 22 07:50:55 compute-0 neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370[220974]: [WARNING]  (220978) : Exiting Master process...
Nov 22 07:50:55 compute-0 neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370[220974]: [WARNING]  (220978) : Exiting Master process...
Nov 22 07:50:55 compute-0 neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370[220974]: [ALERT]    (220978) : Current worker (220980) exited with code 143 (Terminated)
Nov 22 07:50:55 compute-0 neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370[220974]: [WARNING]  (220978) : All workers exited. Exiting... (0)
Nov 22 07:50:55 compute-0 systemd[1]: libpod-a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e.scope: Deactivated successfully.
Nov 22 07:50:55 compute-0 podman[221161]: 2025-11-22 07:50:55.918469253 +0000 UTC m=+0.051275310 container died a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 07:50:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e-userdata-shm.mount: Deactivated successfully.
Nov 22 07:50:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-e9e44c8c1aaad343a270113d7f57de8469d31f67f0e876c15088dd9abf5b7a28-merged.mount: Deactivated successfully.
Nov 22 07:50:55 compute-0 podman[221161]: 2025-11-22 07:50:55.971436845 +0000 UTC m=+0.104242852 container cleanup a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.976 186548 DEBUG nova.compute.manager [req-411d87c5-605e-4b08-aecd-0030e7e9a6ce req-7cbf127e-f994-4439-87c6-cc043764aeeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.976 186548 DEBUG oslo_concurrency.lockutils [req-411d87c5-605e-4b08-aecd-0030e7e9a6ce req-7cbf127e-f994-4439-87c6-cc043764aeeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.977 186548 DEBUG oslo_concurrency.lockutils [req-411d87c5-605e-4b08-aecd-0030e7e9a6ce req-7cbf127e-f994-4439-87c6-cc043764aeeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.977 186548 DEBUG oslo_concurrency.lockutils [req-411d87c5-605e-4b08-aecd-0030e7e9a6ce req-7cbf127e-f994-4439-87c6-cc043764aeeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.977 186548 DEBUG nova.compute.manager [req-411d87c5-605e-4b08-aecd-0030e7e9a6ce req-7cbf127e-f994-4439-87c6-cc043764aeeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] No waiting events found dispatching network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.977 186548 WARNING nova.compute.manager [req-411d87c5-605e-4b08-aecd-0030e7e9a6ce req-7cbf127e-f994-4439-87c6-cc043764aeeb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received unexpected event network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e for instance with vm_state active and task_state deleting.
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.980 186548 INFO nova.compute.manager [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.980 186548 DEBUG oslo.service.loopingcall [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:50:55 compute-0 systemd[1]: libpod-conmon-a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e.scope: Deactivated successfully.
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.981 186548 DEBUG nova.compute.manager [-] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:50:55 compute-0 nova_compute[186544]: 2025-11-22 07:50:55.981 186548 DEBUG nova.network.neutron [-] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:50:56 compute-0 podman[221193]: 2025-11-22 07:50:56.048349126 +0000 UTC m=+0.058079958 container remove a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.054 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[462e078e-5c40-4c2e-86f9-89ed7efa7962]: (4, ('Sat Nov 22 07:50:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370 (a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e)\na9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e\nSat Nov 22 07:50:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370 (a9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e)\na9c605eff51e75a4fd86af0802d0041fe10614fb30d18acefb95aff19e4e1f1e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.056 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[94b8bb1a-3bc3-43c9-af98-499dbe5eed1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.057 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe07f841-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.058 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:56 compute-0 kernel: tapfe07f841-e0: left promiscuous mode
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.060 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.062 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[626734eb-c4a3-4e32-8e92-69aaa0512fde]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.074 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.084 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f81c7bd7-b8e4-4827-8d15-69b9b4874263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.085 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[125a6d85-12b7-40d4-acd3-56f6c239a17d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.100 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4fb6d1-4509-4f9d-abac-d0ea5b146b9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 458182, 'reachable_time': 31469, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221207, 'error': None, 'target': 'ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.103 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe07f841-e2b4-4f9f-87cb-969d1b6e9370 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:50:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:56.103 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[b47203e6-5aa9-438b-8c48-8e6f80054db2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:50:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dfe07f841\x2de2b4\x2d4f9f\x2d87cb\x2d969d1b6e9370.mount: Deactivated successfully.
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.215 186548 DEBUG nova.network.neutron [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.231 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Releasing lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.233 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.233 186548 INFO nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating image(s)
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.233 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.234 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.234 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.235 186548 DEBUG nova.objects.instance [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.253 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "9794f07bebbd0adda446514edb1f5d23da5d1e0c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:56 compute-0 nova_compute[186544]: 2025-11-22 07:50:56.254 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "9794f07bebbd0adda446514edb1f5d23da5d1e0c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:57.038 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.039 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:50:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:50:57.039 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.043 186548 DEBUG nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-unplugged-5c6a12d4-4526-4d44-8729-c065528b7250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.043 186548 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.043 186548 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.044 186548 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.044 186548 DEBUG nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] No waiting events found dispatching network-vif-unplugged-5c6a12d4-4526-4d44-8729-c065528b7250 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.044 186548 DEBUG nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-unplugged-5c6a12d4-4526-4d44-8729-c065528b7250 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.045 186548 DEBUG nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.045 186548 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.045 186548 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.046 186548 DEBUG oslo_concurrency.lockutils [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.046 186548 DEBUG nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] No waiting events found dispatching network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:50:57 compute-0 nova_compute[186544]: 2025-11-22 07:50:57.046 186548 WARNING nova.compute.manager [req-d2ccf92c-5499-4d35-87b1-f1b6a5556312 req-0c9fe210-41e3-48da-a37d-ed602927718a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received unexpected event network-vif-plugged-5c6a12d4-4526-4d44-8729-c065528b7250 for instance with vm_state active and task_state deleting.
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.119 186548 DEBUG nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-unplugged-ef3ec563-b83e-4960-ae19-13e1326de63e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.119 186548 DEBUG oslo_concurrency.lockutils [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.120 186548 DEBUG oslo_concurrency.lockutils [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.120 186548 DEBUG oslo_concurrency.lockutils [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.120 186548 DEBUG nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] No waiting events found dispatching network-vif-unplugged-ef3ec563-b83e-4960-ae19-13e1326de63e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.120 186548 DEBUG nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-unplugged-ef3ec563-b83e-4960-ae19-13e1326de63e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.120 186548 DEBUG nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.121 186548 DEBUG oslo_concurrency.lockutils [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.121 186548 DEBUG oslo_concurrency.lockutils [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.121 186548 DEBUG oslo_concurrency.lockutils [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.122 186548 DEBUG nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] No waiting events found dispatching network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.122 186548 WARNING nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received unexpected event network-vif-plugged-ef3ec563-b83e-4960-ae19-13e1326de63e for instance with vm_state active and task_state deleting.
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.122 186548 DEBUG nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-deleted-5c6a12d4-4526-4d44-8729-c065528b7250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.122 186548 INFO nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Neutron deleted interface 5c6a12d4-4526-4d44-8729-c065528b7250; detaching it from the instance and deleting it from the info cache
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.123 186548 DEBUG nova.network.neutron [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Updating instance_info_cache with network_info: [{"id": "ef3ec563-b83e-4960-ae19-13e1326de63e", "address": "fa:16:3e:f8:d1:aa", "network": {"id": "fe07f841-e2b4-4f9f-87cb-969d1b6e9370", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1818983046-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cea7e35789854e6685cdb211e396cd1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef3ec563-b8", "ovs_interfaceid": "ef3ec563-b83e-4960-ae19-13e1326de63e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.154 186548 DEBUG nova.network.neutron [-] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.171 186548 DEBUG nova.compute.manager [req-63b4aaee-020e-4a4e-a1b0-574c0e9de4b7 req-6df3b660-052d-42cf-9246-246adee5333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Detach interface failed, port_id=5c6a12d4-4526-4d44-8729-c065528b7250, reason: Instance 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.181 186548 INFO nova.compute.manager [-] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Took 2.20 seconds to deallocate network for instance.
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.267 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.268 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.336 186548 DEBUG nova.compute.provider_tree [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.351 186548 DEBUG nova.scheduler.client.report [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.379 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.406 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.425 186548 INFO nova.scheduler.client.report [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Deleted allocations for instance 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.462 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c.part --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.463 186548 DEBUG nova.virt.images [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] 8b5dbfb9-e249-42fb-b20a-58daf8fdf66a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.464 186548 DEBUG nova.privsep.utils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.464 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c.part /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.508 186548 DEBUG oslo_concurrency.lockutils [None req-9e4ed310-e8f1-4fde-8798-b6adbb1c54c9 6671df69e863418882e60d5614674bf6 cea7e35789854e6685cdb211e396cd1b - - default default] Lock "1b9bb28d-9dd5-417c-ae84-1aaf78c81b52" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.856 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c.part /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c.converted" returned: 0 in 0.391s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.865 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.947 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c.converted --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.948 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "9794f07bebbd0adda446514edb1f5d23da5d1e0c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:58 compute-0 nova_compute[186544]: 2025-11-22 07:50:58.961 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.016 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.017 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "9794f07bebbd0adda446514edb1f5d23da5d1e0c" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.018 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "9794f07bebbd0adda446514edb1f5d23da5d1e0c" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.030 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.082 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.083 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c,backing_fmt=raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.122 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c,backing_fmt=raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.123 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "9794f07bebbd0adda446514edb1f5d23da5d1e0c" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.123 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.180 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.181 186548 DEBUG nova.objects.instance [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'migration_context' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.195 186548 INFO nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Rebasing disk image.
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.195 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.254 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:50:59 compute-0 nova_compute[186544]: 2025-11-22 07:50:59.255 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.137 186548 DEBUG nova.compute.manager [req-449c5cb5-bfa3-41bc-b8d6-bc51688bd3f4 req-9836e9b1-9d1c-4af2-98af-c4a8b2ffcb4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Received event network-vif-deleted-ef3ec563-b83e-4960-ae19-13e1326de63e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.300 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.497 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk" returned: 0 in 1.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.498 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.498 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Ensure instance console log exists: /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.498 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.499 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.499 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.500 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='bad9a47fdd24cf0d6c4639c7ebd4c498',container_format='bare',created_at=2025-11-22T07:50:32Z,direct_url=<?>,disk_format='qcow2',id=8b5dbfb9-e249-42fb-b20a-58daf8fdf66a,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1801097299-shelved',owner='d31cb5bd32934c45b774fafa62a8eb01',properties=ImageMetaProps,protected=<?>,size=52166656,status='active',tags=<?>,updated_at=2025-11-22T07:50:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.503 186548 WARNING nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.507 186548 DEBUG nova.virt.libvirt.host [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.508 186548 DEBUG nova.virt.libvirt.host [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.511 186548 DEBUG nova.virt.libvirt.host [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.512 186548 DEBUG nova.virt.libvirt.host [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.513 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.513 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='bad9a47fdd24cf0d6c4639c7ebd4c498',container_format='bare',created_at=2025-11-22T07:50:32Z,direct_url=<?>,disk_format='qcow2',id=8b5dbfb9-e249-42fb-b20a-58daf8fdf66a,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1801097299-shelved',owner='d31cb5bd32934c45b774fafa62a8eb01',properties=ImageMetaProps,protected=<?>,size=52166656,status='active',tags=<?>,updated_at=2025-11-22T07:50:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.514 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.514 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.514 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.514 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.515 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.515 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.515 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.515 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.516 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.516 186548 DEBUG nova.virt.hardware [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.516 186548 DEBUG nova.objects.instance [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.538 186548 DEBUG nova.objects.instance [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'pci_devices' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.564 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <uuid>fccb17bd-32d9-4428-8812-0fbb6f93afa4</uuid>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <name>instance-00000030</name>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1801097299</nova:name>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:51:00</nova:creationTime>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:51:00 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:51:00 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:51:00 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:51:00 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:51:00 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:51:00 compute-0 nova_compute[186544]:         <nova:user uuid="a2e51707e7f64c0793f0a8feeb6c40e6">tempest-UnshelveToHostMultiNodesTest-1261470077-project-member</nova:user>
Nov 22 07:51:00 compute-0 nova_compute[186544]:         <nova:project uuid="d31cb5bd32934c45b774fafa62a8eb01">tempest-UnshelveToHostMultiNodesTest-1261470077</nova:project>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="8b5dbfb9-e249-42fb-b20a-58daf8fdf66a"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <system>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <entry name="serial">fccb17bd-32d9-4428-8812-0fbb6f93afa4</entry>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <entry name="uuid">fccb17bd-32d9-4428-8812-0fbb6f93afa4</entry>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     </system>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <os>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   </os>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <features>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   </features>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/console.log" append="off"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <video>
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     </video>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:51:00 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:51:00 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:51:00 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:51:00 compute-0 nova_compute[186544]: </domain>
Nov 22 07:51:00 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.632 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.632 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.633 186548 INFO nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Using config drive
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.644 186548 DEBUG nova.objects.instance [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'ec2_ids' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.682 186548 DEBUG nova.objects.instance [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lazy-loading 'keypairs' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.829 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797845.8269176, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.830 186548 INFO nova.compute.manager [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Stopped (Lifecycle Event)
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.847 186548 DEBUG nova.compute.manager [None req-d2c5487b-7581-4b9e-bf1e-e915048658f2 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.850 186548 DEBUG nova.compute.manager [None req-d2c5487b-7581-4b9e-bf1e-e915048658f2 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.877 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.891 186548 INFO nova.compute.manager [None req-d2c5487b-7581-4b9e-bf1e-e915048658f2 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.896 186548 INFO nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Creating config drive at /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config
Nov 22 07:51:00 compute-0 nova_compute[186544]: 2025-11-22 07:51:00.900 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ql7o9od execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:01 compute-0 nova_compute[186544]: 2025-11-22 07:51:01.033 186548 DEBUG oslo_concurrency.processutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ql7o9od" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:01.041 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:01 compute-0 systemd-machined[152872]: New machine qemu-26-instance-00000030.
Nov 22 07:51:01 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-00000030.
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.145 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797862.14433, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.146 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Resumed (Lifecycle Event)
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.148 186548 DEBUG nova.compute.manager [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.148 186548 DEBUG nova.virt.libvirt.driver [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.152 186548 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance spawned successfully.
Nov 22 07:51:02 compute-0 podman[221266]: 2025-11-22 07:51:02.161207744 +0000 UTC m=+0.073733443 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.164 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.168 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.192 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.192 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797862.1461973, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.192 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Started (Lifecycle Event)
Nov 22 07:51:02 compute-0 podman[221267]: 2025-11-22 07:51:02.205236906 +0000 UTC m=+0.108825165 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.215 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.217 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:51:02 compute-0 nova_compute[186544]: 2025-11-22 07:51:02.247 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:51:03 compute-0 nova_compute[186544]: 2025-11-22 07:51:03.401 186548 DEBUG nova.compute.manager [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:03 compute-0 nova_compute[186544]: 2025-11-22 07:51:03.517 186548 DEBUG oslo_concurrency.lockutils [None req-c29e23c8-c0d3-4a6f-9d29-c385e6de10e3 5e7f05f9ecb04e318a5922be259fdbc8 31455758237e4eeab63e1b340dc27a46 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 8.482s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:04 compute-0 nova_compute[186544]: 2025-11-22 07:51:04.940 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:05 compute-0 nova_compute[186544]: 2025-11-22 07:51:05.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:05 compute-0 nova_compute[186544]: 2025-11-22 07:51:05.880 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:06 compute-0 nova_compute[186544]: 2025-11-22 07:51:06.039 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:06 compute-0 nova_compute[186544]: 2025-11-22 07:51:06.040 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:06 compute-0 nova_compute[186544]: 2025-11-22 07:51:06.040 186548 INFO nova.compute.manager [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Shelving
Nov 22 07:51:06 compute-0 nova_compute[186544]: 2025-11-22 07:51:06.071 186548 DEBUG nova.virt.libvirt.driver [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:51:07 compute-0 podman[221314]: 2025-11-22 07:51:07.414083828 +0000 UTC m=+0.050719334 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:51:09 compute-0 podman[221338]: 2025-11-22 07:51:09.406466589 +0000 UTC m=+0.059997870 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:51:10 compute-0 nova_compute[186544]: 2025-11-22 07:51:10.303 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:10 compute-0 nova_compute[186544]: 2025-11-22 07:51:10.839 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797855.8377607, 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:51:10 compute-0 nova_compute[186544]: 2025-11-22 07:51:10.839 186548 INFO nova.compute.manager [-] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] VM Stopped (Lifecycle Event)
Nov 22 07:51:10 compute-0 nova_compute[186544]: 2025-11-22 07:51:10.873 186548 DEBUG nova.compute.manager [None req-538d02d6-9e09-4d2a-8c8d-e859be4c146b - - - - - -] [instance: 1b9bb28d-9dd5-417c-ae84-1aaf78c81b52] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:10 compute-0 nova_compute[186544]: 2025-11-22 07:51:10.882 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:14 compute-0 podman[221359]: 2025-11-22 07:51:14.430953642 +0000 UTC m=+0.076744101 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 07:51:15 compute-0 nova_compute[186544]: 2025-11-22 07:51:15.305 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:15 compute-0 nova_compute[186544]: 2025-11-22 07:51:15.885 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:16 compute-0 nova_compute[186544]: 2025-11-22 07:51:16.115 186548 DEBUG nova.virt.libvirt.driver [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 07:51:19 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000030.scope: Deactivated successfully.
Nov 22 07:51:19 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000030.scope: Consumed 15.970s CPU time.
Nov 22 07:51:19 compute-0 systemd-machined[152872]: Machine qemu-26-instance-00000030 terminated.
Nov 22 07:51:19 compute-0 podman[221387]: 2025-11-22 07:51:19.141152997 +0000 UTC m=+0.071240615 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:51:19 compute-0 nova_compute[186544]: 2025-11-22 07:51:19.260 186548 INFO nova.virt.libvirt.driver [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance shutdown successfully after 13 seconds.
Nov 22 07:51:19 compute-0 nova_compute[186544]: 2025-11-22 07:51:19.265 186548 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance destroyed successfully.
Nov 22 07:51:19 compute-0 nova_compute[186544]: 2025-11-22 07:51:19.265 186548 DEBUG nova.objects.instance [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lazy-loading 'numa_topology' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:19 compute-0 nova_compute[186544]: 2025-11-22 07:51:19.574 186548 INFO nova.virt.libvirt.driver [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Beginning cold snapshot process
Nov 22 07:51:19 compute-0 nova_compute[186544]: 2025-11-22 07:51:19.810 186548 DEBUG nova.privsep.utils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:51:19 compute-0 nova_compute[186544]: 2025-11-22 07:51:19.811 186548 DEBUG oslo_concurrency.processutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk /var/lib/nova/instances/snapshots/tmpos0c6x95/d156b4cf5ca74d3dad34da35ee94e879 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:20 compute-0 nova_compute[186544]: 2025-11-22 07:51:20.302 186548 DEBUG oslo_concurrency.processutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4/disk /var/lib/nova/instances/snapshots/tmpos0c6x95/d156b4cf5ca74d3dad34da35ee94e879" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:20 compute-0 nova_compute[186544]: 2025-11-22 07:51:20.303 186548 INFO nova.virt.libvirt.driver [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Snapshot extracted, beginning image upload
Nov 22 07:51:20 compute-0 nova_compute[186544]: 2025-11-22 07:51:20.307 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:20 compute-0 nova_compute[186544]: 2025-11-22 07:51:20.887 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:23 compute-0 podman[221430]: 2025-11-22 07:51:23.413371157 +0000 UTC m=+0.066946199 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 07:51:23 compute-0 nova_compute[186544]: 2025-11-22 07:51:23.564 186548 INFO nova.virt.libvirt.driver [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Snapshot image upload complete
Nov 22 07:51:23 compute-0 nova_compute[186544]: 2025-11-22 07:51:23.565 186548 DEBUG nova.compute.manager [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:23 compute-0 nova_compute[186544]: 2025-11-22 07:51:23.665 186548 INFO nova.compute.manager [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Shelve offloading
Nov 22 07:51:23 compute-0 nova_compute[186544]: 2025-11-22 07:51:23.678 186548 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance destroyed successfully.
Nov 22 07:51:23 compute-0 nova_compute[186544]: 2025-11-22 07:51:23.679 186548 DEBUG nova.compute.manager [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:23 compute-0 nova_compute[186544]: 2025-11-22 07:51:23.681 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:51:23 compute-0 nova_compute[186544]: 2025-11-22 07:51:23.681 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquired lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:51:23 compute-0 nova_compute[186544]: 2025-11-22 07:51:23.682 186548 DEBUG nova.network.neutron [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:51:24 compute-0 nova_compute[186544]: 2025-11-22 07:51:24.694 186548 DEBUG nova.network.neutron [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.191 186548 DEBUG nova.network.neutron [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.207 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Releasing lock "refresh_cache-fccb17bd-32d9-4428-8812-0fbb6f93afa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.211 186548 INFO nova.virt.libvirt.driver [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Instance destroyed successfully.
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.212 186548 DEBUG nova.objects.instance [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lazy-loading 'resources' on Instance uuid fccb17bd-32d9-4428-8812-0fbb6f93afa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.221 186548 INFO nova.virt.libvirt.driver [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Deleting instance files /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4_del
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.227 186548 INFO nova.virt.libvirt.driver [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Deletion of /var/lib/nova/instances/fccb17bd-32d9-4428-8812-0fbb6f93afa4_del complete
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.348 186548 INFO nova.scheduler.client.report [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Deleted allocations for instance fccb17bd-32d9-4428-8812-0fbb6f93afa4
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.419 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.419 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.466 186548 DEBUG nova.compute.provider_tree [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.479 186548 DEBUG nova.scheduler.client.report [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.503 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.578 186548 DEBUG oslo_concurrency.lockutils [None req-91924b3f-71a2-46df-a244-008470b60290 a2e51707e7f64c0793f0a8feeb6c40e6 d31cb5bd32934c45b774fafa62a8eb01 - - default default] Lock "fccb17bd-32d9-4428-8812-0fbb6f93afa4" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:25 compute-0 nova_compute[186544]: 2025-11-22 07:51:25.889 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.080 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "e193cce7-4100-4385-bdfe-9e256dc10eed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.081 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.107 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.238 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.239 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.250 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.251 186548 INFO nova.compute.claims [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.405 186548 DEBUG nova.compute.provider_tree [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.426 186548 DEBUG nova.scheduler.client.report [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.457 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.458 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.512 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.512 186548 DEBUG nova.network.neutron [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.535 186548 INFO nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.563 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.688 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.690 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.690 186548 INFO nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Creating image(s)
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.691 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "/var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.691 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "/var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.692 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "/var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.705 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.767 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.768 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.769 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.780 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.834 186548 DEBUG nova.policy [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4022ae06df548ef8b8b167c80f56ae6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '08890994b268457d9deaf5460dcbe0d9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.841 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.841 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.918 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk 1073741824" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.919 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.920 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.977 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.978 186548 DEBUG nova.virt.disk.api [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Checking if we can resize image /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:51:27 compute-0 nova_compute[186544]: 2025-11-22 07:51:27.978 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:28 compute-0 nova_compute[186544]: 2025-11-22 07:51:28.040 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:28 compute-0 nova_compute[186544]: 2025-11-22 07:51:28.042 186548 DEBUG nova.virt.disk.api [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Cannot resize image /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:51:28 compute-0 nova_compute[186544]: 2025-11-22 07:51:28.042 186548 DEBUG nova.objects.instance [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lazy-loading 'migration_context' on Instance uuid e193cce7-4100-4385-bdfe-9e256dc10eed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:28 compute-0 nova_compute[186544]: 2025-11-22 07:51:28.058 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:51:28 compute-0 nova_compute[186544]: 2025-11-22 07:51:28.059 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Ensure instance console log exists: /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:51:28 compute-0 nova_compute[186544]: 2025-11-22 07:51:28.059 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:28 compute-0 nova_compute[186544]: 2025-11-22 07:51:28.059 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:28 compute-0 nova_compute[186544]: 2025-11-22 07:51:28.060 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:29 compute-0 nova_compute[186544]: 2025-11-22 07:51:29.903 186548 DEBUG nova.network.neutron [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Successfully created port: 4d24be20-c58b-47ad-8239-de4de9d7d6b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:51:30 compute-0 nova_compute[186544]: 2025-11-22 07:51:30.310 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:30 compute-0 nova_compute[186544]: 2025-11-22 07:51:30.890 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:31 compute-0 nova_compute[186544]: 2025-11-22 07:51:31.033 186548 DEBUG nova.network.neutron [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Successfully updated port: 4d24be20-c58b-47ad-8239-de4de9d7d6b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:51:31 compute-0 nova_compute[186544]: 2025-11-22 07:51:31.074 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:51:31 compute-0 nova_compute[186544]: 2025-11-22 07:51:31.074 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquired lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:51:31 compute-0 nova_compute[186544]: 2025-11-22 07:51:31.074 186548 DEBUG nova.network.neutron [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:51:31 compute-0 nova_compute[186544]: 2025-11-22 07:51:31.202 186548 DEBUG nova.compute.manager [req-39f7f645-bbc2-4542-99c2-a946ef773038 req-462f5112-85de-4904-8b6a-084b9ce6d2a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received event network-changed-4d24be20-c58b-47ad-8239-de4de9d7d6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:51:31 compute-0 nova_compute[186544]: 2025-11-22 07:51:31.202 186548 DEBUG nova.compute.manager [req-39f7f645-bbc2-4542-99c2-a946ef773038 req-462f5112-85de-4904-8b6a-084b9ce6d2a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Refreshing instance network info cache due to event network-changed-4d24be20-c58b-47ad-8239-de4de9d7d6b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:51:31 compute-0 nova_compute[186544]: 2025-11-22 07:51:31.203 186548 DEBUG oslo_concurrency.lockutils [req-39f7f645-bbc2-4542-99c2-a946ef773038 req-462f5112-85de-4904-8b6a-084b9ce6d2a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:51:31 compute-0 nova_compute[186544]: 2025-11-22 07:51:31.959 186548 DEBUG nova.network.neutron [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:51:32 compute-0 podman[221467]: 2025-11-22 07:51:32.406935765 +0000 UTC m=+0.052865056 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 07:51:32 compute-0 podman[221468]: 2025-11-22 07:51:32.438856926 +0000 UTC m=+0.082599273 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.297 186548 DEBUG nova.network.neutron [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updating instance_info_cache with network_info: [{"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.320 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Releasing lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.320 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Instance network_info: |[{"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.321 186548 DEBUG oslo_concurrency.lockutils [req-39f7f645-bbc2-4542-99c2-a946ef773038 req-462f5112-85de-4904-8b6a-084b9ce6d2a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.321 186548 DEBUG nova.network.neutron [req-39f7f645-bbc2-4542-99c2-a946ef773038 req-462f5112-85de-4904-8b6a-084b9ce6d2a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Refreshing network info cache for port 4d24be20-c58b-47ad-8239-de4de9d7d6b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.323 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Start _get_guest_xml network_info=[{"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.328 186548 WARNING nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.334 186548 DEBUG nova.virt.libvirt.host [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.334 186548 DEBUG nova.virt.libvirt.host [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.339 186548 DEBUG nova.virt.libvirt.host [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.340 186548 DEBUG nova.virt.libvirt.host [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.341 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.341 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.341 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.342 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.342 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.342 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.342 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.342 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.343 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.343 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.343 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.343 186548 DEBUG nova.virt.hardware [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.347 186548 DEBUG nova.virt.libvirt.vif [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:51:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1231300254',display_name='tempest-ServersTestManualDisk-server-1231300254',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1231300254',id=53,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjaVG81oZk2kR4+Yt+5vEgyS5SvF8HsdK7yD2bVZ2YZEUbRyPwfN1M/0R61a4k345vlzTYT5mbqHPkz6wNuPJ9rFPTIJ/oz2P6U5sgzPoxWy6PZMayVbdhMfxnk8yWSVg==',key_name='tempest-keypair-287681050',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='08890994b268457d9deaf5460dcbe0d9',ramdisk_id='',reservation_id='r-06tfrv3r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1931640754',owner_user_name='tempest-ServersTestManualDisk-1931640754-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:51:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d4022ae06df548ef8b8b167c80f56ae6',uuid=e193cce7-4100-4385-bdfe-9e256dc10eed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.347 186548 DEBUG nova.network.os_vif_util [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Converting VIF {"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.347 186548 DEBUG nova.network.os_vif_util [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:07:8a,bridge_name='br-int',has_traffic_filtering=True,id=4d24be20-c58b-47ad-8239-de4de9d7d6b7,network=Network(49018757-f317-4b88-bf73-115119e056fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d24be20-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.348 186548 DEBUG nova.objects.instance [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lazy-loading 'pci_devices' on Instance uuid e193cce7-4100-4385-bdfe-9e256dc10eed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.368 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <uuid>e193cce7-4100-4385-bdfe-9e256dc10eed</uuid>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <name>instance-00000035</name>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersTestManualDisk-server-1231300254</nova:name>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:51:33</nova:creationTime>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:51:33 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:51:33 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:51:33 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:51:33 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:51:33 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:51:33 compute-0 nova_compute[186544]:         <nova:user uuid="d4022ae06df548ef8b8b167c80f56ae6">tempest-ServersTestManualDisk-1931640754-project-member</nova:user>
Nov 22 07:51:33 compute-0 nova_compute[186544]:         <nova:project uuid="08890994b268457d9deaf5460dcbe0d9">tempest-ServersTestManualDisk-1931640754</nova:project>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:51:33 compute-0 nova_compute[186544]:         <nova:port uuid="4d24be20-c58b-47ad-8239-de4de9d7d6b7">
Nov 22 07:51:33 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <system>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <entry name="serial">e193cce7-4100-4385-bdfe-9e256dc10eed</entry>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <entry name="uuid">e193cce7-4100-4385-bdfe-9e256dc10eed</entry>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </system>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <os>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   </os>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <features>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   </features>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk.config"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:89:07:8a"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <target dev="tap4d24be20-c5"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/console.log" append="off"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <video>
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </video>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:51:33 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:51:33 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:51:33 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:51:33 compute-0 nova_compute[186544]: </domain>
Nov 22 07:51:33 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.369 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Preparing to wait for external event network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.370 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.370 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.370 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.371 186548 DEBUG nova.virt.libvirt.vif [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:51:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1231300254',display_name='tempest-ServersTestManualDisk-server-1231300254',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1231300254',id=53,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjaVG81oZk2kR4+Yt+5vEgyS5SvF8HsdK7yD2bVZ2YZEUbRyPwfN1M/0R61a4k345vlzTYT5mbqHPkz6wNuPJ9rFPTIJ/oz2P6U5sgzPoxWy6PZMayVbdhMfxnk8yWSVg==',key_name='tempest-keypair-287681050',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='08890994b268457d9deaf5460dcbe0d9',ramdisk_id='',reservation_id='r-06tfrv3r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1931640754',owner_user_name='tempest-ServersTestManualDisk-1931640754-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:51:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d4022ae06df548ef8b8b167c80f56ae6',uuid=e193cce7-4100-4385-bdfe-9e256dc10eed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.371 186548 DEBUG nova.network.os_vif_util [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Converting VIF {"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.372 186548 DEBUG nova.network.os_vif_util [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:07:8a,bridge_name='br-int',has_traffic_filtering=True,id=4d24be20-c58b-47ad-8239-de4de9d7d6b7,network=Network(49018757-f317-4b88-bf73-115119e056fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d24be20-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.372 186548 DEBUG os_vif [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:07:8a,bridge_name='br-int',has_traffic_filtering=True,id=4d24be20-c58b-47ad-8239-de4de9d7d6b7,network=Network(49018757-f317-4b88-bf73-115119e056fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d24be20-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.372 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.373 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.373 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.376 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.376 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d24be20-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.377 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4d24be20-c5, col_values=(('external_ids', {'iface-id': '4d24be20-c58b-47ad-8239-de4de9d7d6b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:07:8a', 'vm-uuid': 'e193cce7-4100-4385-bdfe-9e256dc10eed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.378 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:33 compute-0 NetworkManager[55036]: <info>  [1763797893.3791] manager: (tap4d24be20-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.380 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.384 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.385 186548 INFO os_vif [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:07:8a,bridge_name='br-int',has_traffic_filtering=True,id=4d24be20-c58b-47ad-8239-de4de9d7d6b7,network=Network(49018757-f317-4b88-bf73-115119e056fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d24be20-c5')
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.451 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.451 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.451 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] No VIF found with MAC fa:16:3e:89:07:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.452 186548 INFO nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Using config drive
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.860 186548 INFO nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Creating config drive at /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk.config
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.866 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpirunq8zq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:33 compute-0 nova_compute[186544]: 2025-11-22 07:51:33.994 186548 DEBUG oslo_concurrency.processutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpirunq8zq" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:34 compute-0 kernel: tap4d24be20-c5: entered promiscuous mode
Nov 22 07:51:34 compute-0 NetworkManager[55036]: <info>  [1763797894.0461] manager: (tap4d24be20-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Nov 22 07:51:34 compute-0 ovn_controller[94843]: 2025-11-22T07:51:34Z|00215|binding|INFO|Claiming lport 4d24be20-c58b-47ad-8239-de4de9d7d6b7 for this chassis.
Nov 22 07:51:34 compute-0 ovn_controller[94843]: 2025-11-22T07:51:34Z|00216|binding|INFO|4d24be20-c58b-47ad-8239-de4de9d7d6b7: Claiming fa:16:3e:89:07:8a 10.100.0.4
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.047 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.051 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.053 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.064 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:07:8a 10.100.0.4'], port_security=['fa:16:3e:89:07:8a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e193cce7-4100-4385-bdfe-9e256dc10eed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-49018757-f317-4b88-bf73-115119e056fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08890994b268457d9deaf5460dcbe0d9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5c3eabf9-2a85-4c92-9fef-7d981f48bf3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c300ec6-46f1-4063-af89-ecb9126b007d, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=4d24be20-c58b-47ad-8239-de4de9d7d6b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.065 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 4d24be20-c58b-47ad-8239-de4de9d7d6b7 in datapath 49018757-f317-4b88-bf73-115119e056fe bound to our chassis
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.067 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 49018757-f317-4b88-bf73-115119e056fe
Nov 22 07:51:34 compute-0 systemd-machined[152872]: New machine qemu-27-instance-00000035.
Nov 22 07:51:34 compute-0 systemd-udevd[221533]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.083 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d1380c26-60e7-475c-a8d0-2c15de756ac2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.084 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap49018757-f1 in ovnmeta-49018757-f317-4b88-bf73-115119e056fe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.086 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap49018757-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.086 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c834ac70-63fe-435c-85db-cc91e399d618]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.087 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5f66aeb7-4bec-4512-a21f-1191768f6768]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 NetworkManager[55036]: <info>  [1763797894.0921] device (tap4d24be20-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:51:34 compute-0 NetworkManager[55036]: <info>  [1763797894.0928] device (tap4d24be20-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.099 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[3560e629-3c9d-41b7-8b77-3d84c2cee39b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.106 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:34 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-00000035.
Nov 22 07:51:34 compute-0 ovn_controller[94843]: 2025-11-22T07:51:34Z|00217|binding|INFO|Setting lport 4d24be20-c58b-47ad-8239-de4de9d7d6b7 ovn-installed in OVS
Nov 22 07:51:34 compute-0 ovn_controller[94843]: 2025-11-22T07:51:34Z|00218|binding|INFO|Setting lport 4d24be20-c58b-47ad-8239-de4de9d7d6b7 up in Southbound
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.115 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0a84580b-248f-4c19-b76f-e5a6dcf6ffe1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.144 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9207a878-80c1-4bd8-8907-439a5768f1a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.148 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fbaab0fc-357c-4cfc-adad-8fc241ed074c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 NetworkManager[55036]: <info>  [1763797894.1491] manager: (tap49018757-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/101)
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.180 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[259846ec-8dfb-4e48-93db-1ec5c8ab76bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.183 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[96d93927-f549-46b7-ab56-3acb645b931d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 NetworkManager[55036]: <info>  [1763797894.2175] device (tap49018757-f0): carrier: link connected
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.223 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9477d2c7-5372-4e45-9463-e13b5d9a0f83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.240 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6c78a2-6055-4fa8-9add-3de9df360c01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap49018757-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:92:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463485, 'reachable_time': 34748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221565, 'error': None, 'target': 'ovnmeta-49018757-f317-4b88-bf73-115119e056fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.256 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fab69f6f-c33e-43b5-a01f-e77c1edb6a55]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:9259'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463485, 'tstamp': 463485}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221566, 'error': None, 'target': 'ovnmeta-49018757-f317-4b88-bf73-115119e056fe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.260 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797879.2593863, fccb17bd-32d9-4428-8812-0fbb6f93afa4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.260 186548 INFO nova.compute.manager [-] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] VM Stopped (Lifecycle Event)
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.270 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[59a85fe0-144f-4526-891c-f3bf21525542]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap49018757-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:92:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463485, 'reachable_time': 34748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221567, 'error': None, 'target': 'ovnmeta-49018757-f317-4b88-bf73-115119e056fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.283 186548 DEBUG nova.compute.manager [None req-ae5ae8f9-6af7-49c5-bd8a-eb44fc4bb31b - - - - - -] [instance: fccb17bd-32d9-4428-8812-0fbb6f93afa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.299 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8455c225-5b0d-4abc-8e7a-97229cd71b7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.350 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c172f472-9edc-442a-b579-063db1dcd216]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.351 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49018757-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.351 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.352 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49018757-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:34 compute-0 NetworkManager[55036]: <info>  [1763797894.3540] manager: (tap49018757-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Nov 22 07:51:34 compute-0 kernel: tap49018757-f0: entered promiscuous mode
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.358 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.359 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap49018757-f0, col_values=(('external_ids', {'iface-id': '7cb1b635-834a-4b45-ab15-00c2dd98d137'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:34 compute-0 ovn_controller[94843]: 2025-11-22T07:51:34Z|00219|binding|INFO|Releasing lport 7cb1b635-834a-4b45-ab15-00c2dd98d137 from this chassis (sb_readonly=0)
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.363 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/49018757-f317-4b88-bf73-115119e056fe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/49018757-f317-4b88-bf73-115119e056fe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.364 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[287bec77-b8a6-4491-9640-ca4036a23ca2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.365 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-49018757-f317-4b88-bf73-115119e056fe
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/49018757-f317-4b88-bf73-115119e056fe.pid.haproxy
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 49018757-f317-4b88-bf73-115119e056fe
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:51:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:34.366 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-49018757-f317-4b88-bf73-115119e056fe', 'env', 'PROCESS_TAG=haproxy-49018757-f317-4b88-bf73-115119e056fe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/49018757-f317-4b88-bf73-115119e056fe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.374 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.709 186548 DEBUG nova.compute.manager [req-cd33bbf0-8ffe-4869-88a7-304d301f0eaa req-b421f45a-d7a1-4649-83cf-b367289a6f2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received event network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.709 186548 DEBUG oslo_concurrency.lockutils [req-cd33bbf0-8ffe-4869-88a7-304d301f0eaa req-b421f45a-d7a1-4649-83cf-b367289a6f2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.710 186548 DEBUG oslo_concurrency.lockutils [req-cd33bbf0-8ffe-4869-88a7-304d301f0eaa req-b421f45a-d7a1-4649-83cf-b367289a6f2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.710 186548 DEBUG oslo_concurrency.lockutils [req-cd33bbf0-8ffe-4869-88a7-304d301f0eaa req-b421f45a-d7a1-4649-83cf-b367289a6f2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:34 compute-0 nova_compute[186544]: 2025-11-22 07:51:34.710 186548 DEBUG nova.compute.manager [req-cd33bbf0-8ffe-4869-88a7-304d301f0eaa req-b421f45a-d7a1-4649-83cf-b367289a6f2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Processing event network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:51:34 compute-0 podman[221599]: 2025-11-22 07:51:34.723573127 +0000 UTC m=+0.053887150 container create 517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:51:34 compute-0 systemd[1]: Started libpod-conmon-517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f.scope.
Nov 22 07:51:34 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:51:34 compute-0 podman[221599]: 2025-11-22 07:51:34.690680142 +0000 UTC m=+0.020994195 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:51:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28c6374a67893017b95195cc7a4b8067a3b74bd4e7403f76334245b3f965f3d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:51:34 compute-0 podman[221599]: 2025-11-22 07:51:34.800366528 +0000 UTC m=+0.130680571 container init 517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:51:34 compute-0 podman[221599]: 2025-11-22 07:51:34.806481529 +0000 UTC m=+0.136795552 container start 517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 07:51:34 compute-0 neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe[221615]: [NOTICE]   (221619) : New worker (221621) forked
Nov 22 07:51:34 compute-0 neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe[221615]: [NOTICE]   (221619) : Loading success.
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.144 186548 DEBUG nova.network.neutron [req-39f7f645-bbc2-4542-99c2-a946ef773038 req-462f5112-85de-4904-8b6a-084b9ce6d2a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updated VIF entry in instance network info cache for port 4d24be20-c58b-47ad-8239-de4de9d7d6b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.145 186548 DEBUG nova.network.neutron [req-39f7f645-bbc2-4542-99c2-a946ef773038 req-462f5112-85de-4904-8b6a-084b9ce6d2a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updating instance_info_cache with network_info: [{"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.162 186548 DEBUG oslo_concurrency.lockutils [req-39f7f645-bbc2-4542-99c2-a946ef773038 req-462f5112-85de-4904-8b6a-084b9ce6d2a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.312 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.318 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797895.3183277, e193cce7-4100-4385-bdfe-9e256dc10eed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.319 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] VM Started (Lifecycle Event)
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.320 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.323 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.326 186548 INFO nova.virt.libvirt.driver [-] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Instance spawned successfully.
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.326 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.339 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.345 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.348 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.349 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.349 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.350 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.350 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.351 186548 DEBUG nova.virt.libvirt.driver [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.384 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.384 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797895.3188944, e193cce7-4100-4385-bdfe-9e256dc10eed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.384 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] VM Paused (Lifecycle Event)
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.419 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.423 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797895.3228884, e193cce7-4100-4385-bdfe-9e256dc10eed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.423 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] VM Resumed (Lifecycle Event)
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.443 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.446 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.469 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.797 186548 INFO nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Took 8.11 seconds to spawn the instance on the hypervisor.
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.798 186548 DEBUG nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.915 186548 INFO nova.compute.manager [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Took 8.71 seconds to build instance.
Nov 22 07:51:35 compute-0 nova_compute[186544]: 2025-11-22 07:51:35.944 186548 DEBUG oslo_concurrency.lockutils [None req-c7795bcf-f52e-47af-92ed-5bf66809c86f d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.201 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.201 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.202 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.202 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.262 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.321 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.322 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.378 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.514 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.516 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5647MB free_disk=73.34846878051758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.516 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.517 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.771 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance e193cce7-4100-4385-bdfe-9e256dc10eed actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.772 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.773 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.835 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.848 186548 DEBUG nova.compute.manager [req-513e4672-294e-49a5-8cff-54870f2ed38c req-2ad987fa-7617-4d9c-b684-52c5b3dc8784 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received event network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.849 186548 DEBUG oslo_concurrency.lockutils [req-513e4672-294e-49a5-8cff-54870f2ed38c req-2ad987fa-7617-4d9c-b684-52c5b3dc8784 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.849 186548 DEBUG oslo_concurrency.lockutils [req-513e4672-294e-49a5-8cff-54870f2ed38c req-2ad987fa-7617-4d9c-b684-52c5b3dc8784 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.850 186548 DEBUG oslo_concurrency.lockutils [req-513e4672-294e-49a5-8cff-54870f2ed38c req-2ad987fa-7617-4d9c-b684-52c5b3dc8784 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.850 186548 DEBUG nova.compute.manager [req-513e4672-294e-49a5-8cff-54870f2ed38c req-2ad987fa-7617-4d9c-b684-52c5b3dc8784 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] No waiting events found dispatching network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.850 186548 WARNING nova.compute.manager [req-513e4672-294e-49a5-8cff-54870f2ed38c req-2ad987fa-7617-4d9c-b684-52c5b3dc8784 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received unexpected event network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 for instance with vm_state active and task_state None.
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.852 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.890 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:51:36 compute-0 nova_compute[186544]: 2025-11-22 07:51:36.891 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:37.320 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:37.321 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:37.323 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:37 compute-0 nova_compute[186544]: 2025-11-22 07:51:37.892 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:38 compute-0 nova_compute[186544]: 2025-11-22 07:51:38.380 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:38 compute-0 podman[221644]: 2025-11-22 07:51:38.400571923 +0000 UTC m=+0.051915073 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:51:38 compute-0 nova_compute[186544]: 2025-11-22 07:51:38.736 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:38 compute-0 NetworkManager[55036]: <info>  [1763797898.7378] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Nov 22 07:51:38 compute-0 NetworkManager[55036]: <info>  [1763797898.7388] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Nov 22 07:51:38 compute-0 nova_compute[186544]: 2025-11-22 07:51:38.879 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:38 compute-0 ovn_controller[94843]: 2025-11-22T07:51:38Z|00220|binding|INFO|Releasing lport 7cb1b635-834a-4b45-ab15-00c2dd98d137 from this chassis (sb_readonly=0)
Nov 22 07:51:38 compute-0 nova_compute[186544]: 2025-11-22 07:51:38.901 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.088 186548 DEBUG nova.compute.manager [req-4ada7452-8b21-416c-b454-14523c2acf0c req-41c34b39-48bb-460b-b39c-35e2d9b5c9d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received event network-changed-4d24be20-c58b-47ad-8239-de4de9d7d6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.089 186548 DEBUG nova.compute.manager [req-4ada7452-8b21-416c-b454-14523c2acf0c req-41c34b39-48bb-460b-b39c-35e2d9b5c9d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Refreshing instance network info cache due to event network-changed-4d24be20-c58b-47ad-8239-de4de9d7d6b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.089 186548 DEBUG oslo_concurrency.lockutils [req-4ada7452-8b21-416c-b454-14523c2acf0c req-41c34b39-48bb-460b-b39c-35e2d9b5c9d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.089 186548 DEBUG oslo_concurrency.lockutils [req-4ada7452-8b21-416c-b454-14523c2acf0c req-41c34b39-48bb-460b-b39c-35e2d9b5c9d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.090 186548 DEBUG nova.network.neutron [req-4ada7452-8b21-416c-b454-14523c2acf0c req-41c34b39-48bb-460b-b39c-35e2d9b5c9d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Refreshing network info cache for port 4d24be20-c58b-47ad-8239-de4de9d7d6b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:51:39 compute-0 nova_compute[186544]: 2025-11-22 07:51:39.845 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:51:40 compute-0 nova_compute[186544]: 2025-11-22 07:51:40.314 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:40 compute-0 podman[221669]: 2025-11-22 07:51:40.395211039 +0000 UTC m=+0.048152860 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 22 07:51:41 compute-0 nova_compute[186544]: 2025-11-22 07:51:41.274 186548 DEBUG nova.network.neutron [req-4ada7452-8b21-416c-b454-14523c2acf0c req-41c34b39-48bb-460b-b39c-35e2d9b5c9d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updated VIF entry in instance network info cache for port 4d24be20-c58b-47ad-8239-de4de9d7d6b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:51:41 compute-0 nova_compute[186544]: 2025-11-22 07:51:41.275 186548 DEBUG nova.network.neutron [req-4ada7452-8b21-416c-b454-14523c2acf0c req-41c34b39-48bb-460b-b39c-35e2d9b5c9d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updating instance_info_cache with network_info: [{"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:51:41 compute-0 nova_compute[186544]: 2025-11-22 07:51:41.298 186548 DEBUG oslo_concurrency.lockutils [req-4ada7452-8b21-416c-b454-14523c2acf0c req-41c34b39-48bb-460b-b39c-35e2d9b5c9d7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:51:41 compute-0 nova_compute[186544]: 2025-11-22 07:51:41.299 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:51:41 compute-0 nova_compute[186544]: 2025-11-22 07:51:41.299 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:51:41 compute-0 nova_compute[186544]: 2025-11-22 07:51:41.299 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid e193cce7-4100-4385-bdfe-9e256dc10eed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:43 compute-0 ovn_controller[94843]: 2025-11-22T07:51:43Z|00221|binding|INFO|Releasing lport 7cb1b635-834a-4b45-ab15-00c2dd98d137 from this chassis (sb_readonly=0)
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.364 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.382 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.485 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updating instance_info_cache with network_info: [{"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.506 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-e193cce7-4100-4385-bdfe-9e256dc10eed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.507 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.507 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.508 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.508 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:43 compute-0 nova_compute[186544]: 2025-11-22 07:51:43.508 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:51:45 compute-0 nova_compute[186544]: 2025-11-22 07:51:45.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:45 compute-0 nova_compute[186544]: 2025-11-22 07:51:45.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:45 compute-0 nova_compute[186544]: 2025-11-22 07:51:45.315 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:45 compute-0 podman[221688]: 2025-11-22 07:51:45.402160442 +0000 UTC m=+0.051413780 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 07:51:48 compute-0 nova_compute[186544]: 2025-11-22 07:51:48.385 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:48 compute-0 ovn_controller[94843]: 2025-11-22T07:51:48Z|00222|binding|INFO|Releasing lport 7cb1b635-834a-4b45-ab15-00c2dd98d137 from this chassis (sb_readonly=0)
Nov 22 07:51:48 compute-0 nova_compute[186544]: 2025-11-22 07:51:48.538 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:48 compute-0 ovn_controller[94843]: 2025-11-22T07:51:48Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:89:07:8a 10.100.0.4
Nov 22 07:51:48 compute-0 ovn_controller[94843]: 2025-11-22T07:51:48Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:07:8a 10.100.0.4
Nov 22 07:51:49 compute-0 podman[221723]: 2025-11-22 07:51:49.404050534 +0000 UTC m=+0.054358541 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:51:50 compute-0 nova_compute[186544]: 2025-11-22 07:51:50.317 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:53 compute-0 nova_compute[186544]: 2025-11-22 07:51:53.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:54 compute-0 podman[221747]: 2025-11-22 07:51:54.400129892 +0000 UTC m=+0.054347112 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Nov 22 07:51:55 compute-0 nova_compute[186544]: 2025-11-22 07:51:55.320 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:56 compute-0 nova_compute[186544]: 2025-11-22 07:51:56.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.861 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "e193cce7-4100-4385-bdfe-9e256dc10eed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.861 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.862 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.862 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.862 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.870 186548 INFO nova.compute.manager [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Terminating instance
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.878 186548 DEBUG nova.compute.manager [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:51:57 compute-0 kernel: tap4d24be20-c5 (unregistering): left promiscuous mode
Nov 22 07:51:57 compute-0 NetworkManager[55036]: <info>  [1763797917.9006] device (tap4d24be20-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:51:57 compute-0 ovn_controller[94843]: 2025-11-22T07:51:57Z|00223|binding|INFO|Releasing lport 4d24be20-c58b-47ad-8239-de4de9d7d6b7 from this chassis (sb_readonly=0)
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.908 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:57 compute-0 ovn_controller[94843]: 2025-11-22T07:51:57Z|00224|binding|INFO|Setting lport 4d24be20-c58b-47ad-8239-de4de9d7d6b7 down in Southbound
Nov 22 07:51:57 compute-0 ovn_controller[94843]: 2025-11-22T07:51:57Z|00225|binding|INFO|Removing iface tap4d24be20-c5 ovn-installed in OVS
Nov 22 07:51:57 compute-0 nova_compute[186544]: 2025-11-22 07:51:57.926 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:57 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000035.scope: Deactivated successfully.
Nov 22 07:51:57 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000035.scope: Consumed 14.255s CPU time.
Nov 22 07:51:57 compute-0 systemd-machined[152872]: Machine qemu-27-instance-00000035 terminated.
Nov 22 07:51:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:57.960 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:07:8a 10.100.0.4'], port_security=['fa:16:3e:89:07:8a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e193cce7-4100-4385-bdfe-9e256dc10eed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-49018757-f317-4b88-bf73-115119e056fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08890994b268457d9deaf5460dcbe0d9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5c3eabf9-2a85-4c92-9fef-7d981f48bf3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c300ec6-46f1-4063-af89-ecb9126b007d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=4d24be20-c58b-47ad-8239-de4de9d7d6b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:51:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:57.962 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 4d24be20-c58b-47ad-8239-de4de9d7d6b7 in datapath 49018757-f317-4b88-bf73-115119e056fe unbound from our chassis
Nov 22 07:51:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:57.963 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 49018757-f317-4b88-bf73-115119e056fe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:51:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:57.964 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[15d85d01-2a50-4dc1-a647-628730f2927c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:57.965 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-49018757-f317-4b88-bf73-115119e056fe namespace which is not needed anymore
Nov 22 07:51:58 compute-0 neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe[221615]: [NOTICE]   (221619) : haproxy version is 2.8.14-c23fe91
Nov 22 07:51:58 compute-0 neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe[221615]: [NOTICE]   (221619) : path to executable is /usr/sbin/haproxy
Nov 22 07:51:58 compute-0 neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe[221615]: [WARNING]  (221619) : Exiting Master process...
Nov 22 07:51:58 compute-0 neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe[221615]: [ALERT]    (221619) : Current worker (221621) exited with code 143 (Terminated)
Nov 22 07:51:58 compute-0 neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe[221615]: [WARNING]  (221619) : All workers exited. Exiting... (0)
Nov 22 07:51:58 compute-0 systemd[1]: libpod-517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f.scope: Deactivated successfully.
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.098 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:58 compute-0 podman[221795]: 2025-11-22 07:51:58.10174923 +0000 UTC m=+0.049401550 container died 517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.103 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f-userdata-shm.mount: Deactivated successfully.
Nov 22 07:51:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-28c6374a67893017b95195cc7a4b8067a3b74bd4e7403f76334245b3f965f3d0-merged.mount: Deactivated successfully.
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.140 186548 INFO nova.virt.libvirt.driver [-] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Instance destroyed successfully.
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.140 186548 DEBUG nova.objects.instance [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lazy-loading 'resources' on Instance uuid e193cce7-4100-4385-bdfe-9e256dc10eed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:51:58 compute-0 podman[221795]: 2025-11-22 07:51:58.15317404 +0000 UTC m=+0.100826360 container cleanup 517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.155 186548 DEBUG nova.virt.libvirt.vif [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:51:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1231300254',display_name='tempest-ServersTestManualDisk-server-1231300254',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1231300254',id=53,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjaVG81oZk2kR4+Yt+5vEgyS5SvF8HsdK7yD2bVZ2YZEUbRyPwfN1M/0R61a4k345vlzTYT5mbqHPkz6wNuPJ9rFPTIJ/oz2P6U5sgzPoxWy6PZMayVbdhMfxnk8yWSVg==',key_name='tempest-keypair-287681050',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:51:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='08890994b268457d9deaf5460dcbe0d9',ramdisk_id='',reservation_id='r-06tfrv3r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1931640754',owner_user_name='tempest-ServersTestManualDisk-1931640754-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:51:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d4022ae06df548ef8b8b167c80f56ae6',uuid=e193cce7-4100-4385-bdfe-9e256dc10eed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.156 186548 DEBUG nova.network.os_vif_util [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Converting VIF {"id": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "address": "fa:16:3e:89:07:8a", "network": {"id": "49018757-f317-4b88-bf73-115119e056fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1275890616-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "08890994b268457d9deaf5460dcbe0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d24be20-c5", "ovs_interfaceid": "4d24be20-c58b-47ad-8239-de4de9d7d6b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.156 186548 DEBUG nova.network.os_vif_util [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:07:8a,bridge_name='br-int',has_traffic_filtering=True,id=4d24be20-c58b-47ad-8239-de4de9d7d6b7,network=Network(49018757-f317-4b88-bf73-115119e056fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d24be20-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.157 186548 DEBUG os_vif [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:07:8a,bridge_name='br-int',has_traffic_filtering=True,id=4d24be20-c58b-47ad-8239-de4de9d7d6b7,network=Network(49018757-f317-4b88-bf73-115119e056fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d24be20-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.159 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:58 compute-0 systemd[1]: libpod-conmon-517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f.scope: Deactivated successfully.
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.159 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d24be20-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.161 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.167 186548 INFO os_vif [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:07:8a,bridge_name='br-int',has_traffic_filtering=True,id=4d24be20-c58b-47ad-8239-de4de9d7d6b7,network=Network(49018757-f317-4b88-bf73-115119e056fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d24be20-c5')
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.167 186548 INFO nova.virt.libvirt.driver [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Deleting instance files /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed_del
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.168 186548 INFO nova.virt.libvirt.driver [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Deletion of /var/lib/nova/instances/e193cce7-4100-4385-bdfe-9e256dc10eed_del complete
Nov 22 07:51:58 compute-0 podman[221842]: 2025-11-22 07:51:58.221179906 +0000 UTC m=+0.043606849 container remove 517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.226 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a28b322a-d56c-4f5e-a922-56e0b73c814f]: (4, ('Sat Nov 22 07:51:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe (517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f)\n517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f\nSat Nov 22 07:51:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-49018757-f317-4b88-bf73-115119e056fe (517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f)\n517560db3e59d5dbfde61f9edfd6307e8f23f6ac2c6b785072b5b3e421a9706f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.228 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[81ee62ec-e424-46ae-bda6-9df99c0ac4cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.229 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49018757-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:51:58 compute-0 kernel: tap49018757-f0: left promiscuous mode
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.232 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.241 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.244 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[576e0691-b58a-49f1-80ac-dcdd7b138b5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.261 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[07b5ce0f-4252-4445-8390-cae93d02947f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.262 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[76e80721-6828-4070-b29c-efb5abf88b1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.281 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[783a2414-c57e-458f-8b1a-5ad4f8a0eb4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463477, 'reachable_time': 15396, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221857, 'error': None, 'target': 'ovnmeta-49018757-f317-4b88-bf73-115119e056fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d49018757\x2df317\x2d4b88\x2dbf73\x2d115119e056fe.mount: Deactivated successfully.
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.284 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-49018757-f317-4b88-bf73-115119e056fe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:51:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:51:58.284 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[cd75ff54-8b8b-4f4e-a21e-10fe3fb2fee6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.336 186548 INFO nova.compute.manager [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Took 0.46 seconds to destroy the instance on the hypervisor.
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.337 186548 DEBUG oslo.service.loopingcall [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.337 186548 DEBUG nova.compute.manager [-] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.337 186548 DEBUG nova.network.neutron [-] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.460 186548 DEBUG nova.compute.manager [req-3122c5e9-31c2-47c0-aabc-535925b36645 req-ab089e36-ffa0-4f63-b7ef-dce396cba851 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received event network-vif-unplugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.461 186548 DEBUG oslo_concurrency.lockutils [req-3122c5e9-31c2-47c0-aabc-535925b36645 req-ab089e36-ffa0-4f63-b7ef-dce396cba851 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.462 186548 DEBUG oslo_concurrency.lockutils [req-3122c5e9-31c2-47c0-aabc-535925b36645 req-ab089e36-ffa0-4f63-b7ef-dce396cba851 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.462 186548 DEBUG oslo_concurrency.lockutils [req-3122c5e9-31c2-47c0-aabc-535925b36645 req-ab089e36-ffa0-4f63-b7ef-dce396cba851 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.462 186548 DEBUG nova.compute.manager [req-3122c5e9-31c2-47c0-aabc-535925b36645 req-ab089e36-ffa0-4f63-b7ef-dce396cba851 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] No waiting events found dispatching network-vif-unplugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:51:58 compute-0 nova_compute[186544]: 2025-11-22 07:51:58.462 186548 DEBUG nova.compute.manager [req-3122c5e9-31c2-47c0-aabc-535925b36645 req-ab089e36-ffa0-4f63-b7ef-dce396cba851 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received event network-vif-unplugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:51:59 compute-0 nova_compute[186544]: 2025-11-22 07:51:59.992 186548 DEBUG nova.network.neutron [-] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.004 186548 DEBUG nova.compute.manager [req-b84f0088-0c2d-427b-8252-a77d67bd193d req-63366338-7088-4dd6-b20c-f3329fe6f9f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received event network-vif-deleted-4d24be20-c58b-47ad-8239-de4de9d7d6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.005 186548 INFO nova.compute.manager [req-b84f0088-0c2d-427b-8252-a77d67bd193d req-63366338-7088-4dd6-b20c-f3329fe6f9f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Neutron deleted interface 4d24be20-c58b-47ad-8239-de4de9d7d6b7; detaching it from the instance and deleting it from the info cache
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.005 186548 DEBUG nova.network.neutron [req-b84f0088-0c2d-427b-8252-a77d67bd193d req-63366338-7088-4dd6-b20c-f3329fe6f9f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.090 186548 INFO nova.compute.manager [-] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Took 1.75 seconds to deallocate network for instance.
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.107 186548 DEBUG nova.compute.manager [req-b84f0088-0c2d-427b-8252-a77d67bd193d req-63366338-7088-4dd6-b20c-f3329fe6f9f9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Detach interface failed, port_id=4d24be20-c58b-47ad-8239-de4de9d7d6b7, reason: Instance e193cce7-4100-4385-bdfe-9e256dc10eed could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.192 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.193 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.330 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.580 186548 DEBUG nova.compute.provider_tree [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.596 186548 DEBUG nova.scheduler.client.report [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.622 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.634 186548 DEBUG nova.compute.manager [req-91448d1d-65f9-48d4-8889-463dd19833b1 req-8a7ff18e-c54e-4522-90f5-f7a28b2123c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received event network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.634 186548 DEBUG oslo_concurrency.lockutils [req-91448d1d-65f9-48d4-8889-463dd19833b1 req-8a7ff18e-c54e-4522-90f5-f7a28b2123c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.635 186548 DEBUG oslo_concurrency.lockutils [req-91448d1d-65f9-48d4-8889-463dd19833b1 req-8a7ff18e-c54e-4522-90f5-f7a28b2123c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.635 186548 DEBUG oslo_concurrency.lockutils [req-91448d1d-65f9-48d4-8889-463dd19833b1 req-8a7ff18e-c54e-4522-90f5-f7a28b2123c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.636 186548 DEBUG nova.compute.manager [req-91448d1d-65f9-48d4-8889-463dd19833b1 req-8a7ff18e-c54e-4522-90f5-f7a28b2123c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] No waiting events found dispatching network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.636 186548 WARNING nova.compute.manager [req-91448d1d-65f9-48d4-8889-463dd19833b1 req-8a7ff18e-c54e-4522-90f5-f7a28b2123c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Received unexpected event network-vif-plugged-4d24be20-c58b-47ad-8239-de4de9d7d6b7 for instance with vm_state deleted and task_state None.
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.675 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:00.677 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:52:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:00.677 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.688 186548 INFO nova.scheduler.client.report [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Deleted allocations for instance e193cce7-4100-4385-bdfe-9e256dc10eed
Nov 22 07:52:00 compute-0 nova_compute[186544]: 2025-11-22 07:52:00.799 186548 DEBUG oslo_concurrency.lockutils [None req-dd7ea3e2-93c1-4b40-8a3e-573d189c07b7 d4022ae06df548ef8b8b167c80f56ae6 08890994b268457d9deaf5460dcbe0d9 - - default default] Lock "e193cce7-4100-4385-bdfe-9e256dc10eed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:03 compute-0 nova_compute[186544]: 2025-11-22 07:52:03.163 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:03 compute-0 podman[221858]: 2025-11-22 07:52:03.442432787 +0000 UTC m=+0.089166455 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 07:52:03 compute-0 podman[221859]: 2025-11-22 07:52:03.462679962 +0000 UTC m=+0.104330715 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:52:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:04.680 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:52:05 compute-0 nova_compute[186544]: 2025-11-22 07:52:05.331 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:05 compute-0 nova_compute[186544]: 2025-11-22 07:52:05.662 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:08 compute-0 nova_compute[186544]: 2025-11-22 07:52:08.172 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:09 compute-0 nova_compute[186544]: 2025-11-22 07:52:09.045 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:09 compute-0 nova_compute[186544]: 2025-11-22 07:52:09.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:09 compute-0 podman[221904]: 2025-11-22 07:52:09.405972406 +0000 UTC m=+0.056528205 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:52:10 compute-0 nova_compute[186544]: 2025-11-22 07:52:10.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:11 compute-0 podman[221930]: 2025-11-22 07:52:11.392118215 +0000 UTC m=+0.044616464 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:52:13 compute-0 nova_compute[186544]: 2025-11-22 07:52:13.139 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797918.137137, e193cce7-4100-4385-bdfe-9e256dc10eed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:52:13 compute-0 nova_compute[186544]: 2025-11-22 07:52:13.140 186548 INFO nova.compute.manager [-] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] VM Stopped (Lifecycle Event)
Nov 22 07:52:13 compute-0 nova_compute[186544]: 2025-11-22 07:52:13.161 186548 DEBUG nova.compute.manager [None req-1c41a882-f186-4464-957a-ff557b245db0 - - - - - -] [instance: e193cce7-4100-4385-bdfe-9e256dc10eed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:52:13 compute-0 nova_compute[186544]: 2025-11-22 07:52:13.174 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:15 compute-0 nova_compute[186544]: 2025-11-22 07:52:15.334 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:16 compute-0 podman[221949]: 2025-11-22 07:52:16.436683889 +0000 UTC m=+0.084880289 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 07:52:18 compute-0 nova_compute[186544]: 2025-11-22 07:52:18.177 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:20 compute-0 nova_compute[186544]: 2025-11-22 07:52:20.335 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:20 compute-0 podman[221969]: 2025-11-22 07:52:20.401031301 +0000 UTC m=+0.049378640 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:52:23 compute-0 nova_compute[186544]: 2025-11-22 07:52:23.182 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:25 compute-0 nova_compute[186544]: 2025-11-22 07:52:25.337 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:25 compute-0 podman[221994]: 2025-11-22 07:52:25.399260091 +0000 UTC m=+0.054389923 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Nov 22 07:52:28 compute-0 nova_compute[186544]: 2025-11-22 07:52:28.185 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:30 compute-0 nova_compute[186544]: 2025-11-22 07:52:30.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:33 compute-0 nova_compute[186544]: 2025-11-22 07:52:33.188 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:34 compute-0 podman[222015]: 2025-11-22 07:52:34.399297541 +0000 UTC m=+0.053104712 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:52:34 compute-0 podman[222016]: 2025-11-22 07:52:34.431220652 +0000 UTC m=+0.080748128 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 07:52:35 compute-0 nova_compute[186544]: 2025-11-22 07:52:35.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.209 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.210 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.210 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.210 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.356 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.357 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5739MB free_disk=73.34942626953125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.357 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:36 compute-0 nova_compute[186544]: 2025-11-22 07:52:36.357 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.591 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.592 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:52:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 07:52:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:37.320 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:37.321 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:37.321 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:38 compute-0 nova_compute[186544]: 2025-11-22 07:52:38.000 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:52:38 compute-0 nova_compute[186544]: 2025-11-22 07:52:38.000 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:52:38 compute-0 nova_compute[186544]: 2025-11-22 07:52:38.072 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:52:38 compute-0 nova_compute[186544]: 2025-11-22 07:52:38.092 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:52:38 compute-0 nova_compute[186544]: 2025-11-22 07:52:38.120 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:52:38 compute-0 nova_compute[186544]: 2025-11-22 07:52:38.120 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:38 compute-0 nova_compute[186544]: 2025-11-22 07:52:38.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:40 compute-0 nova_compute[186544]: 2025-11-22 07:52:40.120 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:40 compute-0 nova_compute[186544]: 2025-11-22 07:52:40.121 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:40 compute-0 nova_compute[186544]: 2025-11-22 07:52:40.121 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:52:40 compute-0 nova_compute[186544]: 2025-11-22 07:52:40.122 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:52:40 compute-0 nova_compute[186544]: 2025-11-22 07:52:40.146 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:52:40 compute-0 nova_compute[186544]: 2025-11-22 07:52:40.146 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:40 compute-0 nova_compute[186544]: 2025-11-22 07:52:40.342 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:40 compute-0 podman[222062]: 2025-11-22 07:52:40.40079737 +0000 UTC m=+0.052450115 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:52:41 compute-0 nova_compute[186544]: 2025-11-22 07:52:41.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:41 compute-0 nova_compute[186544]: 2025-11-22 07:52:41.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:52:41 compute-0 nova_compute[186544]: 2025-11-22 07:52:41.176 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:41.176 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:52:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:41.178 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:52:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:52:41.179 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:52:42 compute-0 nova_compute[186544]: 2025-11-22 07:52:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:42 compute-0 podman[222087]: 2025-11-22 07:52:42.398589083 +0000 UTC m=+0.049487004 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 07:52:43 compute-0 nova_compute[186544]: 2025-11-22 07:52:43.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:43 compute-0 nova_compute[186544]: 2025-11-22 07:52:43.195 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:45 compute-0 nova_compute[186544]: 2025-11-22 07:52:45.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:47 compute-0 nova_compute[186544]: 2025-11-22 07:52:47.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:47 compute-0 nova_compute[186544]: 2025-11-22 07:52:47.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:52:47 compute-0 podman[222106]: 2025-11-22 07:52:47.424449999 +0000 UTC m=+0.058941584 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:52:48 compute-0 nova_compute[186544]: 2025-11-22 07:52:48.199 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:48 compute-0 nova_compute[186544]: 2025-11-22 07:52:48.801 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:48 compute-0 nova_compute[186544]: 2025-11-22 07:52:48.801 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:48 compute-0 nova_compute[186544]: 2025-11-22 07:52:48.832 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.000 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.001 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.006 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.007 186548 INFO nova.compute.claims [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.222 186548 DEBUG nova.compute.provider_tree [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.253 186548 DEBUG nova.scheduler.client.report [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.364 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.364 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.501 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.502 186548 DEBUG nova.network.neutron [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.532 186548 INFO nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.566 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:52:49 compute-0 nova_compute[186544]: 2025-11-22 07:52:49.780 186548 DEBUG nova.policy [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '50517a6ccf4348bba7b0ec50f9d90d35', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ed90dd8b47cb449aac7d907fa8dbaa78', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.554 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.555 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.556 186548 INFO nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Creating image(s)
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.556 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "/var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.556 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "/var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.557 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "/var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.572 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.627 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.628 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.628 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.640 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.699 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.701 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.757 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.759 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.759 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.826 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.828 186548 DEBUG nova.virt.disk.api [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Checking if we can resize image /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.828 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.889 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.890 186548 DEBUG nova.virt.disk.api [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Cannot resize image /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.890 186548 DEBUG nova.objects.instance [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e86eb77-e2b9-4437-b772-fbeddd72efb4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.909 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.909 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Ensure instance console log exists: /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.910 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.910 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:50 compute-0 nova_compute[186544]: 2025-11-22 07:52:50.910 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:51 compute-0 podman[222141]: 2025-11-22 07:52:51.405182922 +0000 UTC m=+0.050087986 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:52:52 compute-0 nova_compute[186544]: 2025-11-22 07:52:52.960 186548 DEBUG nova.network.neutron [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Successfully created port: ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:52:53 compute-0 nova_compute[186544]: 2025-11-22 07:52:53.203 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:54 compute-0 nova_compute[186544]: 2025-11-22 07:52:54.767 186548 DEBUG nova.network.neutron [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Successfully updated port: ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:52:54 compute-0 nova_compute[186544]: 2025-11-22 07:52:54.902 186548 DEBUG nova.compute.manager [req-50702604-274f-4261-bffe-3b6d70dd5e13 req-0b2f2f5e-3cbd-45a4-97c3-a06023505c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received event network-changed-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:52:54 compute-0 nova_compute[186544]: 2025-11-22 07:52:54.903 186548 DEBUG nova.compute.manager [req-50702604-274f-4261-bffe-3b6d70dd5e13 req-0b2f2f5e-3cbd-45a4-97c3-a06023505c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Refreshing instance network info cache due to event network-changed-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:52:54 compute-0 nova_compute[186544]: 2025-11-22 07:52:54.903 186548 DEBUG oslo_concurrency.lockutils [req-50702604-274f-4261-bffe-3b6d70dd5e13 req-0b2f2f5e-3cbd-45a4-97c3-a06023505c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-4e86eb77-e2b9-4437-b772-fbeddd72efb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:52:54 compute-0 nova_compute[186544]: 2025-11-22 07:52:54.903 186548 DEBUG oslo_concurrency.lockutils [req-50702604-274f-4261-bffe-3b6d70dd5e13 req-0b2f2f5e-3cbd-45a4-97c3-a06023505c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-4e86eb77-e2b9-4437-b772-fbeddd72efb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:52:54 compute-0 nova_compute[186544]: 2025-11-22 07:52:54.904 186548 DEBUG nova.network.neutron [req-50702604-274f-4261-bffe-3b6d70dd5e13 req-0b2f2f5e-3cbd-45a4-97c3-a06023505c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Refreshing network info cache for port ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:52:55 compute-0 nova_compute[186544]: 2025-11-22 07:52:55.048 186548 DEBUG nova.network.neutron [req-50702604-274f-4261-bffe-3b6d70dd5e13 req-0b2f2f5e-3cbd-45a4-97c3-a06023505c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:52:55 compute-0 nova_compute[186544]: 2025-11-22 07:52:55.346 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:55 compute-0 nova_compute[186544]: 2025-11-22 07:52:55.881 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "refresh_cache-4e86eb77-e2b9-4437-b772-fbeddd72efb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:52:56 compute-0 podman[222165]: 2025-11-22 07:52:56.404332466 +0000 UTC m=+0.054693991 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, 
url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 22 07:52:56 compute-0 nova_compute[186544]: 2025-11-22 07:52:56.407 186548 DEBUG nova.network.neutron [req-50702604-274f-4261-bffe-3b6d70dd5e13 req-0b2f2f5e-3cbd-45a4-97c3-a06023505c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:52:56 compute-0 nova_compute[186544]: 2025-11-22 07:52:56.425 186548 DEBUG oslo_concurrency.lockutils [req-50702604-274f-4261-bffe-3b6d70dd5e13 req-0b2f2f5e-3cbd-45a4-97c3-a06023505c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-4e86eb77-e2b9-4437-b772-fbeddd72efb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:52:56 compute-0 nova_compute[186544]: 2025-11-22 07:52:56.426 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquired lock "refresh_cache-4e86eb77-e2b9-4437-b772-fbeddd72efb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:52:56 compute-0 nova_compute[186544]: 2025-11-22 07:52:56.426 186548 DEBUG nova.network.neutron [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:52:56 compute-0 nova_compute[186544]: 2025-11-22 07:52:56.910 186548 DEBUG nova.network.neutron [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.206 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.595 186548 DEBUG nova.network.neutron [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Updating instance_info_cache with network_info: [{"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.632 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Releasing lock "refresh_cache-4e86eb77-e2b9-4437-b772-fbeddd72efb4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.633 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Instance network_info: |[{"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.635 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Start _get_guest_xml network_info=[{"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.639 186548 WARNING nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.645 186548 DEBUG nova.virt.libvirt.host [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.646 186548 DEBUG nova.virt.libvirt.host [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.654 186548 DEBUG nova.virt.libvirt.host [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.655 186548 DEBUG nova.virt.libvirt.host [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.656 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.656 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.656 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.657 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.657 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.658 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.658 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.658 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.659 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.659 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.660 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.660 186548 DEBUG nova.virt.hardware [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.666 186548 DEBUG nova.virt.libvirt.vif [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:52:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1983858410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1983858410',id=56,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ed90dd8b47cb449aac7d907fa8dbaa78',ramdisk_id='',reservation_id='r-ac554ofj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-1421412495',owner_user_name='tempest-InstanceActionsV221TestJSON-1421412495-project-me
mber'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:52:49Z,user_data=None,user_id='50517a6ccf4348bba7b0ec50f9d90d35',uuid=4e86eb77-e2b9-4437-b772-fbeddd72efb4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.667 186548 DEBUG nova.network.os_vif_util [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Converting VIF {"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.668 186548 DEBUG nova.network.os_vif_util [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:16:e4,bridge_name='br-int',has_traffic_filtering=True,id=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d,network=Network(22d3787f-0fa0-4880-8b94-f25c9c543165),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped12f9d4-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.670 186548 DEBUG nova.objects.instance [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e86eb77-e2b9-4437-b772-fbeddd72efb4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.687 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <uuid>4e86eb77-e2b9-4437-b772-fbeddd72efb4</uuid>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <name>instance-00000038</name>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <nova:name>tempest-InstanceActionsV221TestJSON-server-1983858410</nova:name>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:52:58</nova:creationTime>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:52:58 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:52:58 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:52:58 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:52:58 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:52:58 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:52:58 compute-0 nova_compute[186544]:         <nova:user uuid="50517a6ccf4348bba7b0ec50f9d90d35">tempest-InstanceActionsV221TestJSON-1421412495-project-member</nova:user>
Nov 22 07:52:58 compute-0 nova_compute[186544]:         <nova:project uuid="ed90dd8b47cb449aac7d907fa8dbaa78">tempest-InstanceActionsV221TestJSON-1421412495</nova:project>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:52:58 compute-0 nova_compute[186544]:         <nova:port uuid="ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d">
Nov 22 07:52:58 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <system>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <entry name="serial">4e86eb77-e2b9-4437-b772-fbeddd72efb4</entry>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <entry name="uuid">4e86eb77-e2b9-4437-b772-fbeddd72efb4</entry>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </system>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <os>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   </os>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <features>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   </features>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk.config"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:4f:16:e4"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <target dev="taped12f9d4-2f"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/console.log" append="off"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <video>
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </video>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:52:58 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:52:58 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:52:58 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:52:58 compute-0 nova_compute[186544]: </domain>
Nov 22 07:52:58 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.688 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Preparing to wait for external event network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.688 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.688 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.688 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.689 186548 DEBUG nova.virt.libvirt.vif [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:52:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1983858410',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1983858410',id=56,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ed90dd8b47cb449aac7d907fa8dbaa78',ramdisk_id='',reservation_id='r-ac554ofj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-1421412495',owner_user_name='tempest-InstanceActionsV221TestJSON-1421412495-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:52:49Z,user_data=None,user_id='50517a6ccf4348bba7b0ec50f9d90d35',uuid=4e86eb77-e2b9-4437-b772-fbeddd72efb4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.690 186548 DEBUG nova.network.os_vif_util [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Converting VIF {"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.690 186548 DEBUG nova.network.os_vif_util [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:16:e4,bridge_name='br-int',has_traffic_filtering=True,id=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d,network=Network(22d3787f-0fa0-4880-8b94-f25c9c543165),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped12f9d4-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.691 186548 DEBUG os_vif [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:16:e4,bridge_name='br-int',has_traffic_filtering=True,id=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d,network=Network(22d3787f-0fa0-4880-8b94-f25c9c543165),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped12f9d4-2f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.691 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.691 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.692 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.694 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.694 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped12f9d4-2f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.695 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=taped12f9d4-2f, col_values=(('external_ids', {'iface-id': 'ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:16:e4', 'vm-uuid': '4e86eb77-e2b9-4437-b772-fbeddd72efb4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.696 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:58 compute-0 NetworkManager[55036]: <info>  [1763797978.6967] manager: (taped12f9d4-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.698 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.702 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.703 186548 INFO os_vif [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:16:e4,bridge_name='br-int',has_traffic_filtering=True,id=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d,network=Network(22d3787f-0fa0-4880-8b94-f25c9c543165),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped12f9d4-2f')
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.764 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.765 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.765 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] No VIF found with MAC fa:16:3e:4f:16:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:52:58 compute-0 nova_compute[186544]: 2025-11-22 07:52:58.766 186548 INFO nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Using config drive
Nov 22 07:52:59 compute-0 nova_compute[186544]: 2025-11-22 07:52:59.915 186548 INFO nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Creating config drive at /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk.config
Nov 22 07:52:59 compute-0 nova_compute[186544]: 2025-11-22 07:52:59.921 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbzrx0w7a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.050 186548 DEBUG oslo_concurrency.processutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbzrx0w7a" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:53:00 compute-0 kernel: taped12f9d4-2f: entered promiscuous mode
Nov 22 07:53:00 compute-0 NetworkManager[55036]: <info>  [1763797980.1232] manager: (taped12f9d4-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Nov 22 07:53:00 compute-0 ovn_controller[94843]: 2025-11-22T07:53:00Z|00226|binding|INFO|Claiming lport ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d for this chassis.
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.124 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 ovn_controller[94843]: 2025-11-22T07:53:00Z|00227|binding|INFO|ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d: Claiming fa:16:3e:4f:16:e4 10.100.0.7
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.128 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.157 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:16:e4 10.100.0.7'], port_security=['fa:16:3e:4f:16:e4 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4e86eb77-e2b9-4437-b772-fbeddd72efb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22d3787f-0fa0-4880-8b94-f25c9c543165', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ed90dd8b47cb449aac7d907fa8dbaa78', 'neutron:revision_number': '2', 'neutron:security_group_ids': '20525348-8dc2-4c17-a67b-3dad14b15e3b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1436e329-cc61-4e26-b914-0b0acb14e464, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.158 103805 INFO neutron.agent.ovn.metadata.agent [-] Port ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d in datapath 22d3787f-0fa0-4880-8b94-f25c9c543165 bound to our chassis
Nov 22 07:53:00 compute-0 systemd-machined[152872]: New machine qemu-28-instance-00000038.
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.159 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22d3787f-0fa0-4880-8b94-f25c9c543165
Nov 22 07:53:00 compute-0 systemd-udevd[222209]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.169 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[79db8d12-f874-44e5-a3b1-ca0f5bb2694b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.170 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap22d3787f-01 in ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.172 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap22d3787f-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.172 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[104c296e-4805-4f0a-9817-f07c6de2221b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.173 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0edec909-4e0c-43e2-972f-afdc17dcadc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 NetworkManager[55036]: <info>  [1763797980.1757] device (taped12f9d4-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:53:00 compute-0 NetworkManager[55036]: <info>  [1763797980.1777] device (taped12f9d4-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.184 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[27519935-50af-48cb-b012-1c1539554ea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.190 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 ovn_controller[94843]: 2025-11-22T07:53:00Z|00228|binding|INFO|Setting lport ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d ovn-installed in OVS
Nov 22 07:53:00 compute-0 ovn_controller[94843]: 2025-11-22T07:53:00Z|00229|binding|INFO|Setting lport ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d up in Southbound
Nov 22 07:53:00 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-00000038.
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.194 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.207 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3387e2-0510-4e4c-939c-188de0077d94]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.236 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ee42b8-8945-4a38-8178-b75e6dc4b387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 NetworkManager[55036]: <info>  [1763797980.2440] manager: (tap22d3787f-00): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Nov 22 07:53:00 compute-0 systemd-udevd[222212]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.244 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d7812e7b-13ca-4407-893e-bd85fcc1646e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.278 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[13711f57-bf1d-461e-b2d2-e359fc8b956c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.282 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[477a15cf-2ca3-432e-8c77-a9fc9e0d68de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 NetworkManager[55036]: <info>  [1763797980.3036] device (tap22d3787f-00): carrier: link connected
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.309 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4a0672-87ce-4d31-bd6d-b3d7245f8559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.325 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cade9819-ff3f-43b3-bcc6-554866464cd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22d3787f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:f7:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472094, 'reachable_time': 43163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222242, 'error': None, 'target': 'ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.340 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9d598d9d-622a-4ccd-a95e-5f6be69fbb32]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe01:f7d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 472094, 'tstamp': 472094}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222244, 'error': None, 'target': 'ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.348 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.362 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4904473c-7469-40d8-acb2-88a615cce008]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22d3787f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:f7:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472094, 'reachable_time': 43163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222245, 'error': None, 'target': 'ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.396 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f77adf-02cc-4657-b0fb-9e3efc3ff159]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.460 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd05601-6554-4a5c-88a6-5e9b6f01284a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.462 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22d3787f-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.462 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.463 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22d3787f-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:53:00 compute-0 NetworkManager[55036]: <info>  [1763797980.4651] manager: (tap22d3787f-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Nov 22 07:53:00 compute-0 kernel: tap22d3787f-00: entered promiscuous mode
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.464 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.471 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22d3787f-00, col_values=(('external_ids', {'iface-id': 'e76f65fc-f8af-412a-98d3-f76587af7709'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.472 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 ovn_controller[94843]: 2025-11-22T07:53:00Z|00230|binding|INFO|Releasing lport e76f65fc-f8af-412a-98d3-f76587af7709 from this chassis (sb_readonly=0)
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.476 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22d3787f-0fa0-4880-8b94-f25c9c543165.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22d3787f-0fa0-4880-8b94-f25c9c543165.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.477 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a9549cae-fcdb-41de-be3e-27160cd68a77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.478 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-22d3787f-0fa0-4880-8b94-f25c9c543165
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/22d3787f-0fa0-4880-8b94-f25c9c543165.pid.haproxy
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 22d3787f-0fa0-4880-8b94-f25c9c543165
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:53:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:00.478 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165', 'env', 'PROCESS_TAG=haproxy-22d3787f-0fa0-4880-8b94-f25c9c543165', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/22d3787f-0fa0-4880-8b94-f25c9c543165.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.484 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.545 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797980.5454288, 4e86eb77-e2b9-4437-b772-fbeddd72efb4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.546 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] VM Started (Lifecycle Event)
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.564 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.568 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797980.546338, 4e86eb77-e2b9-4437-b772-fbeddd72efb4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.569 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] VM Paused (Lifecycle Event)
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.586 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.590 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:53:00 compute-0 nova_compute[186544]: 2025-11-22 07:53:00.604 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:53:00 compute-0 podman[222284]: 2025-11-22 07:53:00.825663689 +0000 UTC m=+0.023117147 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:53:00 compute-0 podman[222284]: 2025-11-22 07:53:00.965556965 +0000 UTC m=+0.163010403 container create 9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 07:53:01 compute-0 systemd[1]: Started libpod-conmon-9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694.scope.
Nov 22 07:53:01 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:53:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d42ef0cda38c3ad7f71ec7e9dcd42898f9c69fa89ece26ceed422fc25525e93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:53:01 compute-0 podman[222284]: 2025-11-22 07:53:01.071393087 +0000 UTC m=+0.268846545 container init 9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:53:01 compute-0 podman[222284]: 2025-11-22 07:53:01.078146172 +0000 UTC m=+0.275599610 container start 9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 07:53:01 compute-0 neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165[222299]: [NOTICE]   (222303) : New worker (222305) forked
Nov 22 07:53:01 compute-0 neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165[222299]: [NOTICE]   (222303) : Loading success.
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.287 186548 DEBUG nova.compute.manager [req-6e0f20a7-7c37-4e83-ad69-3c323a0711fe req-1b3fb392-d5ab-4cb6-a565-dd213c98034f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received event network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.288 186548 DEBUG oslo_concurrency.lockutils [req-6e0f20a7-7c37-4e83-ad69-3c323a0711fe req-1b3fb392-d5ab-4cb6-a565-dd213c98034f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.288 186548 DEBUG oslo_concurrency.lockutils [req-6e0f20a7-7c37-4e83-ad69-3c323a0711fe req-1b3fb392-d5ab-4cb6-a565-dd213c98034f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.289 186548 DEBUG oslo_concurrency.lockutils [req-6e0f20a7-7c37-4e83-ad69-3c323a0711fe req-1b3fb392-d5ab-4cb6-a565-dd213c98034f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.289 186548 DEBUG nova.compute.manager [req-6e0f20a7-7c37-4e83-ad69-3c323a0711fe req-1b3fb392-d5ab-4cb6-a565-dd213c98034f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Processing event network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.290 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.296 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763797981.2957647, 4e86eb77-e2b9-4437-b772-fbeddd72efb4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.296 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] VM Resumed (Lifecycle Event)
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.299 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.304 186548 INFO nova.virt.libvirt.driver [-] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Instance spawned successfully.
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.304 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.330 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.337 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.338 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.338 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.339 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.339 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.340 186548 DEBUG nova.virt.libvirt.driver [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.345 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.393 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.549 186548 INFO nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Took 10.99 seconds to spawn the instance on the hypervisor.
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.549 186548 DEBUG nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.690 186548 INFO nova.compute.manager [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Took 12.74 seconds to build instance.
Nov 22 07:53:01 compute-0 nova_compute[186544]: 2025-11-22 07:53:01.746 186548 DEBUG oslo_concurrency.lockutils [None req-1e8fe88b-0c9b-45e3-ae3d-9767989e644a 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:03 compute-0 nova_compute[186544]: 2025-11-22 07:53:03.581 186548 DEBUG nova.compute.manager [req-c9318fa4-c02c-431b-83d5-abe2ccbb1535 req-31cfe451-64e1-4a9e-9ed5-e63ddb52207b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received event network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:53:03 compute-0 nova_compute[186544]: 2025-11-22 07:53:03.582 186548 DEBUG oslo_concurrency.lockutils [req-c9318fa4-c02c-431b-83d5-abe2ccbb1535 req-31cfe451-64e1-4a9e-9ed5-e63ddb52207b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:03 compute-0 nova_compute[186544]: 2025-11-22 07:53:03.582 186548 DEBUG oslo_concurrency.lockutils [req-c9318fa4-c02c-431b-83d5-abe2ccbb1535 req-31cfe451-64e1-4a9e-9ed5-e63ddb52207b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:03 compute-0 nova_compute[186544]: 2025-11-22 07:53:03.582 186548 DEBUG oslo_concurrency.lockutils [req-c9318fa4-c02c-431b-83d5-abe2ccbb1535 req-31cfe451-64e1-4a9e-9ed5-e63ddb52207b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:03 compute-0 nova_compute[186544]: 2025-11-22 07:53:03.582 186548 DEBUG nova.compute.manager [req-c9318fa4-c02c-431b-83d5-abe2ccbb1535 req-31cfe451-64e1-4a9e-9ed5-e63ddb52207b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] No waiting events found dispatching network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:53:03 compute-0 nova_compute[186544]: 2025-11-22 07:53:03.583 186548 WARNING nova.compute.manager [req-c9318fa4-c02c-431b-83d5-abe2ccbb1535 req-31cfe451-64e1-4a9e-9ed5-e63ddb52207b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received unexpected event network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d for instance with vm_state active and task_state None.
Nov 22 07:53:03 compute-0 nova_compute[186544]: 2025-11-22 07:53:03.698 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:05 compute-0 nova_compute[186544]: 2025-11-22 07:53:05.357 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:05 compute-0 podman[222314]: 2025-11-22 07:53:05.408039355 +0000 UTC m=+0.052954547 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:53:05 compute-0 podman[222315]: 2025-11-22 07:53:05.439446025 +0000 UTC m=+0.081836106 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 07:53:08 compute-0 nova_compute[186544]: 2025-11-22 07:53:08.702 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.145 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.146 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.146 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.146 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.147 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.155 186548 INFO nova.compute.manager [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Terminating instance
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.165 186548 DEBUG nova.compute.manager [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:53:09 compute-0 kernel: taped12f9d4-2f (unregistering): left promiscuous mode
Nov 22 07:53:09 compute-0 NetworkManager[55036]: <info>  [1763797989.1888] device (taped12f9d4-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.199 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 ovn_controller[94843]: 2025-11-22T07:53:09Z|00231|binding|INFO|Releasing lport ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d from this chassis (sb_readonly=0)
Nov 22 07:53:09 compute-0 ovn_controller[94843]: 2025-11-22T07:53:09Z|00232|binding|INFO|Setting lport ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d down in Southbound
Nov 22 07:53:09 compute-0 ovn_controller[94843]: 2025-11-22T07:53:09Z|00233|binding|INFO|Removing iface taped12f9d4-2f ovn-installed in OVS
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.200 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.217 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000038.scope: Deactivated successfully.
Nov 22 07:53:09 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000038.scope: Consumed 8.376s CPU time.
Nov 22 07:53:09 compute-0 systemd-machined[152872]: Machine qemu-28-instance-00000038 terminated.
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.391 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.400 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:16:e4 10.100.0.7'], port_security=['fa:16:3e:4f:16:e4 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4e86eb77-e2b9-4437-b772-fbeddd72efb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22d3787f-0fa0-4880-8b94-f25c9c543165', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ed90dd8b47cb449aac7d907fa8dbaa78', 'neutron:revision_number': '4', 'neutron:security_group_ids': '20525348-8dc2-4c17-a67b-3dad14b15e3b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1436e329-cc61-4e26-b914-0b0acb14e464, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.402 103805 INFO neutron.agent.ovn.metadata.agent [-] Port ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d in datapath 22d3787f-0fa0-4880-8b94-f25c9c543165 unbound from our chassis
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.404 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22d3787f-0fa0-4880-8b94-f25c9c543165, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.405 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[99e3dc47-e9c6-40be-b400-7e2367805b36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.405 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165 namespace which is not needed anymore
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.442 186548 INFO nova.virt.libvirt.driver [-] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Instance destroyed successfully.
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.442 186548 DEBUG nova.objects.instance [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lazy-loading 'resources' on Instance uuid 4e86eb77-e2b9-4437-b772-fbeddd72efb4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.479 186548 DEBUG nova.virt.libvirt.vif [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:52:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1983858410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1983858410',id=56,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:53:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ed90dd8b47cb449aac7d907fa8dbaa78',ramdisk_id='',reservation_id='r-ac554ofj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsV221TestJSON-1421412495',owner_user_name='tempest-InstanceActionsV221TestJSON-1421412495-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:53:01Z,user_data=None,user_id='50517a6ccf4348bba7b0ec50f9d90d35',uuid=4e86eb77-e2b9-4437-b772-fbeddd72efb4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.479 186548 DEBUG nova.network.os_vif_util [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Converting VIF {"id": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "address": "fa:16:3e:4f:16:e4", "network": {"id": "22d3787f-0fa0-4880-8b94-f25c9c543165", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-2113755864-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ed90dd8b47cb449aac7d907fa8dbaa78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped12f9d4-2f", "ovs_interfaceid": "ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.480 186548 DEBUG nova.network.os_vif_util [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:16:e4,bridge_name='br-int',has_traffic_filtering=True,id=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d,network=Network(22d3787f-0fa0-4880-8b94-f25c9c543165),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped12f9d4-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.480 186548 DEBUG os_vif [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:16:e4,bridge_name='br-int',has_traffic_filtering=True,id=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d,network=Network(22d3787f-0fa0-4880-8b94-f25c9c543165),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped12f9d4-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.483 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.483 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped12f9d4-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.487 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.489 186548 INFO os_vif [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:16:e4,bridge_name='br-int',has_traffic_filtering=True,id=ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d,network=Network(22d3787f-0fa0-4880-8b94-f25c9c543165),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped12f9d4-2f')
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.490 186548 INFO nova.virt.libvirt.driver [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Deleting instance files /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4_del
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.491 186548 INFO nova.virt.libvirt.driver [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Deletion of /var/lib/nova/instances/4e86eb77-e2b9-4437-b772-fbeddd72efb4_del complete
Nov 22 07:53:09 compute-0 neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165[222299]: [NOTICE]   (222303) : haproxy version is 2.8.14-c23fe91
Nov 22 07:53:09 compute-0 neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165[222299]: [NOTICE]   (222303) : path to executable is /usr/sbin/haproxy
Nov 22 07:53:09 compute-0 neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165[222299]: [WARNING]  (222303) : Exiting Master process...
Nov 22 07:53:09 compute-0 neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165[222299]: [WARNING]  (222303) : Exiting Master process...
Nov 22 07:53:09 compute-0 neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165[222299]: [ALERT]    (222303) : Current worker (222305) exited with code 143 (Terminated)
Nov 22 07:53:09 compute-0 neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165[222299]: [WARNING]  (222303) : All workers exited. Exiting... (0)
Nov 22 07:53:09 compute-0 systemd[1]: libpod-9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694.scope: Deactivated successfully.
Nov 22 07:53:09 compute-0 podman[222397]: 2025-11-22 07:53:09.592313293 +0000 UTC m=+0.093885130 container died 9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.647 186548 INFO nova.compute.manager [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Took 0.48 seconds to destroy the instance on the hypervisor.
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.647 186548 DEBUG oslo.service.loopingcall [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.647 186548 DEBUG nova.compute.manager [-] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.648 186548 DEBUG nova.network.neutron [-] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:53:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694-userdata-shm.mount: Deactivated successfully.
Nov 22 07:53:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-2d42ef0cda38c3ad7f71ec7e9dcd42898f9c69fa89ece26ceed422fc25525e93-merged.mount: Deactivated successfully.
Nov 22 07:53:09 compute-0 podman[222397]: 2025-11-22 07:53:09.757315444 +0000 UTC m=+0.258887281 container cleanup 9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:53:09 compute-0 systemd[1]: libpod-conmon-9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694.scope: Deactivated successfully.
Nov 22 07:53:09 compute-0 podman[222426]: 2025-11-22 07:53:09.934558483 +0000 UTC m=+0.147051061 container remove 9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.940 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5b93a4-85c4-471d-b36f-0eac974a3621]: (4, ('Sat Nov 22 07:53:09 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165 (9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694)\n9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694\nSat Nov 22 07:53:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165 (9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694)\n9196370ef9843822665c6b788280ee30db482a8e53d94f1b7520ecdf8a11b694\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.942 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8afb82d7-944a-4c2a-8ce9-d688a451f9e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.944 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22d3787f-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.945 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 kernel: tap22d3787f-00: left promiscuous mode
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.959 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 nova_compute[186544]: 2025-11-22 07:53:09.960 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.962 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a89c7e3b-762f-426e-8346-c418db47db47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.978 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6d09721f-d001-4aa7-8ec5-417a96d2f084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.980 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[93c48fed-46a1-43d3-8edc-84ff54245211]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:09.996 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[561240cb-e9d4-4f20-b1d0-fa22e2bda7b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472086, 'reachable_time': 41273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222441, 'error': None, 'target': 'ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:10.000 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-22d3787f-0fa0-4880-8b94-f25c9c543165 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:53:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:10.000 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[22987cec-d32e-42e1-a66f-1c0c57b82714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:53:10 compute-0 systemd[1]: run-netns-ovnmeta\x2d22d3787f\x2d0fa0\x2d4880\x2d8b94\x2df25c9c543165.mount: Deactivated successfully.
Nov 22 07:53:10 compute-0 nova_compute[186544]: 2025-11-22 07:53:10.358 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:11 compute-0 podman[222442]: 2025-11-22 07:53:11.414048155 +0000 UTC m=+0.057519920 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 07:53:12 compute-0 nova_compute[186544]: 2025-11-22 07:53:12.020 186548 DEBUG nova.compute.manager [req-1d6a7327-2095-488e-9b21-76483710b8a4 req-1c865ae7-7a57-47ed-a544-5e2265ffe4a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received event network-vif-unplugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:53:12 compute-0 nova_compute[186544]: 2025-11-22 07:53:12.021 186548 DEBUG oslo_concurrency.lockutils [req-1d6a7327-2095-488e-9b21-76483710b8a4 req-1c865ae7-7a57-47ed-a544-5e2265ffe4a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:12 compute-0 nova_compute[186544]: 2025-11-22 07:53:12.021 186548 DEBUG oslo_concurrency.lockutils [req-1d6a7327-2095-488e-9b21-76483710b8a4 req-1c865ae7-7a57-47ed-a544-5e2265ffe4a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:12 compute-0 nova_compute[186544]: 2025-11-22 07:53:12.021 186548 DEBUG oslo_concurrency.lockutils [req-1d6a7327-2095-488e-9b21-76483710b8a4 req-1c865ae7-7a57-47ed-a544-5e2265ffe4a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:12 compute-0 nova_compute[186544]: 2025-11-22 07:53:12.021 186548 DEBUG nova.compute.manager [req-1d6a7327-2095-488e-9b21-76483710b8a4 req-1c865ae7-7a57-47ed-a544-5e2265ffe4a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] No waiting events found dispatching network-vif-unplugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:53:12 compute-0 nova_compute[186544]: 2025-11-22 07:53:12.022 186548 DEBUG nova.compute.manager [req-1d6a7327-2095-488e-9b21-76483710b8a4 req-1c865ae7-7a57-47ed-a544-5e2265ffe4a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received event network-vif-unplugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:53:13 compute-0 podman[222466]: 2025-11-22 07:53:13.402209723 +0000 UTC m=+0.052367413 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.257 186548 DEBUG nova.network.neutron [-] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.383 186548 DEBUG nova.compute.manager [req-23c1fd9b-850b-4ef1-8a45-24b7081908e9 req-8fa5e7b7-6957-4e26-bfdf-a8b8df302fde 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received event network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.384 186548 DEBUG oslo_concurrency.lockutils [req-23c1fd9b-850b-4ef1-8a45-24b7081908e9 req-8fa5e7b7-6957-4e26-bfdf-a8b8df302fde 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.384 186548 DEBUG oslo_concurrency.lockutils [req-23c1fd9b-850b-4ef1-8a45-24b7081908e9 req-8fa5e7b7-6957-4e26-bfdf-a8b8df302fde 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.384 186548 DEBUG oslo_concurrency.lockutils [req-23c1fd9b-850b-4ef1-8a45-24b7081908e9 req-8fa5e7b7-6957-4e26-bfdf-a8b8df302fde 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.384 186548 DEBUG nova.compute.manager [req-23c1fd9b-850b-4ef1-8a45-24b7081908e9 req-8fa5e7b7-6957-4e26-bfdf-a8b8df302fde 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] No waiting events found dispatching network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.384 186548 WARNING nova.compute.manager [req-23c1fd9b-850b-4ef1-8a45-24b7081908e9 req-8fa5e7b7-6957-4e26-bfdf-a8b8df302fde 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received unexpected event network-vif-plugged-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d for instance with vm_state active and task_state deleting.
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.485 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.593 186548 INFO nova.compute.manager [-] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Took 4.95 seconds to deallocate network for instance.
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.954 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:14 compute-0 nova_compute[186544]: 2025-11-22 07:53:14.954 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:15 compute-0 nova_compute[186544]: 2025-11-22 07:53:15.092 186548 DEBUG nova.compute.provider_tree [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:53:15 compute-0 nova_compute[186544]: 2025-11-22 07:53:15.103 186548 DEBUG nova.compute.manager [req-6813b071-e8e8-43aa-9c31-90947bd1de3e req-05cb68db-f696-4e3e-8c54-0f77aaf297ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Received event network-vif-deleted-ed12f9d4-2fd7-40a2-96c1-c9ddb1d4a09d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:53:15 compute-0 nova_compute[186544]: 2025-11-22 07:53:15.114 186548 DEBUG nova.scheduler.client.report [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:53:15 compute-0 nova_compute[186544]: 2025-11-22 07:53:15.185 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:15 compute-0 nova_compute[186544]: 2025-11-22 07:53:15.248 186548 INFO nova.scheduler.client.report [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Deleted allocations for instance 4e86eb77-e2b9-4437-b772-fbeddd72efb4
Nov 22 07:53:15 compute-0 nova_compute[186544]: 2025-11-22 07:53:15.361 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:15 compute-0 nova_compute[186544]: 2025-11-22 07:53:15.507 186548 DEBUG oslo_concurrency.lockutils [None req-fed156e0-5a54-49e7-a8d6-462f8ad2be08 50517a6ccf4348bba7b0ec50f9d90d35 ed90dd8b47cb449aac7d907fa8dbaa78 - - default default] Lock "4e86eb77-e2b9-4437-b772-fbeddd72efb4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:18 compute-0 podman[222485]: 2025-11-22 07:53:18.41457657 +0000 UTC m=+0.062795979 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:53:19 compute-0 nova_compute[186544]: 2025-11-22 07:53:19.487 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:20 compute-0 nova_compute[186544]: 2025-11-22 07:53:20.363 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:22.186 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:53:22 compute-0 nova_compute[186544]: 2025-11-22 07:53:22.187 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:22.187 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:53:22 compute-0 podman[222505]: 2025-11-22 07:53:22.451827346 +0000 UTC m=+0.094575186 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:53:24 compute-0 nova_compute[186544]: 2025-11-22 07:53:24.440 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763797989.4388728, 4e86eb77-e2b9-4437-b772-fbeddd72efb4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:53:24 compute-0 nova_compute[186544]: 2025-11-22 07:53:24.440 186548 INFO nova.compute.manager [-] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] VM Stopped (Lifecycle Event)
Nov 22 07:53:24 compute-0 nova_compute[186544]: 2025-11-22 07:53:24.460 186548 DEBUG nova.compute.manager [None req-d08c5322-2be8-47d1-8517-dde2ac66d351 - - - - - -] [instance: 4e86eb77-e2b9-4437-b772-fbeddd72efb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:53:24 compute-0 nova_compute[186544]: 2025-11-22 07:53:24.490 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:25 compute-0 nova_compute[186544]: 2025-11-22 07:53:25.364 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:27 compute-0 podman[222530]: 2025-11-22 07:53:27.398154476 +0000 UTC m=+0.052355483 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 22 07:53:29 compute-0 nova_compute[186544]: 2025-11-22 07:53:29.493 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:30 compute-0 nova_compute[186544]: 2025-11-22 07:53:30.366 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:32.190 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:53:34 compute-0 nova_compute[186544]: 2025-11-22 07:53:34.494 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:35 compute-0 nova_compute[186544]: 2025-11-22 07:53:35.367 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:36 compute-0 podman[222550]: 2025-11-22 07:53:36.42043553 +0000 UTC m=+0.066115060 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:53:36 compute-0 podman[222551]: 2025-11-22 07:53:36.444300185 +0000 UTC m=+0.087232997 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:53:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:37.321 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:37.322 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:53:37.322 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.195 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.354 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.355 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5750MB free_disk=73.34941482543945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.356 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.356 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.584 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.584 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.682 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.703 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.772 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:53:38 compute-0 nova_compute[186544]: 2025-11-22 07:53:38.772 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:39 compute-0 nova_compute[186544]: 2025-11-22 07:53:39.497 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:39 compute-0 nova_compute[186544]: 2025-11-22 07:53:39.767 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:39 compute-0 nova_compute[186544]: 2025-11-22 07:53:39.767 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:39 compute-0 nova_compute[186544]: 2025-11-22 07:53:39.767 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:53:39 compute-0 nova_compute[186544]: 2025-11-22 07:53:39.768 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:53:39 compute-0 nova_compute[186544]: 2025-11-22 07:53:39.780 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:53:39 compute-0 nova_compute[186544]: 2025-11-22 07:53:39.780 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:40 compute-0 nova_compute[186544]: 2025-11-22 07:53:40.368 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:42 compute-0 nova_compute[186544]: 2025-11-22 07:53:42.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:42 compute-0 nova_compute[186544]: 2025-11-22 07:53:42.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:53:42 compute-0 podman[222594]: 2025-11-22 07:53:42.399111352 +0000 UTC m=+0.048448998 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:53:43 compute-0 nova_compute[186544]: 2025-11-22 07:53:43.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:43 compute-0 nova_compute[186544]: 2025-11-22 07:53:43.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:44 compute-0 nova_compute[186544]: 2025-11-22 07:53:44.041 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:44 compute-0 podman[222620]: 2025-11-22 07:53:44.398612627 +0000 UTC m=+0.046680844 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 07:53:44 compute-0 nova_compute[186544]: 2025-11-22 07:53:44.498 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:45 compute-0 nova_compute[186544]: 2025-11-22 07:53:45.370 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:46 compute-0 nova_compute[186544]: 2025-11-22 07:53:46.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:46 compute-0 nova_compute[186544]: 2025-11-22 07:53:46.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 07:53:46 compute-0 nova_compute[186544]: 2025-11-22 07:53:46.204 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 07:53:48 compute-0 nova_compute[186544]: 2025-11-22 07:53:48.202 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:49 compute-0 nova_compute[186544]: 2025-11-22 07:53:49.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:49 compute-0 podman[222639]: 2025-11-22 07:53:49.3990177 +0000 UTC m=+0.051620864 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 07:53:49 compute-0 nova_compute[186544]: 2025-11-22 07:53:49.501 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:50 compute-0 nova_compute[186544]: 2025-11-22 07:53:50.373 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:50 compute-0 nova_compute[186544]: 2025-11-22 07:53:50.718 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:50 compute-0 nova_compute[186544]: 2025-11-22 07:53:50.718 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:50 compute-0 nova_compute[186544]: 2025-11-22 07:53:50.746 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:53:50 compute-0 nova_compute[186544]: 2025-11-22 07:53:50.919 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:50 compute-0 nova_compute[186544]: 2025-11-22 07:53:50.920 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:50 compute-0 nova_compute[186544]: 2025-11-22 07:53:50.928 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:53:50 compute-0 nova_compute[186544]: 2025-11-22 07:53:50.928 186548 INFO nova.compute.claims [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.683 186548 DEBUG nova.compute.provider_tree [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.717 186548 DEBUG nova.scheduler.client.report [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.776 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.777 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.847 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.848 186548 DEBUG nova.network.neutron [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.953 186548 INFO nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:53:51 compute-0 nova_compute[186544]: 2025-11-22 07:53:51.996 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.207 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.209 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.209 186548 INFO nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Creating image(s)
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.210 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.210 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.211 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.223 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.282 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.283 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.284 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.297 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.356 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.357 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.736 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk 1073741824" returned: 0 in 0.379s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.738 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.738 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.805 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.807 186548 DEBUG nova.virt.disk.api [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Checking if we can resize image /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.807 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.868 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.869 186548 DEBUG nova.virt.disk.api [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Cannot resize image /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.870 186548 DEBUG nova.objects.instance [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'migration_context' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.896 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.897 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Ensure instance console log exists: /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.897 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.898 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:53:52 compute-0 nova_compute[186544]: 2025-11-22 07:53:52.898 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:53:53 compute-0 nova_compute[186544]: 2025-11-22 07:53:53.037 186548 DEBUG nova.policy [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:53:53 compute-0 podman[222674]: 2025-11-22 07:53:53.397655641 +0000 UTC m=+0.052267670 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:53:54 compute-0 nova_compute[186544]: 2025-11-22 07:53:54.502 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:55 compute-0 nova_compute[186544]: 2025-11-22 07:53:55.374 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:53:56 compute-0 nova_compute[186544]: 2025-11-22 07:53:56.474 186548 DEBUG nova.network.neutron [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Successfully created port: a038edb6-47af-4f7e-9f5e-715660b6da32 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:53:57 compute-0 nova_compute[186544]: 2025-11-22 07:53:57.926 186548 DEBUG nova.network.neutron [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Successfully updated port: a038edb6-47af-4f7e-9f5e-715660b6da32 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:53:57 compute-0 nova_compute[186544]: 2025-11-22 07:53:57.965 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:53:57 compute-0 nova_compute[186544]: 2025-11-22 07:53:57.966 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:53:57 compute-0 nova_compute[186544]: 2025-11-22 07:53:57.966 186548 DEBUG nova.network.neutron [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:53:58 compute-0 podman[222698]: 2025-11-22 07:53:58.003477112 +0000 UTC m=+0.052538768 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 07:53:58 compute-0 nova_compute[186544]: 2025-11-22 07:53:58.098 186548 DEBUG nova.compute.manager [req-98bf11c5-1446-4fdb-8c32-1a059185cbbf req-656ffb2c-0597-47ed-93ee-a0a41afc9177 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-changed-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:53:58 compute-0 nova_compute[186544]: 2025-11-22 07:53:58.098 186548 DEBUG nova.compute.manager [req-98bf11c5-1446-4fdb-8c32-1a059185cbbf req-656ffb2c-0597-47ed-93ee-a0a41afc9177 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Refreshing instance network info cache due to event network-changed-a038edb6-47af-4f7e-9f5e-715660b6da32. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:53:58 compute-0 nova_compute[186544]: 2025-11-22 07:53:58.098 186548 DEBUG oslo_concurrency.lockutils [req-98bf11c5-1446-4fdb-8c32-1a059185cbbf req-656ffb2c-0597-47ed-93ee-a0a41afc9177 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:53:58 compute-0 nova_compute[186544]: 2025-11-22 07:53:58.596 186548 DEBUG nova.network.neutron [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:53:59 compute-0 nova_compute[186544]: 2025-11-22 07:53:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:59 compute-0 nova_compute[186544]: 2025-11-22 07:53:59.199 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:53:59 compute-0 nova_compute[186544]: 2025-11-22 07:53:59.504 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.375 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.626 186548 DEBUG nova.network.neutron [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating instance_info_cache with network_info: [{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.659 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.660 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance network_info: |[{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.660 186548 DEBUG oslo_concurrency.lockutils [req-98bf11c5-1446-4fdb-8c32-1a059185cbbf req-656ffb2c-0597-47ed-93ee-a0a41afc9177 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.661 186548 DEBUG nova.network.neutron [req-98bf11c5-1446-4fdb-8c32-1a059185cbbf req-656ffb2c-0597-47ed-93ee-a0a41afc9177 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Refreshing network info cache for port a038edb6-47af-4f7e-9f5e-715660b6da32 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.665 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Start _get_guest_xml network_info=[{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.670 186548 WARNING nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.681 186548 DEBUG nova.virt.libvirt.host [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.681 186548 DEBUG nova.virt.libvirt.host [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.685 186548 DEBUG nova.virt.libvirt.host [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.686 186548 DEBUG nova.virt.libvirt.host [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.687 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.688 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.688 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.688 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.689 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.689 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.689 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.689 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.690 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.690 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.690 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.690 186548 DEBUG nova.virt.hardware [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.694 186548 DEBUG nova.virt.libvirt.vif [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:53:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1711924286',display_name='tempest-DeleteServersTestJSON-server-1711924286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1711924286',id=59,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-8pf23wx3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:53:52Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=c64e78b6-87b2-425c-aef9-771bcd042d58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.694 186548 DEBUG nova.network.os_vif_util [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.695 186548 DEBUG nova.network.os_vif_util [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.696 186548 DEBUG nova.objects.instance [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'pci_devices' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.710 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <uuid>c64e78b6-87b2-425c-aef9-771bcd042d58</uuid>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <name>instance-0000003b</name>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <nova:name>tempest-DeleteServersTestJSON-server-1711924286</nova:name>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:54:00</nova:creationTime>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:54:00 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:54:00 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:54:00 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:54:00 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:54:00 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:54:00 compute-0 nova_compute[186544]:         <nova:user uuid="57077a1511bf46d897beb6fd5eedfa67">tempest-DeleteServersTestJSON-550712359-project-member</nova:user>
Nov 22 07:54:00 compute-0 nova_compute[186544]:         <nova:project uuid="6b68db2b61a54aeaa8ac219f44ed3e75">tempest-DeleteServersTestJSON-550712359</nova:project>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:54:00 compute-0 nova_compute[186544]:         <nova:port uuid="a038edb6-47af-4f7e-9f5e-715660b6da32">
Nov 22 07:54:00 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <system>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <entry name="serial">c64e78b6-87b2-425c-aef9-771bcd042d58</entry>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <entry name="uuid">c64e78b6-87b2-425c-aef9-771bcd042d58</entry>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </system>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <os>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   </os>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <features>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   </features>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:36:ab:fc"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <target dev="tapa038edb6-47"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/console.log" append="off"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <video>
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </video>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:54:00 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:54:00 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:54:00 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:54:00 compute-0 nova_compute[186544]: </domain>
Nov 22 07:54:00 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.711 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Preparing to wait for external event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.711 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.712 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.712 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.713 186548 DEBUG nova.virt.libvirt.vif [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:53:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1711924286',display_name='tempest-DeleteServersTestJSON-server-1711924286',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1711924286',id=59,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-8pf23wx3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:53:52Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=c64e78b6-87b2-425c-aef9-771bcd042d58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.713 186548 DEBUG nova.network.os_vif_util [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.714 186548 DEBUG nova.network.os_vif_util [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.714 186548 DEBUG os_vif [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.715 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.715 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.716 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.718 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.718 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa038edb6-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.719 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa038edb6-47, col_values=(('external_ids', {'iface-id': 'a038edb6-47af-4f7e-9f5e-715660b6da32', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:ab:fc', 'vm-uuid': 'c64e78b6-87b2-425c-aef9-771bcd042d58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.720 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:00 compute-0 NetworkManager[55036]: <info>  [1763798040.7214] manager: (tapa038edb6-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.723 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.727 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.728 186548 INFO os_vif [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47')
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.962 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.963 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.963 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] No VIF found with MAC fa:16:3e:36:ab:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:54:00 compute-0 nova_compute[186544]: 2025-11-22 07:54:00.964 186548 INFO nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Using config drive
Nov 22 07:54:01 compute-0 nova_compute[186544]: 2025-11-22 07:54:01.916 186548 INFO nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Creating config drive at /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config
Nov 22 07:54:01 compute-0 nova_compute[186544]: 2025-11-22 07:54:01.922 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxb0yapkw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.050 186548 DEBUG oslo_concurrency.processutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxb0yapkw" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:02 compute-0 kernel: tapa038edb6-47: entered promiscuous mode
Nov 22 07:54:02 compute-0 NetworkManager[55036]: <info>  [1763798042.1011] manager: (tapa038edb6-47): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Nov 22 07:54:02 compute-0 ovn_controller[94843]: 2025-11-22T07:54:02Z|00234|binding|INFO|Claiming lport a038edb6-47af-4f7e-9f5e-715660b6da32 for this chassis.
Nov 22 07:54:02 compute-0 ovn_controller[94843]: 2025-11-22T07:54:02Z|00235|binding|INFO|a038edb6-47af-4f7e-9f5e-715660b6da32: Claiming fa:16:3e:36:ab:fc 10.100.0.7
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.102 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.106 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:02 compute-0 systemd-udevd[222735]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.138 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:ab:fc 10.100.0.7'], port_security=['fa:16:3e:36:ab:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a038edb6-47af-4f7e-9f5e-715660b6da32) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.139 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a038edb6-47af-4f7e-9f5e-715660b6da32 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c bound to our chassis
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.140 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 07:54:02 compute-0 NetworkManager[55036]: <info>  [1763798042.1450] device (tapa038edb6-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:54:02 compute-0 NetworkManager[55036]: <info>  [1763798042.1461] device (tapa038edb6-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.152 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7d218135-bbf2-4306-a345-705af043fa80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.153 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5e910dbb-21 in ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.154 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5e910dbb-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.154 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d437906e-1b2d-4894-9f17-e5d5e703d372]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.155 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[98b2ba9e-4dcf-44d1-8991-00c021a0f50e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.162 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:02 compute-0 systemd-machined[152872]: New machine qemu-29-instance-0000003b.
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.167 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[2151f69a-33db-4bcb-b055-d5be488fbd0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_controller[94843]: 2025-11-22T07:54:02Z|00236|binding|INFO|Setting lport a038edb6-47af-4f7e-9f5e-715660b6da32 ovn-installed in OVS
Nov 22 07:54:02 compute-0 ovn_controller[94843]: 2025-11-22T07:54:02Z|00237|binding|INFO|Setting lport a038edb6-47af-4f7e-9f5e-715660b6da32 up in Southbound
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.175 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:02 compute-0 systemd[1]: Started Virtual Machine qemu-29-instance-0000003b.
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.191 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2120ca-12ac-4992-bbc0-6904657006c3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.219 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[3dbd5cbe-ce8f-42a4-8852-7834f4a96982]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 NetworkManager[55036]: <info>  [1763798042.2254] manager: (tap5e910dbb-20): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.224 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[08040261-abdc-42ab-ab39-74667a2edd35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.253 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e15889d6-a225-45d9-a14e-09550f7b15ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.257 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd4668d-5e74-46c7-a09d-53cf1126c84e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 NetworkManager[55036]: <info>  [1763798042.2787] device (tap5e910dbb-20): carrier: link connected
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.282 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[cf04a233-95c2-422a-928a-0da0852f4d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.297 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[267246b3-58c8-483b-a531-fdea5b8e5091]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478291, 'reachable_time': 44484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222771, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.311 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[da5d0ac2-91c8-4050-b840-463a6f1ac74f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:e859'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 478291, 'tstamp': 478291}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222772, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.325 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[13bcdacf-6a06-41e7-9379-f2acf5dfecda]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e910dbb-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:e8:59'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478291, 'reachable_time': 44484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222773, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.354 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7312aa14-f8b3-4ac9-8e88-69d056b4fab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.410 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9d90e3-ffd4-43cd-a4e2-c0d2d03d85cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.411 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.412 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.412 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e910dbb-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:02 compute-0 kernel: tap5e910dbb-20: entered promiscuous mode
Nov 22 07:54:02 compute-0 NetworkManager[55036]: <info>  [1763798042.4155] manager: (tap5e910dbb-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.417 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.419 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5e910dbb-20, col_values=(('external_ids', {'iface-id': 'df80c07a-3ea3-4dde-8219-31b028a556e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:02 compute-0 ovn_controller[94843]: 2025-11-22T07:54:02Z|00238|binding|INFO|Releasing lport df80c07a-3ea3-4dde-8219-31b028a556e5 from this chassis (sb_readonly=0)
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.425 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.426 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.426 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d0dde738-7871-4397-9604-88944dcb1e78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.427 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/5e910dbb-27d1-4915-8b74-d0538d33c33c.pid.haproxy
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 5e910dbb-27d1-4915-8b74-d0538d33c33c
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:54:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:02.428 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'env', 'PROCESS_TAG=haproxy-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5e910dbb-27d1-4915-8b74-d0538d33c33c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.436 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:02 compute-0 podman[222805]: 2025-11-22 07:54:02.767156479 +0000 UTC m=+0.023198789 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:54:02 compute-0 podman[222805]: 2025-11-22 07:54:02.913601425 +0000 UTC m=+0.169643705 container create a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:54:02 compute-0 systemd[1]: Started libpod-conmon-a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc.scope.
Nov 22 07:54:02 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.980 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798042.9805663, c64e78b6-87b2-425c-aef9-771bcd042d58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:02 compute-0 nova_compute[186544]: 2025-11-22 07:54:02.982 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] VM Started (Lifecycle Event)
Nov 22 07:54:03 compute-0 nova_compute[186544]: 2025-11-22 07:54:03.017 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40d1337b8ae49f8ee59a9c81d5608c21316ebe7af834dfb9fa4b12f3360e89d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:54:03 compute-0 nova_compute[186544]: 2025-11-22 07:54:03.024 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798042.9812994, c64e78b6-87b2-425c-aef9-771bcd042d58 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:03 compute-0 nova_compute[186544]: 2025-11-22 07:54:03.025 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] VM Paused (Lifecycle Event)
Nov 22 07:54:03 compute-0 nova_compute[186544]: 2025-11-22 07:54:03.063 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:03 compute-0 nova_compute[186544]: 2025-11-22 07:54:03.067 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:54:03 compute-0 podman[222805]: 2025-11-22 07:54:03.085571517 +0000 UTC m=+0.341613837 container init a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 07:54:03 compute-0 podman[222805]: 2025-11-22 07:54:03.093216673 +0000 UTC m=+0.349258963 container start a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 07:54:03 compute-0 nova_compute[186544]: 2025-11-22 07:54:03.105 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:54:03 compute-0 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[222827]: [NOTICE]   (222831) : New worker (222833) forked
Nov 22 07:54:03 compute-0 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[222827]: [NOTICE]   (222831) : Loading success.
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.201 186548 DEBUG nova.compute.manager [req-13274868-978f-4a1a-afc8-5387be0ef709 req-adb271fc-0566-491e-b594-c61cd218edd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.202 186548 DEBUG oslo_concurrency.lockutils [req-13274868-978f-4a1a-afc8-5387be0ef709 req-adb271fc-0566-491e-b594-c61cd218edd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.203 186548 DEBUG oslo_concurrency.lockutils [req-13274868-978f-4a1a-afc8-5387be0ef709 req-adb271fc-0566-491e-b594-c61cd218edd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.203 186548 DEBUG oslo_concurrency.lockutils [req-13274868-978f-4a1a-afc8-5387be0ef709 req-adb271fc-0566-491e-b594-c61cd218edd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.203 186548 DEBUG nova.compute.manager [req-13274868-978f-4a1a-afc8-5387be0ef709 req-adb271fc-0566-491e-b594-c61cd218edd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Processing event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.204 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.207 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798044.207097, c64e78b6-87b2-425c-aef9-771bcd042d58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.207 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] VM Resumed (Lifecycle Event)
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.210 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.214 186548 INFO nova.virt.libvirt.driver [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance spawned successfully.
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.215 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.255 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.262 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.265 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.266 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.266 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.267 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.267 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.268 186548 DEBUG nova.virt.libvirt.driver [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.306 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.371 186548 INFO nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Took 12.16 seconds to spawn the instance on the hypervisor.
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.372 186548 DEBUG nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.469 186548 INFO nova.compute.manager [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Took 13.59 seconds to build instance.
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.501 186548 DEBUG oslo_concurrency.lockutils [None req-5692497e-e3b8-4320-92a8-8c131d312a17 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.916 186548 DEBUG nova.network.neutron [req-98bf11c5-1446-4fdb-8c32-1a059185cbbf req-656ffb2c-0597-47ed-93ee-a0a41afc9177 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updated VIF entry in instance network info cache for port a038edb6-47af-4f7e-9f5e-715660b6da32. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.916 186548 DEBUG nova.network.neutron [req-98bf11c5-1446-4fdb-8c32-1a059185cbbf req-656ffb2c-0597-47ed-93ee-a0a41afc9177 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating instance_info_cache with network_info: [{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:04 compute-0 nova_compute[186544]: 2025-11-22 07:54:04.938 186548 DEBUG oslo_concurrency.lockutils [req-98bf11c5-1446-4fdb-8c32-1a059185cbbf req-656ffb2c-0597-47ed-93ee-a0a41afc9177 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:05 compute-0 nova_compute[186544]: 2025-11-22 07:54:05.377 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:05 compute-0 nova_compute[186544]: 2025-11-22 07:54:05.720 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.059 186548 DEBUG nova.compute.manager [req-25199ca4-72dc-46a2-ba18-7a825b61fcf2 req-fe3462ae-d6ad-4cb5-9b92-93cfeb7899bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.060 186548 DEBUG oslo_concurrency.lockutils [req-25199ca4-72dc-46a2-ba18-7a825b61fcf2 req-fe3462ae-d6ad-4cb5-9b92-93cfeb7899bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.060 186548 DEBUG oslo_concurrency.lockutils [req-25199ca4-72dc-46a2-ba18-7a825b61fcf2 req-fe3462ae-d6ad-4cb5-9b92-93cfeb7899bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.061 186548 DEBUG oslo_concurrency.lockutils [req-25199ca4-72dc-46a2-ba18-7a825b61fcf2 req-fe3462ae-d6ad-4cb5-9b92-93cfeb7899bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.061 186548 DEBUG nova.compute.manager [req-25199ca4-72dc-46a2-ba18-7a825b61fcf2 req-fe3462ae-d6ad-4cb5-9b92-93cfeb7899bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.061 186548 WARNING nova.compute.manager [req-25199ca4-72dc-46a2-ba18-7a825b61fcf2 req-fe3462ae-d6ad-4cb5-9b92-93cfeb7899bf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state active and task_state None.
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.401 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.402 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:07 compute-0 podman[222842]: 2025-11-22 07:54:07.414577457 +0000 UTC m=+0.061107236 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.431 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:54:07 compute-0 podman[222843]: 2025-11-22 07:54:07.440752559 +0000 UTC m=+0.086345386 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, tcib_managed=true)
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.557 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.558 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.569 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.570 186548 INFO nova.compute.claims [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.764 186548 DEBUG nova.compute.provider_tree [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.792 186548 DEBUG nova.scheduler.client.report [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.850 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.851 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.986 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:54:07 compute-0 nova_compute[186544]: 2025-11-22 07:54:07.986 186548 DEBUG nova.network.neutron [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.006 186548 INFO nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.041 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.196 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.198 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.198 186548 INFO nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Creating image(s)
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.199 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.199 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.200 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.213 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.269 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.270 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.270 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.281 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.340 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.341 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.480 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk 1073741824" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.481 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.481 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.544 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.545 186548 DEBUG nova.virt.disk.api [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Checking if we can resize image /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.545 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.601 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.602 186548 DEBUG nova.virt.disk.api [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Cannot resize image /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.603 186548 DEBUG nova.objects.instance [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'migration_context' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.623 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.623 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Ensure instance console log exists: /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.624 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.624 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:08 compute-0 nova_compute[186544]: 2025-11-22 07:54:08.625 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:09 compute-0 nova_compute[186544]: 2025-11-22 07:54:09.088 186548 DEBUG nova.policy [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:54:10 compute-0 nova_compute[186544]: 2025-11-22 07:54:10.380 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:10 compute-0 nova_compute[186544]: 2025-11-22 07:54:10.602 186548 DEBUG nova.network.neutron [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Successfully created port: bb707b30-96a3-4e7c-abad-b85fc5a10938 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:54:10 compute-0 nova_compute[186544]: 2025-11-22 07:54:10.722 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:13 compute-0 nova_compute[186544]: 2025-11-22 07:54:13.149 186548 DEBUG nova.network.neutron [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Successfully updated port: bb707b30-96a3-4e7c-abad-b85fc5a10938 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:54:13 compute-0 nova_compute[186544]: 2025-11-22 07:54:13.276 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:13 compute-0 nova_compute[186544]: 2025-11-22 07:54:13.277 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquired lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:13 compute-0 nova_compute[186544]: 2025-11-22 07:54:13.277 186548 DEBUG nova.network.neutron [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:54:13 compute-0 podman[222899]: 2025-11-22 07:54:13.402989516 +0000 UTC m=+0.051084501 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:54:13 compute-0 nova_compute[186544]: 2025-11-22 07:54:13.562 186548 DEBUG nova.compute.manager [req-96e1d628-fd1d-424c-b454-db8234d3ba3f req-b2777408-0882-4416-9b46-cc57b623af71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-changed-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:13 compute-0 nova_compute[186544]: 2025-11-22 07:54:13.562 186548 DEBUG nova.compute.manager [req-96e1d628-fd1d-424c-b454-db8234d3ba3f req-b2777408-0882-4416-9b46-cc57b623af71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Refreshing instance network info cache due to event network-changed-bb707b30-96a3-4e7c-abad-b85fc5a10938. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:54:13 compute-0 nova_compute[186544]: 2025-11-22 07:54:13.563 186548 DEBUG oslo_concurrency.lockutils [req-96e1d628-fd1d-424c-b454-db8234d3ba3f req-b2777408-0882-4416-9b46-cc57b623af71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:14 compute-0 nova_compute[186544]: 2025-11-22 07:54:14.039 186548 DEBUG nova.network.neutron [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:54:14 compute-0 nova_compute[186544]: 2025-11-22 07:54:14.527 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:14.527 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:54:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:14.528 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:54:15 compute-0 nova_compute[186544]: 2025-11-22 07:54:15.330 186548 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:15 compute-0 nova_compute[186544]: 2025-11-22 07:54:15.330 186548 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:15 compute-0 nova_compute[186544]: 2025-11-22 07:54:15.331 186548 DEBUG nova.network.neutron [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:54:15 compute-0 nova_compute[186544]: 2025-11-22 07:54:15.381 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:15 compute-0 podman[222923]: 2025-11-22 07:54:15.401542849 +0000 UTC m=+0.047957075 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 07:54:15 compute-0 nova_compute[186544]: 2025-11-22 07:54:15.723 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:16.531 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.054 186548 DEBUG nova.network.neutron [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Updating instance_info_cache with network_info: [{"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.112 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Releasing lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.112 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance network_info: |[{"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.113 186548 DEBUG oslo_concurrency.lockutils [req-96e1d628-fd1d-424c-b454-db8234d3ba3f req-b2777408-0882-4416-9b46-cc57b623af71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.113 186548 DEBUG nova.network.neutron [req-96e1d628-fd1d-424c-b454-db8234d3ba3f req-b2777408-0882-4416-9b46-cc57b623af71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Refreshing network info cache for port bb707b30-96a3-4e7c-abad-b85fc5a10938 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.116 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Start _get_guest_xml network_info=[{"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.121 186548 WARNING nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.126 186548 DEBUG nova.virt.libvirt.host [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.127 186548 DEBUG nova.virt.libvirt.host [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.136 186548 DEBUG nova.virt.libvirt.host [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.137 186548 DEBUG nova.virt.libvirt.host [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.138 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.138 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.139 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.139 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.140 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.140 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.140 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.140 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.141 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.141 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.141 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.142 186548 DEBUG nova.virt.hardware [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.145 186548 DEBUG nova.virt.libvirt.vif [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-30500190',display_name='tempest-ListServerFiltersTestJSON-instance-30500190',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-30500190',id=62,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ykmudqwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:08Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=47f049f6-6113-427d-ba1e-f849dfec302b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.146 186548 DEBUG nova.network.os_vif_util [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.147 186548 DEBUG nova.network.os_vif_util [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.147 186548 DEBUG nova.objects.instance [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.159 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <uuid>47f049f6-6113-427d-ba1e-f849dfec302b</uuid>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <name>instance-0000003e</name>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-30500190</nova:name>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:54:17</nova:creationTime>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:54:17 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:54:17 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:54:17 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:54:17 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:54:17 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:54:17 compute-0 nova_compute[186544]:         <nova:user uuid="6d9b8aa760ed4afdbf24f9deb5d29190">tempest-ListServerFiltersTestJSON-1217253496-project-member</nova:user>
Nov 22 07:54:17 compute-0 nova_compute[186544]:         <nova:project uuid="b4ca2b2e65ac4bf8b3d14f3310a3a7bf">tempest-ListServerFiltersTestJSON-1217253496</nova:project>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:54:17 compute-0 nova_compute[186544]:         <nova:port uuid="bb707b30-96a3-4e7c-abad-b85fc5a10938">
Nov 22 07:54:17 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <system>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <entry name="serial">47f049f6-6113-427d-ba1e-f849dfec302b</entry>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <entry name="uuid">47f049f6-6113-427d-ba1e-f849dfec302b</entry>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </system>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <os>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   </os>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <features>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   </features>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.config"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:ee:30:e0"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <target dev="tapbb707b30-96"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/console.log" append="off"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <video>
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </video>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:54:17 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:54:17 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:54:17 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:54:17 compute-0 nova_compute[186544]: </domain>
Nov 22 07:54:17 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.160 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Preparing to wait for external event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.161 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.161 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.161 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.162 186548 DEBUG nova.virt.libvirt.vif [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-30500190',display_name='tempest-ListServerFiltersTestJSON-instance-30500190',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-30500190',id=62,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ykmudqwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:08Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=47f049f6-6113-427d-ba1e-f849dfec302b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.162 186548 DEBUG nova.network.os_vif_util [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.163 186548 DEBUG nova.network.os_vif_util [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.163 186548 DEBUG os_vif [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.164 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.165 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.168 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb707b30-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.168 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb707b30-96, col_values=(('external_ids', {'iface-id': 'bb707b30-96a3-4e7c-abad-b85fc5a10938', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:30:e0', 'vm-uuid': '47f049f6-6113-427d-ba1e-f849dfec302b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:17 compute-0 NetworkManager[55036]: <info>  [1763798057.1721] manager: (tapbb707b30-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.173 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.179 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:17 compute-0 nova_compute[186544]: 2025-11-22 07:54:17.180 186548 INFO os_vif [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96')
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.169 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.271 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.272 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.272 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] No VIF found with MAC fa:16:3e:ee:30:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.273 186548 INFO nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Using config drive
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.280 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid c64e78b6-87b2-425c-aef9-771bcd042d58 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.281 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid 47f049f6-6113-427d-ba1e-f849dfec302b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.282 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.282 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.282 186548 INFO nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] During sync_power_state the instance has a pending task (resize_prep). Skip.
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.283 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:18 compute-0 nova_compute[186544]: 2025-11-22 07:54:18.283 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:20 compute-0 ovn_controller[94843]: 2025-11-22T07:54:20Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:ab:fc 10.100.0.7
Nov 22 07:54:20 compute-0 ovn_controller[94843]: 2025-11-22T07:54:20Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:ab:fc 10.100.0.7
Nov 22 07:54:20 compute-0 nova_compute[186544]: 2025-11-22 07:54:20.383 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:20 compute-0 podman[222964]: 2025-11-22 07:54:20.438799235 +0000 UTC m=+0.087787071 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:54:20 compute-0 nova_compute[186544]: 2025-11-22 07:54:20.830 186548 INFO nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Creating config drive at /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.config
Nov 22 07:54:20 compute-0 nova_compute[186544]: 2025-11-22 07:54:20.837 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpux13mpc8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:20 compute-0 nova_compute[186544]: 2025-11-22 07:54:20.966 186548 DEBUG oslo_concurrency.processutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpux13mpc8" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:21 compute-0 kernel: tapbb707b30-96: entered promiscuous mode
Nov 22 07:54:21 compute-0 NetworkManager[55036]: <info>  [1763798061.0381] manager: (tapbb707b30-96): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Nov 22 07:54:21 compute-0 ovn_controller[94843]: 2025-11-22T07:54:21Z|00239|binding|INFO|Claiming lport bb707b30-96a3-4e7c-abad-b85fc5a10938 for this chassis.
Nov 22 07:54:21 compute-0 ovn_controller[94843]: 2025-11-22T07:54:21Z|00240|binding|INFO|bb707b30-96a3-4e7c-abad-b85fc5a10938: Claiming fa:16:3e:ee:30:e0 10.100.0.4
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.043 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.046 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:21 compute-0 systemd-udevd[222999]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:54:21 compute-0 ovn_controller[94843]: 2025-11-22T07:54:21Z|00241|binding|INFO|Setting lport bb707b30-96a3-4e7c-abad-b85fc5a10938 ovn-installed in OVS
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.081 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:21 compute-0 NetworkManager[55036]: <info>  [1763798061.0838] device (tapbb707b30-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:54:21 compute-0 NetworkManager[55036]: <info>  [1763798061.0850] device (tapbb707b30-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:54:21 compute-0 systemd-machined[152872]: New machine qemu-30-instance-0000003e.
Nov 22 07:54:21 compute-0 systemd[1]: Started Virtual Machine qemu-30-instance-0000003e.
Nov 22 07:54:21 compute-0 ovn_controller[94843]: 2025-11-22T07:54:21Z|00242|binding|INFO|Setting lport bb707b30-96a3-4e7c-abad-b85fc5a10938 up in Southbound
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.139 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:30:e0 10.100.0.4'], port_security=['fa:16:3e:ee:30:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cd63e957-ae08-4ca1-9eb9-8ce253173257', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13b92379-ae34-491c-b971-1757bc6e8c79, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=bb707b30-96a3-4e7c-abad-b85fc5a10938) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.141 103805 INFO neutron.agent.ovn.metadata.agent [-] Port bb707b30-96a3-4e7c-abad-b85fc5a10938 in datapath 62930ff4-55a3-4e08-8229-5532aa7dcaed bound to our chassis
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.143 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.158 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7a9e56-d2d6-4f8f-8df8-a27f4fcef35d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.159 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62930ff4-51 in ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.163 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62930ff4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.163 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a8426eef-5523-4ad8-bcfb-cd97ca792d06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.164 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ae13fe-5211-4c1b-b8c4-2d29e5284087]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.177 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[45e2259a-c5f3-4351-9e09-ea17ad49627c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.190 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[41e18eb3-885c-40a8-910b-8e9b057e33cf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.226 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[158eee1b-70d9-48ab-ab77-333297f4982d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 NetworkManager[55036]: <info>  [1763798061.2341] manager: (tap62930ff4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/115)
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.233 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9147c2-e588-4da9-9ff8-91f34301416d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.266 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb308da-1930-4061-b9cd-a51b0f289d6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.271 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0c884b-4f11-481d-841a-c1c2ed767067]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 NetworkManager[55036]: <info>  [1763798061.2975] device (tap62930ff4-50): carrier: link connected
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.303 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9e9fd1-c252-4b2c-97f1-f09da1e32c6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.321 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[656e2572-f531-4077-a019-0a6986ee79df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62930ff4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:07:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480193, 'reachable_time': 40575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223035, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.336 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[62d3364e-57c1-4354-9d8e-4705fb2b7150]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:714'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480193, 'tstamp': 480193}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223036, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.341 186548 DEBUG nova.network.neutron [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating instance_info_cache with network_info: [{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.354 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee5aa8e-a13d-4927-bffa-03bce3976dac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62930ff4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:07:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480193, 'reachable_time': 40575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223037, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.393 186548 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.397 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[99578f76-f8b5-4a10-b137-05f5457fa29d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.464 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[db89c43b-27a4-4be2-ac07-465be37a9b92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.465 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62930ff4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.466 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.466 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62930ff4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:21 compute-0 NetworkManager[55036]: <info>  [1763798061.4690] manager: (tap62930ff4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Nov 22 07:54:21 compute-0 kernel: tap62930ff4-50: entered promiscuous mode
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.468 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.471 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62930ff4-50, col_values=(('external_ids', {'iface-id': '02324e7a-c5bf-443b-a6e3-5a1cdac9fee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.472 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:21 compute-0 ovn_controller[94843]: 2025-11-22T07:54:21Z|00243|binding|INFO|Releasing lport 02324e7a-c5bf-443b-a6e3-5a1cdac9fee4 from this chassis (sb_readonly=0)
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.485 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.486 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[601d34f3-f8ba-4bbf-bb5e-3ba213e46d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.487 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.488 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:21.489 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'env', 'PROCESS_TAG=haproxy-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62930ff4-55a3-4e08-8229-5532aa7dcaed.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.669 186548 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.669 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Creating file /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/d878ac13d09c43ba94591d90fbddccfb.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.670 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/d878ac13d09c43ba94591d90fbddccfb.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.801 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798061.8005168, 47f049f6-6113-427d-ba1e-f849dfec302b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.801 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] VM Started (Lifecycle Event)
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.840 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.844 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798061.800773, 47f049f6-6113-427d-ba1e-f849dfec302b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.845 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] VM Paused (Lifecycle Event)
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.895 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.900 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:54:21 compute-0 nova_compute[186544]: 2025-11-22 07:54:21.930 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:54:21 compute-0 podman[223077]: 2025-11-22 07:54:21.872777172 +0000 UTC m=+0.023959397 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:54:22 compute-0 podman[223077]: 2025-11-22 07:54:22.099598437 +0000 UTC m=+0.250780622 container create 48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.171 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/d878ac13d09c43ba94591d90fbddccfb.tmp" returned: 1 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.171 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/d878ac13d09c43ba94591d90fbddccfb.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.172 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Creating directory /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.172 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:22 compute-0 systemd[1]: Started libpod-conmon-48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91.scope.
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.190 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:22 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:54:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a49b81ccd000c9683b17101de0d626a3787b59eaabe4be7512774e72966087c3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:54:22 compute-0 podman[223077]: 2025-11-22 07:54:22.253739622 +0000 UTC m=+0.404921837 container init 48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:54:22 compute-0 podman[223077]: 2025-11-22 07:54:22.260550839 +0000 UTC m=+0.411733034 container start 48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:54:22 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223093]: [NOTICE]   (223097) : New worker (223099) forked
Nov 22 07:54:22 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223093]: [NOTICE]   (223097) : Loading success.
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.316 186548 DEBUG nova.network.neutron [req-96e1d628-fd1d-424c-b454-db8234d3ba3f req-b2777408-0882-4416-9b46-cc57b623af71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Updated VIF entry in instance network info cache for port bb707b30-96a3-4e7c-abad-b85fc5a10938. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.316 186548 DEBUG nova.network.neutron [req-96e1d628-fd1d-424c-b454-db8234d3ba3f req-b2777408-0882-4416-9b46-cc57b623af71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Updating instance_info_cache with network_info: [{"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.341 186548 DEBUG oslo_concurrency.lockutils [req-96e1d628-fd1d-424c-b454-db8234d3ba3f req-b2777408-0882-4416-9b46-cc57b623af71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.393 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:22 compute-0 nova_compute[186544]: 2025-11-22 07:54:22.397 186548 DEBUG nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.519 186548 DEBUG nova.compute.manager [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.520 186548 DEBUG oslo_concurrency.lockutils [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.520 186548 DEBUG oslo_concurrency.lockutils [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.521 186548 DEBUG oslo_concurrency.lockutils [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.521 186548 DEBUG nova.compute.manager [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Processing event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.521 186548 DEBUG nova.compute.manager [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.521 186548 DEBUG oslo_concurrency.lockutils [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.522 186548 DEBUG oslo_concurrency.lockutils [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.522 186548 DEBUG oslo_concurrency.lockutils [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.522 186548 DEBUG nova.compute.manager [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] No waiting events found dispatching network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.522 186548 WARNING nova.compute.manager [req-3c56d5ed-a2a8-4774-98eb-39ac9b1976b6 req-111cedfc-bd68-467e-a945-1f7e8fc80c1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received unexpected event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 for instance with vm_state building and task_state spawning.
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.523 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.526 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798063.5257797, 47f049f6-6113-427d-ba1e-f849dfec302b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.526 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] VM Resumed (Lifecycle Event)
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.528 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.531 186548 INFO nova.virt.libvirt.driver [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance spawned successfully.
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.531 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.550 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.554 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.554 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.555 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.555 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.556 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.556 186548 DEBUG nova.virt.libvirt.driver [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.562 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.601 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.695 186548 INFO nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Took 15.50 seconds to spawn the instance on the hypervisor.
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.696 186548 DEBUG nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.828 186548 INFO nova.compute.manager [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Took 16.32 seconds to build instance.
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.926 186548 DEBUG oslo_concurrency.lockutils [None req-ac9ff3eb-71f9-4738-9c7c-c06e5a3cd9ec 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.926 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "47f049f6-6113-427d-ba1e-f849dfec302b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 5.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.927 186548 INFO nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:54:23 compute-0 nova_compute[186544]: 2025-11-22 07:54:23.927 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "47f049f6-6113-427d-ba1e-f849dfec302b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:24 compute-0 podman[223108]: 2025-11-22 07:54:24.436848334 +0000 UTC m=+0.083175408 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:54:24 compute-0 kernel: tapa038edb6-47 (unregistering): left promiscuous mode
Nov 22 07:54:24 compute-0 NetworkManager[55036]: <info>  [1763798064.6571] device (tapa038edb6-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:54:24 compute-0 nova_compute[186544]: 2025-11-22 07:54:24.673 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:24 compute-0 ovn_controller[94843]: 2025-11-22T07:54:24Z|00244|binding|INFO|Releasing lport a038edb6-47af-4f7e-9f5e-715660b6da32 from this chassis (sb_readonly=0)
Nov 22 07:54:24 compute-0 ovn_controller[94843]: 2025-11-22T07:54:24Z|00245|binding|INFO|Setting lport a038edb6-47af-4f7e-9f5e-715660b6da32 down in Southbound
Nov 22 07:54:24 compute-0 ovn_controller[94843]: 2025-11-22T07:54:24Z|00246|binding|INFO|Removing iface tapa038edb6-47 ovn-installed in OVS
Nov 22 07:54:24 compute-0 nova_compute[186544]: 2025-11-22 07:54:24.675 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:24 compute-0 nova_compute[186544]: 2025-11-22 07:54:24.686 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:24 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Nov 22 07:54:24 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003b.scope: Consumed 15.734s CPU time.
Nov 22 07:54:24 compute-0 systemd-machined[152872]: Machine qemu-29-instance-0000003b terminated.
Nov 22 07:54:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:24.857 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:ab:fc 10.100.0.7'], port_security=['fa:16:3e:36:ab:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd8cd7544-2677-4974-86a3-a18d0c107043', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7bb67d1a-54cf-4f4c-900a-e9306bad2f5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a038edb6-47af-4f7e-9f5e-715660b6da32) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:54:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:24.859 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a038edb6-47af-4f7e-9f5e-715660b6da32 in datapath 5e910dbb-27d1-4915-8b74-d0538d33c33c unbound from our chassis
Nov 22 07:54:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:24.861 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e910dbb-27d1-4915-8b74-d0538d33c33c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:54:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:24.861 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3820fc8e-7ed2-414b-97d7-25aa40ca8f07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:24.862 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c namespace which is not needed anymore
Nov 22 07:54:24 compute-0 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[222827]: [NOTICE]   (222831) : haproxy version is 2.8.14-c23fe91
Nov 22 07:54:24 compute-0 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[222827]: [NOTICE]   (222831) : path to executable is /usr/sbin/haproxy
Nov 22 07:54:24 compute-0 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[222827]: [WARNING]  (222831) : Exiting Master process...
Nov 22 07:54:24 compute-0 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[222827]: [WARNING]  (222831) : Exiting Master process...
Nov 22 07:54:25 compute-0 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[222827]: [ALERT]    (222831) : Current worker (222833) exited with code 143 (Terminated)
Nov 22 07:54:25 compute-0 neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c[222827]: [WARNING]  (222831) : All workers exited. Exiting... (0)
Nov 22 07:54:25 compute-0 systemd[1]: libpod-a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc.scope: Deactivated successfully.
Nov 22 07:54:25 compute-0 conmon[222827]: conmon a9d759575593ccb1a688 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc.scope/container/memory.events
Nov 22 07:54:25 compute-0 podman[223173]: 2025-11-22 07:54:25.009299332 +0000 UTC m=+0.056062023 container died a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:54:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-40d1337b8ae49f8ee59a9c81d5608c21316ebe7af834dfb9fa4b12f3360e89d4-merged.mount: Deactivated successfully.
Nov 22 07:54:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc-userdata-shm.mount: Deactivated successfully.
Nov 22 07:54:25 compute-0 podman[223173]: 2025-11-22 07:54:25.088744888 +0000 UTC m=+0.135507569 container cleanup a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 07:54:25 compute-0 systemd[1]: libpod-conmon-a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc.scope: Deactivated successfully.
Nov 22 07:54:25 compute-0 podman[223201]: 2025-11-22 07:54:25.16399511 +0000 UTC m=+0.054347992 container remove a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.170 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac2a787-6897-4d76-ac7d-a976628c9926]: (4, ('Sat Nov 22 07:54:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc)\na9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc\nSat Nov 22 07:54:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c (a9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc)\na9d759575593ccb1a6883d2ec23e43a71f3917589ed24b1bba435891ef5758dc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.172 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0fbc349a-596f-41d3-9ce1-a013ff3f381a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.173 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e910dbb-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.174 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:25 compute-0 kernel: tap5e910dbb-20: left promiscuous mode
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.190 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.194 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[45951492-99bd-4833-bc9c-3d33cd316612]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.208 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[182db09e-5a54-4ca3-af73-23185beb2566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.210 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[25e96dce-9d83-4537-8277-d349335b6ea7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.225 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[843664aa-d375-4913-b636-e015f464f27f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478285, 'reachable_time': 20290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223217, 'error': None, 'target': 'ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.227 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5e910dbb-27d1-4915-8b74-d0538d33c33c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:25.227 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[42d8fd5e-6db6-4436-892a-9f63bd4317b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:25 compute-0 systemd[1]: run-netns-ovnmeta\x2d5e910dbb\x2d27d1\x2d4915\x2d8b74\x2dd0538d33c33c.mount: Deactivated successfully.
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.385 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.412 186548 INFO nova.virt.libvirt.driver [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance shutdown successfully after 3 seconds.
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.417 186548 INFO nova.virt.libvirt.driver [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Instance destroyed successfully.
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.418 186548 DEBUG nova.virt.libvirt.vif [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:53:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1711924286',display_name='tempest-DeleteServersTestJSON-server-1711924286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1711924286',id=59,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-8pf23wx3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:14Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=c64e78b6-87b2-425c-aef9-771bcd042d58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-667619475-network", "vif_mac": "fa:16:3e:36:ab:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.418 186548 DEBUG nova.network.os_vif_util [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-667619475-network", "vif_mac": "fa:16:3e:36:ab:fc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.418 186548 DEBUG nova.network.os_vif_util [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.419 186548 DEBUG os_vif [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.420 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.421 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa038edb6-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.422 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.423 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.425 186548 INFO os_vif [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47')
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.430 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.492 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.493 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.553 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.555 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Copying file /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk to 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:54:25 compute-0 nova_compute[186544]: 2025-11-22 07:54:25.556 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.239 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "scp -r /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk" returned: 0 in 0.684s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.240 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Copying file /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.241 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk.config 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.304 186548 DEBUG nova.compute.manager [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.304 186548 DEBUG oslo_concurrency.lockutils [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.305 186548 DEBUG oslo_concurrency.lockutils [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.305 186548 DEBUG oslo_concurrency.lockutils [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.305 186548 DEBUG nova.compute.manager [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.305 186548 WARNING nova.compute.manager [req-df58b878-2d77-45ad-8d50-5057f120808b req-b9f97595-2751-4fe5-883c-d1bac46a629e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-unplugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state active and task_state resize_migrating.
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.479 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "scp -C -r /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk.config 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.config" returned: 0 in 0.238s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.480 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Copying file /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.480 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk.info 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:26 compute-0 nova_compute[186544]: 2025-11-22 07:54:26.693 186548 DEBUG oslo_concurrency.processutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] CMD "scp -C -r /var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58_resize/disk.info 192.168.122.101:/var/lib/nova/instances/c64e78b6-87b2-425c-aef9-771bcd042d58/disk.info" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:27 compute-0 nova_compute[186544]: 2025-11-22 07:54:27.064 186548 DEBUG neutronclient.v2_0.client [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port a038edb6-47af-4f7e-9f5e-715660b6da32 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 22 07:54:27 compute-0 nova_compute[186544]: 2025-11-22 07:54:27.214 186548 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:27 compute-0 nova_compute[186544]: 2025-11-22 07:54:27.215 186548 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:27 compute-0 nova_compute[186544]: 2025-11-22 07:54:27.215 186548 DEBUG oslo_concurrency.lockutils [None req-50393a40-5c35-4e35-9596-982cb5efe475 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:28 compute-0 podman[223231]: 2025-11-22 07:54:28.435193488 +0000 UTC m=+0.072377174 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, summary=Provides the latest release of 
the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.528 186548 DEBUG nova.compute.manager [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.529 186548 DEBUG oslo_concurrency.lockutils [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.529 186548 DEBUG oslo_concurrency.lockutils [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.529 186548 DEBUG oslo_concurrency.lockutils [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.529 186548 DEBUG nova.compute.manager [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.530 186548 WARNING nova.compute.manager [req-483bab99-e43f-4687-9fb0-2228f6c8c12e req-9a470bed-a0ed-46b0-8740-2261ea456fab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state active and task_state resize_migrated.
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.922 186548 DEBUG nova.compute.manager [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-changed-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.922 186548 DEBUG nova.compute.manager [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Refreshing instance network info cache due to event network-changed-a038edb6-47af-4f7e-9f5e-715660b6da32. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.923 186548 DEBUG oslo_concurrency.lockutils [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.923 186548 DEBUG oslo_concurrency.lockutils [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:28 compute-0 nova_compute[186544]: 2025-11-22 07:54:28.923 186548 DEBUG nova.network.neutron [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Refreshing network info cache for port a038edb6-47af-4f7e-9f5e-715660b6da32 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:54:30 compute-0 nova_compute[186544]: 2025-11-22 07:54:30.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:30 compute-0 nova_compute[186544]: 2025-11-22 07:54:30.422 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:33 compute-0 nova_compute[186544]: 2025-11-22 07:54:33.081 186548 DEBUG nova.network.neutron [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updated VIF entry in instance network info cache for port a038edb6-47af-4f7e-9f5e-715660b6da32. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:54:33 compute-0 nova_compute[186544]: 2025-11-22 07:54:33.081 186548 DEBUG nova.network.neutron [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating instance_info_cache with network_info: [{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:33 compute-0 nova_compute[186544]: 2025-11-22 07:54:33.128 186548 DEBUG oslo_concurrency.lockutils [req-37cdf781-8d53-43b5-bf15-8fa899a10b27 req-dab78e80-0b20-4402-b8ca-e29b0f9a14aa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.424 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.658 186548 DEBUG nova.compute.manager [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.659 186548 DEBUG oslo_concurrency.lockutils [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.659 186548 DEBUG oslo_concurrency.lockutils [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.660 186548 DEBUG oslo_concurrency.lockutils [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.661 186548 DEBUG nova.compute.manager [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.661 186548 WARNING nova.compute.manager [req-bb987b4f-d7b6-4bde-9a8f-9038e3b1f3c4 req-441a9fd7-f188-4017-8780-1af9888cb291 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state resized and task_state None.
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.909 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.910 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.910 186548 DEBUG nova.compute.manager [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Going to confirm migration 11 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Nov 22 07:54:35 compute-0 nova_compute[186544]: 2025-11-22 07:54:35.943 186548 DEBUG nova.objects.instance [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'info_cache' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.594 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c64e78b6-87b2-425c-aef9-771bcd042d58', 'name': 'tempest-DeleteServersTestJSON-server-1711924286', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '6b68db2b61a54aeaa8ac219f44ed3e75', 'user_id': '57077a1511bf46d897beb6fd5eedfa67', 'hostId': 'feccb6543c8ca9717580a87201da3cf318166bceaf2b029bda49e2e0', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.598 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'hostId': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.598 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.599 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.603 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 47f049f6-6113-427d-ba1e-f849dfec302b / tapbb707b30-96 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.603 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa0c7360-4037-4fdb-8b16-3b0d9b3affef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.598982', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7d9eb02e-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': '60cea3f7c195a49f3458f8470a025a7b478d17624ec0e618193ffe0769ee4c9e'}]}, 'timestamp': '2025-11-22 07:54:36.604205', '_unique_id': '86a93b7090fb4cf38e8560ba13d57592'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.606 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.606 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.607 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6959fa6a-6adc-412b-a34e-a1e99d33d7c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.606126', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7d9f2d2e-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': 'ace2770f63e04f92768518de9ee848e7e59ada5b95d8736559605dda7b132889'}]}, 'timestamp': '2025-11-22 07:54:36.607444', '_unique_id': 'b6fd2a017a564db6a2c3ccf5d67dedd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.609 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.609 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-30500190>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-30500190>]
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.610 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.638 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.639 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0081ebf-02da-47a3-9b3b-a40aa0e347c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.610025', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7da40e34-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': '39895b1fe1d217dea7bca87c4a1e6515ad9329f856aa8100d70afa9f2124964b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.610025', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7da41da2-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': 'fa9c47868ac0021158c9751531ffa1877d07ef9dcbd7cfc838b2729a761096ad'}]}, 'timestamp': '2025-11-22 07:54:36.639730', '_unique_id': '2082a33b3dd94746a7e3b413a63f985c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.641 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.641 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.641 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-30500190>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-30500190>]
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.641 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.642 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.657 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/memory.usage volume: 40.4296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '382c1ed3-49bd-4444-b4be-b104e52fecf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4296875, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'timestamp': '2025-11-22T07:54:36.641757', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '7da713ea-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.349632112, 'message_signature': 'e216d4412d02a005e25cfe2c8925a06172b8090d9134f3780cec8fdaee056f69'}]}, 'timestamp': '2025-11-22 07:54:36.659180', '_unique_id': 'aac67b414e414a728e54c0b47290676e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.661 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.662 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.671 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.671 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74d0e43c-a880-4f8b-8f82-d26e03e7f8b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.661368', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7da9070e-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.354293017, 'message_signature': '2c1099f404916f5ebf413d669ff622b999fd4f315966549cbe63b28f88f17ee6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.661368', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7da915d2-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.354293017, 'message_signature': 'd622f4059fb98adafb3d332e9763598f3b7471c02a7d84ab7c0aca14e06a64b1'}]}, 'timestamp': '2025-11-22 07:54:36.672312', '_unique_id': 'cb2c651f77f446748a5850d883877d86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.675 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.675 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6af4b120-e5b7-47a6-90c0-ec89f62f659b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.674479', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7da996ce-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': '6838a0b709ae40a873de126776007d177e6bc6dd018c21315f15832de63dabba'}]}, 'timestamp': '2025-11-22 07:54:36.675570', '_unique_id': '6703f1ff80ca4166a31ea1338cd07888'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.678 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.678 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '735ed402-b903-439d-bd4c-4d8690d3a1dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.677536', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7daa139c-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': '2214c4ee9324ada8cf6752680fefd8c7b4af0497b244385e722bf257b6826f1c'}]}, 'timestamp': '2025-11-22 07:54:36.678850', '_unique_id': 'c786378ca793467cbfe27597c1b13305'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.681 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.681 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.682 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c5fad38-2a7b-4b0b-bb0f-5a0b4599db5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.681050', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7daa9b64-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': 'a9a67fed0890930f9c39a7b7912758948f8fd8d23c071c36ec06e58877ce2b83'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.681050', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7daaa834-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': '5896f888af54a7b8340cc549d00d1d51088b90d13ce5dd32b63ce0a60842e962'}]}, 'timestamp': '2025-11-22 07:54:36.682557', '_unique_id': '6247d94e811845e3a2ecf6b998d31539'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.684 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.684 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a58aad38-b03d-4bba-8ed4-5b69bd617c12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.684237', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7dab0cac-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': 'd276fac51d49c1a27e7fe0b90a617d767f1786ca7e541bd07d9f28f6b118cb63'}]}, 'timestamp': '2025-11-22 07:54:36.685166', '_unique_id': '99d028d7336c4bfaad5fc8f979649ffb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.687 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.688 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.688 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bfd4612-e080-4189-9fed-1f3c369cd89c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.687207', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dab8a6a-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.354293017, 'message_signature': 'ddf6c37c1ac94e7c854d99f4a87dbc36d17794940303701bc4693e9a217a7028'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.687207', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dab96ea-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.354293017, 'message_signature': 'c86b445bd3970fa2aae3d1608bc8ab5552a2c5d60dbe832f5dd58f627ed1f4b1'}]}, 'timestamp': '2025-11-22 07:54:36.688666', '_unique_id': '06d02e72e047416ab9b10b2ff9ef9d06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.690 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.691 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.read.latency volume: 848930079 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.691 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.read.latency volume: 758419 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68edf4a2-f172-47bf-a816-0d2c7e48de17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 848930079, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.690444', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dabfd1a-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': 'fcb189e634e0aa2d273cc3b89392c7376be360e144d6738ac47c7ececffd5c4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 758419, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.690444', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dac07ec-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': 'b4d2acb921440d498e2e7730496bb47a4ad08dfbc51be29a57b4dbc877912697'}]}, 'timestamp': '2025-11-22 07:54:36.691556', '_unique_id': 'decd1f5fa07942df8decc9f8cae3f571'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.693 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.693 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.693 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1740cc2-055c-4697-82bc-908d6adfdc84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.693182', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7dac6868-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': '289a698f76286ea4c14c80c06b31ddc9cf498782b6a957d769912e149d56fc74'}]}, 'timestamp': '2025-11-22 07:54:36.694074', '_unique_id': 'faf94863881d4ca1a5bffe4f833664c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.695 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.696 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.696 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5816e3e1-b8bf-4e9a-b9d0-0d90ccf5b69d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.695660', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7dacc9c0-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': '29151689e01e7204cc6ed87f1910608ff2a2bc8f53e1b7e512ad30a0c259d890'}]}, 'timestamp': '2025-11-22 07:54:36.696541', '_unique_id': '47a33396d9ce4a269c880e98670d1ead'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.698 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.698 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.699 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.read.bytes volume: 2208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab3bbe54-34e7-4204-8f49-eae3403c56b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.698048', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dad341e-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': '4e40205b792078178cbf98c6673f1f90a81490172f1d18b6721f6de4d668971c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2208, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.698048', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dad404e-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': '3d7968634195df709560e659e251d8921e3bdff7e65cd56d4acdec5ad2a32057'}]}, 'timestamp': '2025-11-22 07:54:36.699553', '_unique_id': '5a352ec096184f41829475186f16c83f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.701 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.702 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50873a9c-9c87-4584-8629-f730d3cbe16e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.701314', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7dadad0e-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': '68711cf01e9ece1b6a96936c35b3e5baba6de89fdfa99d4317b4cf10d1f350e1'}]}, 'timestamp': '2025-11-22 07:54:36.702419', '_unique_id': 'bac8f693582a417984918528602f0a0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.704 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.705 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.705 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.705 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '311073fb-bf63-4027-a380-21d6c892db3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.704554', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7dae2842-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.354293017, 'message_signature': '25783a977523e6145dab563fe0d99c79bd54ac41f5c0bdd9af6152ef7cd229ac'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.704554', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7dae33e6-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.354293017, 'message_signature': '299d530e0b9872675f985eac19e268b0d2b030a9c3c571ca071898d676b39b9b'}]}, 'timestamp': '2025-11-22 07:54:36.705794', '_unique_id': '7d58217cabfc44089103ed9b63f495fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.707 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.708 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.708 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ffc5647-3491-4c3c-94b1-69de3c4540fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.707907', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7daeb032-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': '85c5cca4cf85d0b4187971eff79311802e39dbd11c2084dab72737a5476cdeda'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 
'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.707907', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7daebc26-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': '4c2fcf3ca85c6939e4d85a09db6ae1ba17eec7dada8f1b7047bcb2ad88985ba3'}]}, 'timestamp': '2025-11-22 07:54:36.709299', '_unique_id': '4f4ed64034044bd1884e28b85f6998bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.710 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.711 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.711 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/cpu volume: 12130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a44c9c15-04f4-4d2d-b49a-9229d76459bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12130000000, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'timestamp': '2025-11-22T07:54:36.711080', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7daf2d3c-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.349632112, 'message_signature': '97d659e954c52016de459d5425b841fb3793ac86e383545d4570678626696b3f'}]}, 'timestamp': '2025-11-22 07:54:36.712218', '_unique_id': 'fc590041801e4e0384897b5f29f91f27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.712 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.713 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.714 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.714 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '909cbfff-6c96-4995-9aac-466ff668cc34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.713918', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7daf9416-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': '51ca51a28b7c43193d036002666a60e09d26a9f12bf65b268acaf2deb7226489'}]}, 'timestamp': '2025-11-22 07:54:36.714818', '_unique_id': '226b05810bcb442fad59506a2ee72f17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.716 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.716 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.716 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-30500190>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-30500190>]
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.716 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.717 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.717 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.717 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/disk.device.read.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bc20d71-8e0c-462e-b47e-daaeb3054d89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-vda', 'timestamp': '2025-11-22T07:54:36.716806', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7db00590-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': 'ebf49c5bb42d9c233b260f423ce0021cb8bdf0fa686712045b5e572099cbd690'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': '47f049f6-6113-427d-ba1e-f849dfec302b-sda', 'timestamp': '2025-11-22T07:54:36.716806', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'instance-0000003e', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7db00f7c-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.302697754, 'message_signature': 'bdea562657bed399e8617f3d69b51c01b27cb4a4365bed5ab2134db345e03eb7'}]}, 'timestamp': '2025-11-22 07:54:36.717966', '_unique_id': '10a835a1e8094af58a23ad2e531b67ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.719 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.719 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-30500190>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1711924286>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-30500190>]
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.720 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.720 12 DEBUG ceilometer.compute.pollsters [-] Instance c64e78b6-87b2-425c-aef9-771bcd042d58 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000003b, id=c64e78b6-87b2-425c-aef9-771bcd042d58>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.720 12 DEBUG ceilometer.compute.pollsters [-] 47f049f6-6113-427d-ba1e-f849dfec302b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22f17d63-49e3-4b00-9803-27a2eaa75852', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6d9b8aa760ed4afdbf24f9deb5d29190', 'user_name': None, 'project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'project_name': None, 'resource_id': 'instance-0000003e-47f049f6-6113-427d-ba1e-f849dfec302b-tapbb707b30-96', 'timestamp': '2025-11-22T07:54:36.720150', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-30500190', 'name': 'tapbb707b30-96', 'instance_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'instance_type': 'm1.nano', 'host': '90b6817607013d66fc479ffb9172d010da052d8746b318ed99bddedd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ee:30:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb707b30-96'}, 'message_id': '7db087e0-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4817.291855368, 'message_signature': '730be4512882ba6a1a92e89160d7ebdc3713a3a9f32543bda78a780326873d64'}]}, 'timestamp': '2025-11-22 07:54:36.721097', '_unique_id': 'a0fb942302c745c59d98c3432c258c0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:54:36.721 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:54:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:37.322 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:37.323 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:37.323 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:37 compute-0 nova_compute[186544]: 2025-11-22 07:54:37.665 186548 DEBUG neutronclient.v2_0.client [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port a038edb6-47af-4f7e-9f5e-715660b6da32 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 22 07:54:37 compute-0 nova_compute[186544]: 2025-11-22 07:54:37.665 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:37 compute-0 nova_compute[186544]: 2025-11-22 07:54:37.666 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquired lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:37 compute-0 nova_compute[186544]: 2025-11-22 07:54:37.666 186548 DEBUG nova.network.neutron [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:54:38 compute-0 podman[223265]: 2025-11-22 07:54:38.409895545 +0000 UTC m=+0.056028853 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 22 07:54:38 compute-0 podman[223266]: 2025-11-22 07:54:38.438141017 +0000 UTC m=+0.081331253 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 07:54:39 compute-0 ovn_controller[94843]: 2025-11-22T07:54:39Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:30:e0 10.100.0.4
Nov 22 07:54:39 compute-0 ovn_controller[94843]: 2025-11-22T07:54:39Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:30:e0 10.100.0.4
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.927 186548 DEBUG nova.network.neutron [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Updating instance_info_cache with network_info: [{"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.931 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798064.9300973, c64e78b6-87b2-425c-aef9-771bcd042d58 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.932 186548 INFO nova.compute.manager [-] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] VM Stopped (Lifecycle Event)
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.946 186548 DEBUG nova.compute.manager [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.946 186548 DEBUG oslo_concurrency.lockutils [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.947 186548 DEBUG oslo_concurrency.lockutils [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.947 186548 DEBUG oslo_concurrency.lockutils [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.947 186548 DEBUG nova.compute.manager [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] No waiting events found dispatching network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.948 186548 WARNING nova.compute.manager [req-1cd56adc-fe46-41e8-8c74-987026b94b12 req-223a4628-1bc6-4f22-8044-17ed671c2c77 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Received unexpected event network-vif-plugged-a038edb6-47af-4f7e-9f5e-715660b6da32 for instance with vm_state resized and task_state deleting.
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.969 186548 DEBUG nova.compute.manager [None req-3fc7656f-3511-465b-ac95-f03c6f48aab1 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.972 186548 DEBUG nova.compute.manager [None req-3fc7656f-3511-465b-ac95-f03c6f48aab1 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.977 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Releasing lock "refresh_cache-c64e78b6-87b2-425c-aef9-771bcd042d58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:39 compute-0 nova_compute[186544]: 2025-11-22 07:54:39.977 186548 DEBUG nova.objects.instance [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lazy-loading 'migration_context' on Instance uuid c64e78b6-87b2-425c-aef9-771bcd042d58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.007 186548 INFO nova.compute.manager [None req-3fc7656f-3511-465b-ac95-f03c6f48aab1 - - - - - -] [instance: c64e78b6-87b2-425c-aef9-771bcd042d58] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.013 186548 DEBUG nova.virt.libvirt.vif [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:53:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1711924286',display_name='tempest-DeleteServersTestJSON-server-1711924286',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1711924286',id=59,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6b68db2b61a54aeaa8ac219f44ed3e75',ramdisk_id='',reservation_id='r-8pf23wx3',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-550712359',owner_user_name='tempest-DeleteServersTestJSON-550712359-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:35Z,user_data=None,user_id='57077a1511bf46d897beb6fd5eedfa67',uuid=c64e78b6-87b2-425c-aef9-771bcd042d58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.013 186548 DEBUG nova.network.os_vif_util [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converting VIF {"id": "a038edb6-47af-4f7e-9f5e-715660b6da32", "address": "fa:16:3e:36:ab:fc", "network": {"id": "5e910dbb-27d1-4915-8b74-d0538d33c33c", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-667619475-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6b68db2b61a54aeaa8ac219f44ed3e75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa038edb6-47", "ovs_interfaceid": "a038edb6-47af-4f7e-9f5e-715660b6da32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.014 186548 DEBUG nova.network.os_vif_util [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.014 186548 DEBUG os_vif [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.016 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.016 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa038edb6-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.017 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.019 186548 INFO os_vif [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:ab:fc,bridge_name='br-int',has_traffic_filtering=True,id=a038edb6-47af-4f7e-9f5e-715660b6da32,network=Network(5e910dbb-27d1-4915-8b74-d0538d33c33c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa038edb6-47')
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.020 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.020 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.150 186548 DEBUG nova.compute.provider_tree [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.179 186548 DEBUG nova.scheduler.client.report [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.252 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.390 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.425 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.481 186548 INFO nova.scheduler.client.report [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Deleted allocation for migration fe6cc4ba-d1c6-416a-a067-c3816513554b
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.535 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.535 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.536 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.536 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:40 compute-0 nova_compute[186544]: 2025-11-22 07:54:40.575 186548 DEBUG oslo_concurrency.lockutils [None req-a687e413-2d03-4fff-83c3-fe61f93b477f 57077a1511bf46d897beb6fd5eedfa67 6b68db2b61a54aeaa8ac219f44ed3e75 - - default default] Lock "c64e78b6-87b2-425c-aef9-771bcd042d58" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.006 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Updating instance_info_cache with network_info: [{"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.031 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.031 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.031 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.032 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.032 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.032 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.057 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.058 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.058 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.058 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:54:44 compute-0 podman[223311]: 2025-11-22 07:54:44.156325218 +0000 UTC m=+0.053387738 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.167 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.223 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.224 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.277 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.429 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.430 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=73.32036590576172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.430 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.430 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.539 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 47f049f6-6113-427d-ba1e-f849dfec302b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.540 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.540 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.619 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.643 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.670 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.670 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:44 compute-0 nova_compute[186544]: 2025-11-22 07:54:44.802 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:45 compute-0 nova_compute[186544]: 2025-11-22 07:54:45.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:45 compute-0 nova_compute[186544]: 2025-11-22 07:54:45.393 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:45 compute-0 nova_compute[186544]: 2025-11-22 07:54:45.427 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:46 compute-0 podman[223342]: 2025-11-22 07:54:46.420906395 +0000 UTC m=+0.067240538 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:54:46 compute-0 nova_compute[186544]: 2025-11-22 07:54:46.939 186548 DEBUG oslo_concurrency.lockutils [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:46 compute-0 nova_compute[186544]: 2025-11-22 07:54:46.940 186548 DEBUG oslo_concurrency.lockutils [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:46 compute-0 nova_compute[186544]: 2025-11-22 07:54:46.940 186548 DEBUG nova.compute.manager [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:46 compute-0 nova_compute[186544]: 2025-11-22 07:54:46.943 186548 DEBUG nova.compute.manager [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Nov 22 07:54:46 compute-0 nova_compute[186544]: 2025-11-22 07:54:46.943 186548 DEBUG nova.objects.instance [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'flavor' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:46 compute-0 nova_compute[186544]: 2025-11-22 07:54:46.974 186548 DEBUG nova.objects.instance [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'info_cache' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:47 compute-0 nova_compute[186544]: 2025-11-22 07:54:47.007 186548 DEBUG nova.virt.libvirt.driver [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:54:49 compute-0 kernel: tapbb707b30-96 (unregistering): left promiscuous mode
Nov 22 07:54:49 compute-0 NetworkManager[55036]: <info>  [1763798089.1732] device (tapbb707b30-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:54:49 compute-0 ovn_controller[94843]: 2025-11-22T07:54:49Z|00247|binding|INFO|Releasing lport bb707b30-96a3-4e7c-abad-b85fc5a10938 from this chassis (sb_readonly=0)
Nov 22 07:54:49 compute-0 ovn_controller[94843]: 2025-11-22T07:54:49Z|00248|binding|INFO|Setting lport bb707b30-96a3-4e7c-abad-b85fc5a10938 down in Southbound
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.181 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:49 compute-0 ovn_controller[94843]: 2025-11-22T07:54:49Z|00249|binding|INFO|Removing iface tapbb707b30-96 ovn-installed in OVS
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.198 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:30:e0 10.100.0.4'], port_security=['fa:16:3e:ee:30:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd63e957-ae08-4ca1-9eb9-8ce253173257', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13b92379-ae34-491c-b971-1757bc6e8c79, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=bb707b30-96a3-4e7c-abad-b85fc5a10938) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.199 103805 INFO neutron.agent.ovn.metadata.agent [-] Port bb707b30-96a3-4e7c-abad-b85fc5a10938 in datapath 62930ff4-55a3-4e08-8229-5532aa7dcaed unbound from our chassis
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.201 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62930ff4-55a3-4e08-8229-5532aa7dcaed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.202 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.202 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[01b058e4-e420-4914-b3fd-2bd5026e2773]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.203 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed namespace which is not needed anymore
Nov 22 07:54:49 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Nov 22 07:54:49 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000003e.scope: Consumed 15.404s CPU time.
Nov 22 07:54:49 compute-0 systemd-machined[152872]: Machine qemu-30-instance-0000003e terminated.
Nov 22 07:54:49 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223093]: [NOTICE]   (223097) : haproxy version is 2.8.14-c23fe91
Nov 22 07:54:49 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223093]: [NOTICE]   (223097) : path to executable is /usr/sbin/haproxy
Nov 22 07:54:49 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223093]: [WARNING]  (223097) : Exiting Master process...
Nov 22 07:54:49 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223093]: [ALERT]    (223097) : Current worker (223099) exited with code 143 (Terminated)
Nov 22 07:54:49 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223093]: [WARNING]  (223097) : All workers exited. Exiting... (0)
Nov 22 07:54:49 compute-0 systemd[1]: libpod-48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91.scope: Deactivated successfully.
Nov 22 07:54:49 compute-0 podman[223385]: 2025-11-22 07:54:49.335241793 +0000 UTC m=+0.047530865 container died 48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 07:54:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91-userdata-shm.mount: Deactivated successfully.
Nov 22 07:54:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-a49b81ccd000c9683b17101de0d626a3787b59eaabe4be7512774e72966087c3-merged.mount: Deactivated successfully.
Nov 22 07:54:49 compute-0 podman[223385]: 2025-11-22 07:54:49.395070337 +0000 UTC m=+0.107359419 container cleanup 48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:54:49 compute-0 systemd[1]: libpod-conmon-48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91.scope: Deactivated successfully.
Nov 22 07:54:49 compute-0 podman[223418]: 2025-11-22 07:54:49.464652832 +0000 UTC m=+0.050730614 container remove 48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.470 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4dfda9-1754-4ccc-8373-6956ef92cbb8]: (4, ('Sat Nov 22 07:54:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed (48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91)\n48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91\nSat Nov 22 07:54:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed (48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91)\n48f701d6cc65f59300c9b7eeff87e6652a391bd64c4e882c6f8d8e3cb9703a91\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.473 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0104c9d2-cce2-4401-87fd-286bfec6b152]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.473 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62930ff4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:49 compute-0 kernel: tap62930ff4-50: left promiscuous mode
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.476 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.490 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.493 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec3b561-051d-47b2-90dc-f2bbc2a723bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.505 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[22ebbe8c-05a1-4b3c-b8f4-04fefd8f4f1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.507 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d5f07c-a45f-4761-9f42-d2bfdc92b8f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.520 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8abd6079-0c82-4a99-8d1a-7470d336197a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480185, 'reachable_time': 34630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223449, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d62930ff4\x2d55a3\x2d4e08\x2d8229\x2d5532aa7dcaed.mount: Deactivated successfully.
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.524 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:54:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:49.524 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[d5416c12-e965-4d73-b654-235e86027506]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.926 186548 DEBUG nova.compute.manager [req-0bc32194-34a8-4b1d-b07c-ecd9da24ff17 req-d01d4a60-ac29-4b26-97eb-a5adc5fc4e2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-unplugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.926 186548 DEBUG oslo_concurrency.lockutils [req-0bc32194-34a8-4b1d-b07c-ecd9da24ff17 req-d01d4a60-ac29-4b26-97eb-a5adc5fc4e2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.927 186548 DEBUG oslo_concurrency.lockutils [req-0bc32194-34a8-4b1d-b07c-ecd9da24ff17 req-d01d4a60-ac29-4b26-97eb-a5adc5fc4e2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.927 186548 DEBUG oslo_concurrency.lockutils [req-0bc32194-34a8-4b1d-b07c-ecd9da24ff17 req-d01d4a60-ac29-4b26-97eb-a5adc5fc4e2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.927 186548 DEBUG nova.compute.manager [req-0bc32194-34a8-4b1d-b07c-ecd9da24ff17 req-d01d4a60-ac29-4b26-97eb-a5adc5fc4e2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] No waiting events found dispatching network-vif-unplugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:49 compute-0 nova_compute[186544]: 2025-11-22 07:54:49.927 186548 WARNING nova.compute.manager [req-0bc32194-34a8-4b1d-b07c-ecd9da24ff17 req-d01d4a60-ac29-4b26-97eb-a5adc5fc4e2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received unexpected event network-vif-unplugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 for instance with vm_state active and task_state powering-off.
Nov 22 07:54:50 compute-0 nova_compute[186544]: 2025-11-22 07:54:50.023 186548 INFO nova.virt.libvirt.driver [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance shutdown successfully after 3 seconds.
Nov 22 07:54:50 compute-0 nova_compute[186544]: 2025-11-22 07:54:50.029 186548 INFO nova.virt.libvirt.driver [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance destroyed successfully.
Nov 22 07:54:50 compute-0 nova_compute[186544]: 2025-11-22 07:54:50.029 186548 DEBUG nova.objects.instance [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'numa_topology' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:50 compute-0 nova_compute[186544]: 2025-11-22 07:54:50.044 186548 DEBUG nova.compute.manager [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:50 compute-0 nova_compute[186544]: 2025-11-22 07:54:50.192 186548 DEBUG oslo_concurrency.lockutils [None req-bcab36c6-8929-481a-9892-e8cc06a3a698 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:50 compute-0 nova_compute[186544]: 2025-11-22 07:54:50.395 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:50 compute-0 nova_compute[186544]: 2025-11-22 07:54:50.429 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:51 compute-0 podman[223450]: 2025-11-22 07:54:51.443247784 +0000 UTC m=+0.091421229 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.097 186548 DEBUG nova.compute.manager [req-c7c82f19-4cf8-4da2-9d38-cdc1200a3f48 req-b19d57ad-9faf-4023-9a34-2fe36e8653fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.097 186548 DEBUG oslo_concurrency.lockutils [req-c7c82f19-4cf8-4da2-9d38-cdc1200a3f48 req-b19d57ad-9faf-4023-9a34-2fe36e8653fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.098 186548 DEBUG oslo_concurrency.lockutils [req-c7c82f19-4cf8-4da2-9d38-cdc1200a3f48 req-b19d57ad-9faf-4023-9a34-2fe36e8653fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.098 186548 DEBUG oslo_concurrency.lockutils [req-c7c82f19-4cf8-4da2-9d38-cdc1200a3f48 req-b19d57ad-9faf-4023-9a34-2fe36e8653fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.098 186548 DEBUG nova.compute.manager [req-c7c82f19-4cf8-4da2-9d38-cdc1200a3f48 req-b19d57ad-9faf-4023-9a34-2fe36e8653fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] No waiting events found dispatching network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.098 186548 WARNING nova.compute.manager [req-c7c82f19-4cf8-4da2-9d38-cdc1200a3f48 req-b19d57ad-9faf-4023-9a34-2fe36e8653fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received unexpected event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 for instance with vm_state stopped and task_state None.
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.841 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.842 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.862 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.902 186548 DEBUG nova.objects.instance [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'flavor' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.942 186548 DEBUG nova.objects.instance [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'info_cache' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.993 186548 DEBUG oslo_concurrency.lockutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.994 186548 DEBUG oslo_concurrency.lockutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquired lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:52 compute-0 nova_compute[186544]: 2025-11-22 07:54:52.994 186548 DEBUG nova.network.neutron [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.069 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.070 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.078 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.079 186548 INFO nova.compute.claims [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.362 186548 DEBUG nova.compute.provider_tree [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.394 186548 DEBUG nova.scheduler.client.report [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.442 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.442 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.520 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.521 186548 DEBUG nova.network.neutron [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.544 186548 INFO nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.571 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.723 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.724 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.725 186548 INFO nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Creating image(s)
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.725 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.725 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.726 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.739 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.792 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.794 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.794 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.809 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.864 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.865 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.902 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.903 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.903 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.927 186548 DEBUG nova.policy [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e24c302b62fb470aa189b76d4676733b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '063bf16c91af408ca075c690797e09d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.966 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.967 186548 DEBUG nova.virt.disk.api [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:54:53 compute-0 nova_compute[186544]: 2025-11-22 07:54:53.968 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.034 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.035 186548 DEBUG nova.virt.disk.api [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.036 186548 DEBUG nova.objects.instance [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.051 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.052 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Ensure instance console log exists: /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.052 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.052 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.053 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:54 compute-0 nova_compute[186544]: 2025-11-22 07:54:54.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:55 compute-0 podman[223486]: 2025-11-22 07:54:55.394243799 +0000 UTC m=+0.046696854 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 07:54:55 compute-0 nova_compute[186544]: 2025-11-22 07:54:55.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:55 compute-0 nova_compute[186544]: 2025-11-22 07:54:55.431 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:55.883 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:54:55 compute-0 nova_compute[186544]: 2025-11-22 07:54:55.884 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:55.884 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:54:56 compute-0 nova_compute[186544]: 2025-11-22 07:54:56.523 186548 DEBUG nova.network.neutron [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Successfully created port: cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.164 186548 DEBUG nova.network.neutron [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Updating instance_info_cache with network_info: [{"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.190 186548 DEBUG oslo_concurrency.lockutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Releasing lock "refresh_cache-47f049f6-6113-427d-ba1e-f849dfec302b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.246 186548 INFO nova.virt.libvirt.driver [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance destroyed successfully.
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.246 186548 DEBUG nova.objects.instance [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'numa_topology' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.269 186548 DEBUG nova.objects.instance [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'resources' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.282 186548 DEBUG nova.virt.libvirt.vif [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-30500190',display_name='tempest-ListServerFiltersTestJSON-instance-30500190',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-30500190',id=62,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ykmudqwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:50Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=47f049f6-6113-427d-ba1e-f849dfec302b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.283 186548 DEBUG nova.network.os_vif_util [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.284 186548 DEBUG nova.network.os_vif_util [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.284 186548 DEBUG os_vif [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.286 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.286 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb707b30-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.288 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.289 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.291 186548 INFO os_vif [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96')
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.299 186548 DEBUG nova.virt.libvirt.driver [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Start _get_guest_xml network_info=[{"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.304 186548 WARNING nova.virt.libvirt.driver [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.312 186548 DEBUG nova.virt.libvirt.host [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.312 186548 DEBUG nova.virt.libvirt.host [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.318 186548 DEBUG nova.virt.libvirt.host [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.318 186548 DEBUG nova.virt.libvirt.host [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.320 186548 DEBUG nova.virt.libvirt.driver [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.320 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.320 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.321 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.321 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.321 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.321 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.322 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.322 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.322 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.322 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.323 186548 DEBUG nova.virt.hardware [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.323 186548 DEBUG nova.objects.instance [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'vcpu_model' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.341 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.402 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.403 186548 DEBUG oslo_concurrency.lockutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.404 186548 DEBUG oslo_concurrency.lockutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.405 186548 DEBUG oslo_concurrency.lockutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.406 186548 DEBUG nova.virt.libvirt.vif [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-30500190',display_name='tempest-ListServerFiltersTestJSON-instance-30500190',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-30500190',id=62,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ykmudqwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:50Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=47f049f6-6113-427d-ba1e-f849dfec302b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.406 186548 DEBUG nova.network.os_vif_util [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.407 186548 DEBUG nova.network.os_vif_util [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.408 186548 DEBUG nova.objects.instance [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.439 186548 DEBUG nova.virt.libvirt.driver [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <uuid>47f049f6-6113-427d-ba1e-f849dfec302b</uuid>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <name>instance-0000003e</name>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-30500190</nova:name>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:54:57</nova:creationTime>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:54:57 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:54:57 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:54:57 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:54:57 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:54:57 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:54:57 compute-0 nova_compute[186544]:         <nova:user uuid="6d9b8aa760ed4afdbf24f9deb5d29190">tempest-ListServerFiltersTestJSON-1217253496-project-member</nova:user>
Nov 22 07:54:57 compute-0 nova_compute[186544]:         <nova:project uuid="b4ca2b2e65ac4bf8b3d14f3310a3a7bf">tempest-ListServerFiltersTestJSON-1217253496</nova:project>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:54:57 compute-0 nova_compute[186544]:         <nova:port uuid="bb707b30-96a3-4e7c-abad-b85fc5a10938">
Nov 22 07:54:57 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <system>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <entry name="serial">47f049f6-6113-427d-ba1e-f849dfec302b</entry>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <entry name="uuid">47f049f6-6113-427d-ba1e-f849dfec302b</entry>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </system>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <os>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   </os>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <features>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   </features>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk.config"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:ee:30:e0"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <target dev="tapbb707b30-96"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/console.log" append="off"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <video>
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </video>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:54:57 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:54:57 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:54:57 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:54:57 compute-0 nova_compute[186544]: </domain>
Nov 22 07:54:57 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.441 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.499 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.500 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.558 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.560 186548 DEBUG nova.objects.instance [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.573 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.631 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.632 186548 DEBUG nova.virt.disk.api [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Checking if we can resize image /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.633 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.696 186548 DEBUG oslo_concurrency.processutils [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.697 186548 DEBUG nova.virt.disk.api [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Cannot resize image /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.698 186548 DEBUG nova.objects.instance [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'migration_context' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.713 186548 DEBUG nova.virt.libvirt.vif [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-30500190',display_name='tempest-ListServerFiltersTestJSON-instance-30500190',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-30500190',id=62,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ykmudqwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:50Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=47f049f6-6113-427d-ba1e-f849dfec302b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.713 186548 DEBUG nova.network.os_vif_util [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.714 186548 DEBUG nova.network.os_vif_util [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.715 186548 DEBUG os_vif [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.715 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.716 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.716 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.719 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.720 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb707b30-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.721 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb707b30-96, col_values=(('external_ids', {'iface-id': 'bb707b30-96a3-4e7c-abad-b85fc5a10938', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:30:e0', 'vm-uuid': '47f049f6-6113-427d-ba1e-f849dfec302b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.723 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 NetworkManager[55036]: <info>  [1763798097.7239] manager: (tapbb707b30-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.726 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.729 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.730 186548 INFO os_vif [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96')
Nov 22 07:54:57 compute-0 kernel: tapbb707b30-96: entered promiscuous mode
Nov 22 07:54:57 compute-0 NetworkManager[55036]: <info>  [1763798097.8011] manager: (tapbb707b30-96): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Nov 22 07:54:57 compute-0 ovn_controller[94843]: 2025-11-22T07:54:57Z|00250|binding|INFO|Claiming lport bb707b30-96a3-4e7c-abad-b85fc5a10938 for this chassis.
Nov 22 07:54:57 compute-0 ovn_controller[94843]: 2025-11-22T07:54:57Z|00251|binding|INFO|bb707b30-96a3-4e7c-abad-b85fc5a10938: Claiming fa:16:3e:ee:30:e0 10.100.0.4
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.802 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.816 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:30:e0 10.100.0.4'], port_security=['fa:16:3e:ee:30:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'cd63e957-ae08-4ca1-9eb9-8ce253173257', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13b92379-ae34-491c-b971-1757bc6e8c79, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=bb707b30-96a3-4e7c-abad-b85fc5a10938) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.817 103805 INFO neutron.agent.ovn.metadata.agent [-] Port bb707b30-96a3-4e7c-abad-b85fc5a10938 in datapath 62930ff4-55a3-4e08-8229-5532aa7dcaed bound to our chassis
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.818 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.828 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[319315c3-a8fe-4b78-bfb6-5c3cec331537]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.829 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62930ff4-51 in ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.830 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62930ff4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.830 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ef54535c-c907-42d4-a367-8f708a6264ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.831 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd3e6a3-5922-4633-813a-96ecfaa18c86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 systemd-udevd[223542]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:54:57 compute-0 ovn_controller[94843]: 2025-11-22T07:54:57Z|00252|binding|INFO|Setting lport bb707b30-96a3-4e7c-abad-b85fc5a10938 ovn-installed in OVS
Nov 22 07:54:57 compute-0 ovn_controller[94843]: 2025-11-22T07:54:57Z|00253|binding|INFO|Setting lport bb707b30-96a3-4e7c-abad-b85fc5a10938 up in Southbound
Nov 22 07:54:57 compute-0 nova_compute[186544]: 2025-11-22 07:54:57.843 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.841 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[32965ea9-6920-4f15-9504-5e5e10d102d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 systemd-machined[152872]: New machine qemu-31-instance-0000003e.
Nov 22 07:54:57 compute-0 NetworkManager[55036]: <info>  [1763798097.8524] device (tapbb707b30-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:54:57 compute-0 NetworkManager[55036]: <info>  [1763798097.8535] device (tapbb707b30-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:54:57 compute-0 systemd[1]: Started Virtual Machine qemu-31-instance-0000003e.
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.860 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f1be48-47cf-41c0-8572-7590aeec155b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.886 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.890 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9c9ae5f5-5d78-464e-9608-be60574f0236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.895 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d3b82f-deaf-407b-b4da-8b9f13866026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 NetworkManager[55036]: <info>  [1763798097.8974] manager: (tap62930ff4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.927 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d03d11bb-50a5-42be-9c37-0d176f7f1e8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.930 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff28c11-fb3b-442a-b2b7-8d91eaccbc80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 NetworkManager[55036]: <info>  [1763798097.9582] device (tap62930ff4-50): carrier: link connected
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.964 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[32ea667e-142e-433f-a409-4a44782bc38e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.980 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d788db57-cce0-471d-a0a7-5df6dfa50af2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62930ff4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:07:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483859, 'reachable_time': 42312, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223575, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:57.994 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a1009d59-35ec-45bd-9765-7a687ed198b6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:714'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 483859, 'tstamp': 483859}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223576, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.009 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[06ee962f-d2f8-40c4-ac9a-dd2123c231cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62930ff4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:07:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483859, 'reachable_time': 42312, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223577, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.040 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bd5a2005-ed58-4ec8-9750-d9bd4f4febf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.098 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e52af613-98ff-4f5e-93da-7c0640b0504a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.100 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62930ff4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.100 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.101 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62930ff4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.104 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:58 compute-0 NetworkManager[55036]: <info>  [1763798098.1049] manager: (tap62930ff4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Nov 22 07:54:58 compute-0 kernel: tap62930ff4-50: entered promiscuous mode
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.107 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.108 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62930ff4-50, col_values=(('external_ids', {'iface-id': '02324e7a-c5bf-443b-a6e3-5a1cdac9fee4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.109 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:58 compute-0 ovn_controller[94843]: 2025-11-22T07:54:58Z|00254|binding|INFO|Releasing lport 02324e7a-c5bf-443b-a6e3-5a1cdac9fee4 from this chassis (sb_readonly=0)
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.123 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.124 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.125 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d68d38fb-a3de-424d-9be3-afe5526fc5c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.126 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/62930ff4-55a3-4e08-8229-5532aa7dcaed.pid.haproxy
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 62930ff4-55a3-4e08-8229-5532aa7dcaed
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:54:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:54:58.126 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'env', 'PROCESS_TAG=haproxy-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62930ff4-55a3-4e08-8229-5532aa7dcaed.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.380 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 47f049f6-6113-427d-ba1e-f849dfec302b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.381 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798098.380437, 47f049f6-6113-427d-ba1e-f849dfec302b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.381 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] VM Resumed (Lifecycle Event)
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.383 186548 DEBUG nova.compute.manager [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.386 186548 INFO nova.virt.libvirt.driver [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance rebooted successfully.
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.387 186548 DEBUG nova.compute.manager [None req-487bb8fb-8f3d-4389-9b3f-8d3ee4d79bba 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.433 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.439 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.457 186548 DEBUG nova.compute.manager [req-f7e553d6-9bcd-4193-a5b9-e80781931ac8 req-022a1b83-2921-4759-9059-a5a410897c11 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.457 186548 DEBUG oslo_concurrency.lockutils [req-f7e553d6-9bcd-4193-a5b9-e80781931ac8 req-022a1b83-2921-4759-9059-a5a410897c11 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.458 186548 DEBUG oslo_concurrency.lockutils [req-f7e553d6-9bcd-4193-a5b9-e80781931ac8 req-022a1b83-2921-4759-9059-a5a410897c11 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.458 186548 DEBUG oslo_concurrency.lockutils [req-f7e553d6-9bcd-4193-a5b9-e80781931ac8 req-022a1b83-2921-4759-9059-a5a410897c11 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.458 186548 DEBUG nova.compute.manager [req-f7e553d6-9bcd-4193-a5b9-e80781931ac8 req-022a1b83-2921-4759-9059-a5a410897c11 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] No waiting events found dispatching network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.459 186548 WARNING nova.compute.manager [req-f7e553d6-9bcd-4193-a5b9-e80781931ac8 req-022a1b83-2921-4759-9059-a5a410897c11 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received unexpected event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 for instance with vm_state stopped and task_state powering-on.
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.462 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] During sync_power_state the instance has a pending task (powering-on). Skip.
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.462 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798098.381695, 47f049f6-6113-427d-ba1e-f849dfec302b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.462 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] VM Started (Lifecycle Event)
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.503 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.507 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:54:58 compute-0 podman[223616]: 2025-11-22 07:54:58.551917656 +0000 UTC m=+0.066121020 container create ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.575 186548 DEBUG nova.network.neutron [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Successfully updated port: cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:54:58 compute-0 systemd[1]: Started libpod-conmon-ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0.scope.
Nov 22 07:54:58 compute-0 podman[223616]: 2025-11-22 07:54:58.512334607 +0000 UTC m=+0.026538001 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:54:58 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:54:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db02a6885eaf998ba3c409a9db615ea823d4d24beff70c671b5276cac3214b14/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:54:58 compute-0 podman[223616]: 2025-11-22 07:54:58.639836229 +0000 UTC m=+0.154039623 container init ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.640 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-fde616bb-bfd9-477f-b2bf-6ce171da7c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.641 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-fde616bb-bfd9-477f-b2bf-6ce171da7c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.641 186548 DEBUG nova.network.neutron [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:54:58 compute-0 podman[223616]: 2025-11-22 07:54:58.645630261 +0000 UTC m=+0.159833635 container start ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 07:54:58 compute-0 podman[223628]: 2025-11-22 07:54:58.653057794 +0000 UTC m=+0.056433854 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 07:54:58 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223631]: [NOTICE]   (223653) : New worker (223656) forked
Nov 22 07:54:58 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223631]: [NOTICE]   (223653) : Loading success.
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.752 186548 DEBUG nova.compute.manager [req-e096296d-342f-43a3-a4c4-5cdb061af94b req-f19172d4-817f-4bb3-ba19-41285a640e3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received event network-changed-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.752 186548 DEBUG nova.compute.manager [req-e096296d-342f-43a3-a4c4-5cdb061af94b req-f19172d4-817f-4bb3-ba19-41285a640e3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Refreshing instance network info cache due to event network-changed-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:54:58 compute-0 nova_compute[186544]: 2025-11-22 07:54:58.753 186548 DEBUG oslo_concurrency.lockutils [req-e096296d-342f-43a3-a4c4-5cdb061af94b req-f19172d4-817f-4bb3-ba19-41285a640e3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-fde616bb-bfd9-477f-b2bf-6ce171da7c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:54:59 compute-0 nova_compute[186544]: 2025-11-22 07:54:59.151 186548 DEBUG nova.network.neutron [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:55:00 compute-0 nova_compute[186544]: 2025-11-22 07:55:00.399 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:00 compute-0 nova_compute[186544]: 2025-11-22 07:55:00.705 186548 DEBUG nova.compute.manager [req-22020af4-8075-4fd3-832a-149311b5a7f8 req-6fb9f533-ddc0-4bac-81aa-1a0c63761874 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:00 compute-0 nova_compute[186544]: 2025-11-22 07:55:00.706 186548 DEBUG oslo_concurrency.lockutils [req-22020af4-8075-4fd3-832a-149311b5a7f8 req-6fb9f533-ddc0-4bac-81aa-1a0c63761874 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:00 compute-0 nova_compute[186544]: 2025-11-22 07:55:00.706 186548 DEBUG oslo_concurrency.lockutils [req-22020af4-8075-4fd3-832a-149311b5a7f8 req-6fb9f533-ddc0-4bac-81aa-1a0c63761874 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:00 compute-0 nova_compute[186544]: 2025-11-22 07:55:00.706 186548 DEBUG oslo_concurrency.lockutils [req-22020af4-8075-4fd3-832a-149311b5a7f8 req-6fb9f533-ddc0-4bac-81aa-1a0c63761874 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:00 compute-0 nova_compute[186544]: 2025-11-22 07:55:00.707 186548 DEBUG nova.compute.manager [req-22020af4-8075-4fd3-832a-149311b5a7f8 req-6fb9f533-ddc0-4bac-81aa-1a0c63761874 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] No waiting events found dispatching network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:00 compute-0 nova_compute[186544]: 2025-11-22 07:55:00.707 186548 WARNING nova.compute.manager [req-22020af4-8075-4fd3-832a-149311b5a7f8 req-6fb9f533-ddc0-4bac-81aa-1a0c63761874 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received unexpected event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 for instance with vm_state active and task_state None.
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.199 186548 DEBUG nova.network.neutron [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Updating instance_info_cache with network_info: [{"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.276 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-fde616bb-bfd9-477f-b2bf-6ce171da7c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.277 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance network_info: |[{"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.277 186548 DEBUG oslo_concurrency.lockutils [req-e096296d-342f-43a3-a4c4-5cdb061af94b req-f19172d4-817f-4bb3-ba19-41285a640e3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-fde616bb-bfd9-477f-b2bf-6ce171da7c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.278 186548 DEBUG nova.network.neutron [req-e096296d-342f-43a3-a4c4-5cdb061af94b req-f19172d4-817f-4bb3-ba19-41285a640e3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Refreshing network info cache for port cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.281 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Start _get_guest_xml network_info=[{"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.286 186548 WARNING nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.297 186548 DEBUG nova.virt.libvirt.host [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.298 186548 DEBUG nova.virt.libvirt.host [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.305 186548 DEBUG nova.virt.libvirt.host [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.306 186548 DEBUG nova.virt.libvirt.host [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.307 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.307 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.308 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.308 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.308 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.309 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.309 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.309 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.309 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.310 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.310 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.310 186548 DEBUG nova.virt.hardware [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.314 186548 DEBUG nova.virt.libvirt.vif [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890324968',display_name='tempest-ServerDiskConfigTestJSON-server-1890324968',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890324968',id=65,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qfr0g3wu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:53Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=fde616bb-bfd9-477f-b2bf-6ce171da7c4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.315 186548 DEBUG nova.network.os_vif_util [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.315 186548 DEBUG nova.network.os_vif_util [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.317 186548 DEBUG nova.objects.instance [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.606 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <uuid>fde616bb-bfd9-477f-b2bf-6ce171da7c4c</uuid>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <name>instance-00000041</name>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1890324968</nova:name>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:55:01</nova:creationTime>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:55:01 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:55:01 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:55:01 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:55:01 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:55:01 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:55:01 compute-0 nova_compute[186544]:         <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 07:55:01 compute-0 nova_compute[186544]:         <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:55:01 compute-0 nova_compute[186544]:         <nova:port uuid="cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b">
Nov 22 07:55:01 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <system>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <entry name="serial">fde616bb-bfd9-477f-b2bf-6ce171da7c4c</entry>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <entry name="uuid">fde616bb-bfd9-477f-b2bf-6ce171da7c4c</entry>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </system>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <os>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   </os>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <features>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   </features>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.config"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d4:c1:18"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <target dev="tapcc5ed7bc-cf"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/console.log" append="off"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <video>
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </video>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:55:01 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:55:01 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:55:01 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:55:01 compute-0 nova_compute[186544]: </domain>
Nov 22 07:55:01 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.613 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Preparing to wait for external event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.614 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.614 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.615 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.615 186548 DEBUG nova.virt.libvirt.vif [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890324968',display_name='tempest-ServerDiskConfigTestJSON-server-1890324968',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890324968',id=65,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qfr0g3wu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:54:53Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=fde616bb-bfd9-477f-b2bf-6ce171da7c4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.616 186548 DEBUG nova.network.os_vif_util [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.617 186548 DEBUG nova.network.os_vif_util [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.617 186548 DEBUG os_vif [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.618 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.618 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.619 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.621 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.622 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc5ed7bc-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.622 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc5ed7bc-cf, col_values=(('external_ids', {'iface-id': 'cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:c1:18', 'vm-uuid': 'fde616bb-bfd9-477f-b2bf-6ce171da7c4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:01 compute-0 NetworkManager[55036]: <info>  [1763798101.6248] manager: (tapcc5ed7bc-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.626 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.630 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.632 186548 INFO os_vif [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf')
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.715 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.716 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.716 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:d4:c1:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:55:01 compute-0 nova_compute[186544]: 2025-11-22 07:55:01.716 186548 INFO nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Using config drive
Nov 22 07:55:02 compute-0 nova_compute[186544]: 2025-11-22 07:55:02.786 186548 INFO nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Creating config drive at /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.config
Nov 22 07:55:02 compute-0 nova_compute[186544]: 2025-11-22 07:55:02.791 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyu6gg2_h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:02 compute-0 nova_compute[186544]: 2025-11-22 07:55:02.913 186548 DEBUG oslo_concurrency.processutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyu6gg2_h" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:02 compute-0 kernel: tapcc5ed7bc-cf: entered promiscuous mode
Nov 22 07:55:02 compute-0 NetworkManager[55036]: <info>  [1763798102.9613] manager: (tapcc5ed7bc-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Nov 22 07:55:02 compute-0 nova_compute[186544]: 2025-11-22 07:55:02.962 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:02 compute-0 nova_compute[186544]: 2025-11-22 07:55:02.967 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:02 compute-0 ovn_controller[94843]: 2025-11-22T07:55:02Z|00255|binding|INFO|Claiming lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b for this chassis.
Nov 22 07:55:02 compute-0 ovn_controller[94843]: 2025-11-22T07:55:02Z|00256|binding|INFO|cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b: Claiming fa:16:3e:d4:c1:18 10.100.0.4
Nov 22 07:55:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:02.986 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:c1:18 10.100.0.4'], port_security=['fa:16:3e:d4:c1:18 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fde616bb-bfd9-477f-b2bf-6ce171da7c4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:02.987 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis
Nov 22 07:55:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:02.989 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:02 compute-0 systemd-udevd[223686]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.007 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b295c9-476f-48e1-a679-874f6dfa9987]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.009 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:55:03 compute-0 NetworkManager[55036]: <info>  [1763798103.0115] device (tapcc5ed7bc-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.013 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.013 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1df1f8-98cd-4f4f-9fb4-8e407f0f71c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.014 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7f26d259-8e4e-4f51-891d-8d7b7143f464]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 systemd-machined[152872]: New machine qemu-32-instance-00000041.
Nov 22 07:55:03 compute-0 NetworkManager[55036]: <info>  [1763798103.0193] device (tapcc5ed7bc-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.021 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:03 compute-0 ovn_controller[94843]: 2025-11-22T07:55:03Z|00257|binding|INFO|Setting lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b ovn-installed in OVS
Nov 22 07:55:03 compute-0 ovn_controller[94843]: 2025-11-22T07:55:03Z|00258|binding|INFO|Setting lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b up in Southbound
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.025 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.026 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[37f06ad1-f836-47a9-b2f1-9be56a056c31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 systemd[1]: Started Virtual Machine qemu-32-instance-00000041.
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.049 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fac447c8-ae88-4940-ad68-a8ced81eeb70]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.083 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad43f9c-4c44-4837-b4c8-4a80e732b2f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 NetworkManager[55036]: <info>  [1763798103.0930] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.093 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[591d2a4b-6170-46c9-a0fc-6ec31d918cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.124 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[537a34d3-eb7e-4c60-b9a4-68cb88beeec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.128 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4e94d0b0-02f5-4731-b1e9-edb89c1e925e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 NetworkManager[55036]: <info>  [1763798103.1583] device (tapd54e232a-50): carrier: link connected
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.163 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[120a6a08-9a67-4e90-ae12-c9cb7fd89622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.178 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f95074-c6c8-463a-9d36-bd6637f0f0c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484379, 'reachable_time': 41342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223720, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.194 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d3052df0-2662-46fe-bda5-6286863b7ffe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 484379, 'tstamp': 484379}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223721, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.212 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4424fdf4-0551-4d57-9200-d58bd3993b79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484379, 'reachable_time': 41342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223722, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.244 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[526e4997-21c1-4b04-af67-a2b460fcf8bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.302 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2fdac7-9c45-42c4-8fea-f068ab50311b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.304 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.305 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.305 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.307 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:03 compute-0 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 07:55:03 compute-0 NetworkManager[55036]: <info>  [1763798103.3082] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.310 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.311 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.312 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:03 compute-0 ovn_controller[94843]: 2025-11-22T07:55:03Z|00259|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.326 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.327 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.328 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2b22d280-9ad7-4f9b-8a3f-f8473eafa4b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.329 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:55:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:03.329 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:55:03 compute-0 podman[223752]: 2025-11-22 07:55:03.678359557 +0000 UTC m=+0.046551030 container create a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:55:03 compute-0 systemd[1]: Started libpod-conmon-a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15.scope.
Nov 22 07:55:03 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:55:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4af9916c3da199154088f26b2fc62cb1c0092bc3accb6f47e50e766ed7ecca8c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:55:03 compute-0 podman[223752]: 2025-11-22 07:55:03.651956251 +0000 UTC m=+0.020147744 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:55:03 compute-0 podman[223752]: 2025-11-22 07:55:03.751095139 +0000 UTC m=+0.119286622 container init a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:55:03 compute-0 podman[223752]: 2025-11-22 07:55:03.75606061 +0000 UTC m=+0.124252083 container start a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:55:03 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223773]: [NOTICE]   (223778) : New worker (223780) forked
Nov 22 07:55:03 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223773]: [NOTICE]   (223778) : Loading success.
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.800 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798103.799787, fde616bb-bfd9-477f-b2bf-6ce171da7c4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.801 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] VM Started (Lifecycle Event)
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.841 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.845 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798103.8006616, fde616bb-bfd9-477f-b2bf-6ce171da7c4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.846 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] VM Paused (Lifecycle Event)
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.865 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.868 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:03 compute-0 nova_compute[186544]: 2025-11-22 07:55:03.899 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.828 186548 DEBUG nova.compute.manager [req-c491b26e-8c27-457e-ae88-a04b9874df2c req-f55822d6-7b29-4019-ad9b-a3771c89afd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.829 186548 DEBUG oslo_concurrency.lockutils [req-c491b26e-8c27-457e-ae88-a04b9874df2c req-f55822d6-7b29-4019-ad9b-a3771c89afd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.829 186548 DEBUG oslo_concurrency.lockutils [req-c491b26e-8c27-457e-ae88-a04b9874df2c req-f55822d6-7b29-4019-ad9b-a3771c89afd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.830 186548 DEBUG oslo_concurrency.lockutils [req-c491b26e-8c27-457e-ae88-a04b9874df2c req-f55822d6-7b29-4019-ad9b-a3771c89afd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.830 186548 DEBUG nova.compute.manager [req-c491b26e-8c27-457e-ae88-a04b9874df2c req-f55822d6-7b29-4019-ad9b-a3771c89afd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Processing event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.831 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.835 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798104.834303, fde616bb-bfd9-477f-b2bf-6ce171da7c4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.835 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] VM Resumed (Lifecycle Event)
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.838 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.840 186548 INFO nova.virt.libvirt.driver [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance spawned successfully.
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.841 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.859 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.864 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.866 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.867 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.867 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.868 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.868 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.868 186548 DEBUG nova.virt.libvirt.driver [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:04 compute-0 nova_compute[186544]: 2025-11-22 07:55:04.919 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:55:05 compute-0 nova_compute[186544]: 2025-11-22 07:55:05.033 186548 INFO nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Took 11.31 seconds to spawn the instance on the hypervisor.
Nov 22 07:55:05 compute-0 nova_compute[186544]: 2025-11-22 07:55:05.033 186548 DEBUG nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:05 compute-0 nova_compute[186544]: 2025-11-22 07:55:05.160 186548 INFO nova.compute.manager [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Took 12.17 seconds to build instance.
Nov 22 07:55:05 compute-0 nova_compute[186544]: 2025-11-22 07:55:05.190 186548 DEBUG oslo_concurrency.lockutils [None req-bc2c3f26-aaef-4475-92e7-0a722b6cd621 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:05 compute-0 nova_compute[186544]: 2025-11-22 07:55:05.351 186548 DEBUG nova.network.neutron [req-e096296d-342f-43a3-a4c4-5cdb061af94b req-f19172d4-817f-4bb3-ba19-41285a640e3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Updated VIF entry in instance network info cache for port cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:55:05 compute-0 nova_compute[186544]: 2025-11-22 07:55:05.352 186548 DEBUG nova.network.neutron [req-e096296d-342f-43a3-a4c4-5cdb061af94b req-f19172d4-817f-4bb3-ba19-41285a640e3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Updating instance_info_cache with network_info: [{"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:05 compute-0 nova_compute[186544]: 2025-11-22 07:55:05.401 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:05 compute-0 nova_compute[186544]: 2025-11-22 07:55:05.420 186548 DEBUG oslo_concurrency.lockutils [req-e096296d-342f-43a3-a4c4-5cdb061af94b req-f19172d4-817f-4bb3-ba19-41285a640e3f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-fde616bb-bfd9-477f-b2bf-6ce171da7c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:06 compute-0 nova_compute[186544]: 2025-11-22 07:55:06.624 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:08 compute-0 nova_compute[186544]: 2025-11-22 07:55:08.253 186548 DEBUG nova.compute.manager [req-5baf5ced-f8f2-42bb-96e6-314e6d6ffafe req-6b931626-5c33-4dee-9d5f-610a89cd924f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:08 compute-0 nova_compute[186544]: 2025-11-22 07:55:08.254 186548 DEBUG oslo_concurrency.lockutils [req-5baf5ced-f8f2-42bb-96e6-314e6d6ffafe req-6b931626-5c33-4dee-9d5f-610a89cd924f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:08 compute-0 nova_compute[186544]: 2025-11-22 07:55:08.254 186548 DEBUG oslo_concurrency.lockutils [req-5baf5ced-f8f2-42bb-96e6-314e6d6ffafe req-6b931626-5c33-4dee-9d5f-610a89cd924f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:08 compute-0 nova_compute[186544]: 2025-11-22 07:55:08.254 186548 DEBUG oslo_concurrency.lockutils [req-5baf5ced-f8f2-42bb-96e6-314e6d6ffafe req-6b931626-5c33-4dee-9d5f-610a89cd924f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:08 compute-0 nova_compute[186544]: 2025-11-22 07:55:08.254 186548 DEBUG nova.compute.manager [req-5baf5ced-f8f2-42bb-96e6-314e6d6ffafe req-6b931626-5c33-4dee-9d5f-610a89cd924f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] No waiting events found dispatching network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:08 compute-0 nova_compute[186544]: 2025-11-22 07:55:08.254 186548 WARNING nova.compute.manager [req-5baf5ced-f8f2-42bb-96e6-314e6d6ffafe req-6b931626-5c33-4dee-9d5f-610a89cd924f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received unexpected event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b for instance with vm_state active and task_state None.
Nov 22 07:55:09 compute-0 podman[223789]: 2025-11-22 07:55:09.426933804 +0000 UTC m=+0.068636443 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 07:55:09 compute-0 podman[223790]: 2025-11-22 07:55:09.476748553 +0000 UTC m=+0.116859092 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 07:55:10 compute-0 nova_compute[186544]: 2025-11-22 07:55:10.402 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:11 compute-0 nova_compute[186544]: 2025-11-22 07:55:11.626 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:12 compute-0 ovn_controller[94843]: 2025-11-22T07:55:12Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:30:e0 10.100.0.4
Nov 22 07:55:13 compute-0 nova_compute[186544]: 2025-11-22 07:55:13.722 186548 INFO nova.compute.manager [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Rebuilding instance
Nov 22 07:55:14 compute-0 podman[223837]: 2025-11-22 07:55:14.423152085 +0000 UTC m=+0.072403764 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:55:14 compute-0 nova_compute[186544]: 2025-11-22 07:55:14.510 186548 DEBUG nova.compute.manager [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:14 compute-0 nova_compute[186544]: 2025-11-22 07:55:14.625 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_requests' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:14 compute-0 nova_compute[186544]: 2025-11-22 07:55:14.639 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:14 compute-0 nova_compute[186544]: 2025-11-22 07:55:14.652 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:14 compute-0 nova_compute[186544]: 2025-11-22 07:55:14.663 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:14 compute-0 nova_compute[186544]: 2025-11-22 07:55:14.674 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:55:14 compute-0 nova_compute[186544]: 2025-11-22 07:55:14.678 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:55:15 compute-0 nova_compute[186544]: 2025-11-22 07:55:15.404 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:16 compute-0 nova_compute[186544]: 2025-11-22 07:55:16.630 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:17 compute-0 podman[223862]: 2025-11-22 07:55:17.407933659 +0000 UTC m=+0.059820027 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 07:55:18 compute-0 ovn_controller[94843]: 2025-11-22T07:55:18Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:c1:18 10.100.0.4
Nov 22 07:55:18 compute-0 ovn_controller[94843]: 2025-11-22T07:55:18Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:c1:18 10.100.0.4
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.077 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.078 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.078 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.078 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.079 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.088 186548 INFO nova.compute.manager [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Terminating instance
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.095 186548 DEBUG nova.compute.manager [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:55:20 compute-0 kernel: tapbb707b30-96 (unregistering): left promiscuous mode
Nov 22 07:55:20 compute-0 NetworkManager[55036]: <info>  [1763798120.1192] device (tapbb707b30-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.127 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 ovn_controller[94843]: 2025-11-22T07:55:20Z|00260|binding|INFO|Releasing lport bb707b30-96a3-4e7c-abad-b85fc5a10938 from this chassis (sb_readonly=0)
Nov 22 07:55:20 compute-0 ovn_controller[94843]: 2025-11-22T07:55:20Z|00261|binding|INFO|Setting lport bb707b30-96a3-4e7c-abad-b85fc5a10938 down in Southbound
Nov 22 07:55:20 compute-0 ovn_controller[94843]: 2025-11-22T07:55:20Z|00262|binding|INFO|Removing iface tapbb707b30-96 ovn-installed in OVS
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.129 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.140 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:30:e0 10.100.0.4'], port_security=['fa:16:3e:ee:30:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '47f049f6-6113-427d-ba1e-f849dfec302b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4ca2b2e65ac4bf8b3d14f3310a3a7bf', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'cd63e957-ae08-4ca1-9eb9-8ce253173257', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13b92379-ae34-491c-b971-1757bc6e8c79, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=bb707b30-96a3-4e7c-abad-b85fc5a10938) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.143 103805 INFO neutron.agent.ovn.metadata.agent [-] Port bb707b30-96a3-4e7c-abad-b85fc5a10938 in datapath 62930ff4-55a3-4e08-8229-5532aa7dcaed unbound from our chassis
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.144 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62930ff4-55a3-4e08-8229-5532aa7dcaed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.146 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[034a883b-60f4-4717-9edb-c9a425509bcb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.146 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed namespace which is not needed anymore
Nov 22 07:55:20 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Nov 22 07:55:20 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000003e.scope: Consumed 13.921s CPU time.
Nov 22 07:55:20 compute-0 systemd-machined[152872]: Machine qemu-31-instance-0000003e terminated.
Nov 22 07:55:20 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223631]: [NOTICE]   (223653) : haproxy version is 2.8.14-c23fe91
Nov 22 07:55:20 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223631]: [NOTICE]   (223653) : path to executable is /usr/sbin/haproxy
Nov 22 07:55:20 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223631]: [WARNING]  (223653) : Exiting Master process...
Nov 22 07:55:20 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223631]: [ALERT]    (223653) : Current worker (223656) exited with code 143 (Terminated)
Nov 22 07:55:20 compute-0 neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed[223631]: [WARNING]  (223653) : All workers exited. Exiting... (0)
Nov 22 07:55:20 compute-0 systemd[1]: libpod-ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0.scope: Deactivated successfully.
Nov 22 07:55:20 compute-0 podman[223925]: 2025-11-22 07:55:20.278895075 +0000 UTC m=+0.048338035 container died ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.317 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.321 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.355 186548 INFO nova.virt.libvirt.driver [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Instance destroyed successfully.
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.355 186548 DEBUG nova.objects.instance [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lazy-loading 'resources' on Instance uuid 47f049f6-6113-427d-ba1e-f849dfec302b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0-userdata-shm.mount: Deactivated successfully.
Nov 22 07:55:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-db02a6885eaf998ba3c409a9db615ea823d4d24beff70c671b5276cac3214b14-merged.mount: Deactivated successfully.
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.391 186548 DEBUG nova.virt.libvirt.vif [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-30500190',display_name='tempest-ListServerFiltersTestJSON-instance-30500190',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-30500190',id=62,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:54:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b4ca2b2e65ac4bf8b3d14f3310a3a7bf',ramdisk_id='',reservation_id='r-ykmudqwe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1217253496',owner_user_name='tempest-ListServerFiltersTestJSON-1217253496-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:54:58Z,user_data=None,user_id='6d9b8aa760ed4afdbf24f9deb5d29190',uuid=47f049f6-6113-427d-ba1e-f849dfec302b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.391 186548 DEBUG nova.network.os_vif_util [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converting VIF {"id": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "address": "fa:16:3e:ee:30:e0", "network": {"id": "62930ff4-55a3-4e08-8229-5532aa7dcaed", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-130265711-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4ca2b2e65ac4bf8b3d14f3310a3a7bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb707b30-96", "ovs_interfaceid": "bb707b30-96a3-4e7c-abad-b85fc5a10938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.392 186548 DEBUG nova.network.os_vif_util [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.392 186548 DEBUG os_vif [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.394 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.394 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb707b30-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.396 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.399 186548 INFO os_vif [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:30:e0,bridge_name='br-int',has_traffic_filtering=True,id=bb707b30-96a3-4e7c-abad-b85fc5a10938,network=Network(62930ff4-55a3-4e08-8229-5532aa7dcaed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb707b30-96')
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.400 186548 INFO nova.virt.libvirt.driver [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Deleting instance files /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b_del
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.400 186548 INFO nova.virt.libvirt.driver [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Deletion of /var/lib/nova/instances/47f049f6-6113-427d-ba1e-f849dfec302b_del complete
Nov 22 07:55:20 compute-0 podman[223925]: 2025-11-22 07:55:20.405661279 +0000 UTC m=+0.175104229 container cleanup ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.405 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 systemd[1]: libpod-conmon-ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0.scope: Deactivated successfully.
Nov 22 07:55:20 compute-0 podman[223970]: 2025-11-22 07:55:20.485499974 +0000 UTC m=+0.058152895 container remove ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.490 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[624ed13a-3bf6-4b5f-9553-cef8a806f503]: (4, ('Sat Nov 22 07:55:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed (ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0)\nae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0\nSat Nov 22 07:55:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed (ae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0)\nae9df771339e9dd7c6309c607aeb15e79752cf4fd257197588d996f1e856fdf0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.492 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9049c601-f40f-4a38-8db1-c1d212d819fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.493 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62930ff4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.494 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 kernel: tap62930ff4-50: left promiscuous mode
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.500 186548 INFO nova.compute.manager [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.501 186548 DEBUG oslo.service.loopingcall [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.501 186548 DEBUG nova.compute.manager [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.501 186548 DEBUG nova.network.neutron [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.508 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.511 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6f888e-a4e6-4085-be47-23c202cf47d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.526 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c4d0fe-572d-41e2-bfb8-77022664ed21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.527 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8bdf25-5385-45dd-8208-7d59c55720f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.541 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ad93e4-7c5c-4ccb-99b9-806f9fdc1c99]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483852, 'reachable_time': 39239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223987, 'error': None, 'target': 'ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.543 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62930ff4-55a3-4e08-8229-5532aa7dcaed deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:55:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:20.544 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[63365ed3-4f51-4468-9e86-7dc83184fb21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d62930ff4\x2d55a3\x2d4e08\x2d8229\x2d5532aa7dcaed.mount: Deactivated successfully.
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.576 186548 DEBUG nova.compute.manager [req-bc874a45-f02d-4dd3-b4ce-be4ad8336f8c req-4bd19c30-b848-4a16-a524-d205208df8cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-unplugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.576 186548 DEBUG oslo_concurrency.lockutils [req-bc874a45-f02d-4dd3-b4ce-be4ad8336f8c req-4bd19c30-b848-4a16-a524-d205208df8cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.577 186548 DEBUG oslo_concurrency.lockutils [req-bc874a45-f02d-4dd3-b4ce-be4ad8336f8c req-4bd19c30-b848-4a16-a524-d205208df8cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.577 186548 DEBUG oslo_concurrency.lockutils [req-bc874a45-f02d-4dd3-b4ce-be4ad8336f8c req-4bd19c30-b848-4a16-a524-d205208df8cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.577 186548 DEBUG nova.compute.manager [req-bc874a45-f02d-4dd3-b4ce-be4ad8336f8c req-4bd19c30-b848-4a16-a524-d205208df8cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] No waiting events found dispatching network-vif-unplugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:20 compute-0 nova_compute[186544]: 2025-11-22 07:55:20.577 186548 DEBUG nova.compute.manager [req-bc874a45-f02d-4dd3-b4ce-be4ad8336f8c req-4bd19c30-b848-4a16-a524-d205208df8cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-unplugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:55:21 compute-0 nova_compute[186544]: 2025-11-22 07:55:21.686 186548 DEBUG nova.network.neutron [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:21 compute-0 nova_compute[186544]: 2025-11-22 07:55:21.721 186548 INFO nova.compute.manager [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Took 1.22 seconds to deallocate network for instance.
Nov 22 07:55:21 compute-0 nova_compute[186544]: 2025-11-22 07:55:21.832 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:21 compute-0 nova_compute[186544]: 2025-11-22 07:55:21.832 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:21 compute-0 nova_compute[186544]: 2025-11-22 07:55:21.933 186548 DEBUG nova.compute.provider_tree [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:55:21 compute-0 nova_compute[186544]: 2025-11-22 07:55:21.955 186548 DEBUG nova.scheduler.client.report [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:55:21 compute-0 nova_compute[186544]: 2025-11-22 07:55:21.991 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:22 compute-0 nova_compute[186544]: 2025-11-22 07:55:22.049 186548 INFO nova.scheduler.client.report [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Deleted allocations for instance 47f049f6-6113-427d-ba1e-f849dfec302b
Nov 22 07:55:22 compute-0 nova_compute[186544]: 2025-11-22 07:55:22.139 186548 DEBUG oslo_concurrency.lockutils [None req-3a26c3d0-3f32-4dbc-a6fb-227062d9cade 6d9b8aa760ed4afdbf24f9deb5d29190 b4ca2b2e65ac4bf8b3d14f3310a3a7bf - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:22 compute-0 podman[223988]: 2025-11-22 07:55:22.40415488 +0000 UTC m=+0.053485960 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 22 07:55:22 compute-0 nova_compute[186544]: 2025-11-22 07:55:22.655 186548 DEBUG nova.compute.manager [req-b0d9bd9d-bdcd-424c-b960-d8d4ba14601d req-59f70b09-1150-4e3c-a81f-a495a3e06f1d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:22 compute-0 nova_compute[186544]: 2025-11-22 07:55:22.655 186548 DEBUG oslo_concurrency.lockutils [req-b0d9bd9d-bdcd-424c-b960-d8d4ba14601d req-59f70b09-1150-4e3c-a81f-a495a3e06f1d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:22 compute-0 nova_compute[186544]: 2025-11-22 07:55:22.655 186548 DEBUG oslo_concurrency.lockutils [req-b0d9bd9d-bdcd-424c-b960-d8d4ba14601d req-59f70b09-1150-4e3c-a81f-a495a3e06f1d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:22 compute-0 nova_compute[186544]: 2025-11-22 07:55:22.656 186548 DEBUG oslo_concurrency.lockutils [req-b0d9bd9d-bdcd-424c-b960-d8d4ba14601d req-59f70b09-1150-4e3c-a81f-a495a3e06f1d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "47f049f6-6113-427d-ba1e-f849dfec302b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:22 compute-0 nova_compute[186544]: 2025-11-22 07:55:22.656 186548 DEBUG nova.compute.manager [req-b0d9bd9d-bdcd-424c-b960-d8d4ba14601d req-59f70b09-1150-4e3c-a81f-a495a3e06f1d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] No waiting events found dispatching network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:22 compute-0 nova_compute[186544]: 2025-11-22 07:55:22.656 186548 WARNING nova.compute.manager [req-b0d9bd9d-bdcd-424c-b960-d8d4ba14601d req-59f70b09-1150-4e3c-a81f-a495a3e06f1d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received unexpected event network-vif-plugged-bb707b30-96a3-4e7c-abad-b85fc5a10938 for instance with vm_state deleted and task_state None.
Nov 22 07:55:23 compute-0 nova_compute[186544]: 2025-11-22 07:55:23.817 186548 DEBUG nova.compute.manager [req-0a9ed0f4-8221-4dea-9eeb-13ce5fd24d3f req-900edc33-3b78-477d-b953-433af75d325b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Received event network-vif-deleted-bb707b30-96a3-4e7c-abad-b85fc5a10938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:24 compute-0 ovn_controller[94843]: 2025-11-22T07:55:24Z|00263|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 07:55:24 compute-0 nova_compute[186544]: 2025-11-22 07:55:24.559 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:24 compute-0 nova_compute[186544]: 2025-11-22 07:55:24.726 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 07:55:25 compute-0 nova_compute[186544]: 2025-11-22 07:55:25.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:25 compute-0 nova_compute[186544]: 2025-11-22 07:55:25.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:26 compute-0 podman[224009]: 2025-11-22 07:55:26.455328451 +0000 UTC m=+0.092519930 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:55:26 compute-0 kernel: tapcc5ed7bc-cf (unregistering): left promiscuous mode
Nov 22 07:55:26 compute-0 NetworkManager[55036]: <info>  [1763798126.9044] device (tapcc5ed7bc-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:55:26 compute-0 ovn_controller[94843]: 2025-11-22T07:55:26Z|00264|binding|INFO|Releasing lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b from this chassis (sb_readonly=0)
Nov 22 07:55:26 compute-0 ovn_controller[94843]: 2025-11-22T07:55:26Z|00265|binding|INFO|Setting lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b down in Southbound
Nov 22 07:55:26 compute-0 nova_compute[186544]: 2025-11-22 07:55:26.909 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:26 compute-0 ovn_controller[94843]: 2025-11-22T07:55:26Z|00266|binding|INFO|Removing iface tapcc5ed7bc-cf ovn-installed in OVS
Nov 22 07:55:26 compute-0 nova_compute[186544]: 2025-11-22 07:55:26.911 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:26 compute-0 nova_compute[186544]: 2025-11-22 07:55:26.925 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:26.954 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:c1:18 10.100.0.4'], port_security=['fa:16:3e:d4:c1:18 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fde616bb-bfd9-477f-b2bf-6ce171da7c4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:26.956 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis
Nov 22 07:55:26 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000041.scope: Deactivated successfully.
Nov 22 07:55:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:26.957 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:55:26 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000041.scope: Consumed 15.255s CPU time.
Nov 22 07:55:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:26.958 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[53a5d683-28ef-4114-8a66-7719e77ebb9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:26.958 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore
Nov 22 07:55:26 compute-0 systemd-machined[152872]: Machine qemu-32-instance-00000041 terminated.
Nov 22 07:55:27 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223773]: [NOTICE]   (223778) : haproxy version is 2.8.14-c23fe91
Nov 22 07:55:27 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223773]: [NOTICE]   (223778) : path to executable is /usr/sbin/haproxy
Nov 22 07:55:27 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223773]: [WARNING]  (223778) : Exiting Master process...
Nov 22 07:55:27 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223773]: [ALERT]    (223778) : Current worker (223780) exited with code 143 (Terminated)
Nov 22 07:55:27 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[223773]: [WARNING]  (223778) : All workers exited. Exiting... (0)
Nov 22 07:55:27 compute-0 systemd[1]: libpod-a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15.scope: Deactivated successfully.
Nov 22 07:55:27 compute-0 conmon[223773]: conmon a43e837134e5c30f78a8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15.scope/container/memory.events
Nov 22 07:55:27 compute-0 podman[224059]: 2025-11-22 07:55:27.099020699 +0000 UTC m=+0.048388933 container died a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 07:55:27 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15-userdata-shm.mount: Deactivated successfully.
Nov 22 07:55:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-4af9916c3da199154088f26b2fc62cb1c0092bc3accb6f47e50e766ed7ecca8c-merged.mount: Deactivated successfully.
Nov 22 07:55:27 compute-0 podman[224059]: 2025-11-22 07:55:27.136557674 +0000 UTC m=+0.085925898 container cleanup a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:55:27 compute-0 systemd[1]: libpod-conmon-a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15.scope: Deactivated successfully.
Nov 22 07:55:27 compute-0 podman[224099]: 2025-11-22 07:55:27.201978406 +0000 UTC m=+0.041847422 container remove a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.207 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0e4c48-94ea-4fab-817c-9d67d7da6f71]: (4, ('Sat Nov 22 07:55:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15)\na43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15\nSat Nov 22 07:55:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (a43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15)\na43e837134e5c30f78a88b7655df5f868fe628921d9fd98c0c3cedbfe90c0d15\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.210 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c497df-8d77-423a-bedd-6b98057a40f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.211 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:27 compute-0 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.231 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.234 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8bdcc7db-e2a2-4b56-afae-08c4c17f1054]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.249 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1fffa69f-f799-4a66-a0d5-b96c2beecc1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.250 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f625a11f-6282-4377-ab80-bb027a6d759a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.265 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[167332fd-31fa-48a8-9c78-e2674e27eda8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484371, 'reachable_time': 27139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224123, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.267 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:55:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:27.267 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[819a2b87-7ee8-4b2c-b2e3-c5a77b41e53b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:27 compute-0 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.739 186548 INFO nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance shutdown successfully after 13 seconds.
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.745 186548 INFO nova.virt.libvirt.driver [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance destroyed successfully.
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.749 186548 INFO nova.virt.libvirt.driver [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance destroyed successfully.
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.750 186548 DEBUG nova.virt.libvirt.vif [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890324968',display_name='tempest-ServerDiskConfigTestJSON-server-1890324968',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890324968',id=65,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qfr0g3wu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:12Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=fde616bb-bfd9-477f-b2bf-6ce171da7c4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.751 186548 DEBUG nova.network.os_vif_util [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.752 186548 DEBUG nova.network.os_vif_util [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.752 186548 DEBUG os_vif [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.753 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.754 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc5ed7bc-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.755 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.756 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.759 186548 INFO os_vif [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf')
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.759 186548 INFO nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Deleting instance files /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c_del
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.760 186548 INFO nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Deletion of /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c_del complete
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.990 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.991 186548 INFO nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Creating image(s)
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.992 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.993 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:27 compute-0 nova_compute[186544]: 2025-11-22 07:55:27.994 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.012 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.069 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.071 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.071 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.088 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.142 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.143 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.178 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.178 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.179 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.244 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.245 186548 DEBUG nova.virt.disk.api [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.246 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.334 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.335 186548 DEBUG nova.virt.disk.api [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.336 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.336 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Ensure instance console log exists: /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.337 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.337 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.337 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.340 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Start _get_guest_xml network_info=[{"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.347 186548 WARNING nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.352 186548 DEBUG nova.virt.libvirt.host [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.353 186548 DEBUG nova.virt.libvirt.host [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.356 186548 DEBUG nova.virt.libvirt.host [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.357 186548 DEBUG nova.virt.libvirt.host [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.358 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.359 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.359 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.360 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.360 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.360 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.361 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.361 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.361 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.362 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.362 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.362 186548 DEBUG nova.virt.hardware [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.362 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.382 186548 DEBUG nova.virt.libvirt.vif [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890324968',display_name='tempest-ServerDiskConfigTestJSON-server-1890324968',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890324968',id=65,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qfr0g3wu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-
ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:27Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=fde616bb-bfd9-477f-b2bf-6ce171da7c4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.382 186548 DEBUG nova.network.os_vif_util [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.383 186548 DEBUG nova.network.os_vif_util [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.384 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <uuid>fde616bb-bfd9-477f-b2bf-6ce171da7c4c</uuid>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <name>instance-00000041</name>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1890324968</nova:name>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:55:28</nova:creationTime>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:55:28 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:55:28 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:55:28 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:55:28 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:55:28 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:55:28 compute-0 nova_compute[186544]:         <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 07:55:28 compute-0 nova_compute[186544]:         <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:55:28 compute-0 nova_compute[186544]:         <nova:port uuid="cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b">
Nov 22 07:55:28 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <system>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <entry name="serial">fde616bb-bfd9-477f-b2bf-6ce171da7c4c</entry>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <entry name="uuid">fde616bb-bfd9-477f-b2bf-6ce171da7c4c</entry>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </system>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <os>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   </os>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <features>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   </features>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.config"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d4:c1:18"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <target dev="tapcc5ed7bc-cf"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/console.log" append="off"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <video>
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </video>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:55:28 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:55:28 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:55:28 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:55:28 compute-0 nova_compute[186544]: </domain>
Nov 22 07:55:28 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.386 186548 DEBUG nova.compute.manager [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Preparing to wait for external event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.386 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.386 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.386 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.387 186548 DEBUG nova.virt.libvirt.vif [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890324968',display_name='tempest-ServerDiskConfigTestJSON-server-1890324968',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890324968',id=65,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qfr0g3wu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:27Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=fde616bb-bfd9-477f-b2bf-6ce171da7c4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.387 186548 DEBUG nova.network.os_vif_util [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.387 186548 DEBUG nova.network.os_vif_util [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.388 186548 DEBUG os_vif [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.388 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.389 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.389 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.391 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.391 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc5ed7bc-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.392 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc5ed7bc-cf, col_values=(('external_ids', {'iface-id': 'cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:c1:18', 'vm-uuid': 'fde616bb-bfd9-477f-b2bf-6ce171da7c4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.393 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:28 compute-0 NetworkManager[55036]: <info>  [1763798128.3946] manager: (tapcc5ed7bc-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.395 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.398 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.399 186548 INFO os_vif [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf')
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.450 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.451 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.451 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:d4:c1:18, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.452 186548 INFO nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Using config drive
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.469 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:28 compute-0 nova_compute[186544]: 2025-11-22 07:55:28.511 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'keypairs' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:29 compute-0 podman[224142]: 2025-11-22 07:55:29.413076127 +0000 UTC m=+0.063755672 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public)
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.501 186548 INFO nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Creating config drive at /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.config
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.505 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeiz8w4n0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.630 186548 DEBUG oslo_concurrency.processutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeiz8w4n0" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:29 compute-0 kernel: tapcc5ed7bc-cf: entered promiscuous mode
Nov 22 07:55:29 compute-0 systemd-udevd[224039]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:55:29 compute-0 ovn_controller[94843]: 2025-11-22T07:55:29Z|00267|binding|INFO|Claiming lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b for this chassis.
Nov 22 07:55:29 compute-0 ovn_controller[94843]: 2025-11-22T07:55:29Z|00268|binding|INFO|cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b: Claiming fa:16:3e:d4:c1:18 10.100.0.4
Nov 22 07:55:29 compute-0 NetworkManager[55036]: <info>  [1763798129.6863] manager: (tapcc5ed7bc-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.686 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.691 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:c1:18 10.100.0.4'], port_security=['fa:16:3e:d4:c1:18 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fde616bb-bfd9-477f-b2bf-6ce171da7c4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.692 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.693 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:29 compute-0 NetworkManager[55036]: <info>  [1763798129.6983] device (tapcc5ed7bc-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:55:29 compute-0 ovn_controller[94843]: 2025-11-22T07:55:29Z|00269|binding|INFO|Setting lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b ovn-installed in OVS
Nov 22 07:55:29 compute-0 ovn_controller[94843]: 2025-11-22T07:55:29Z|00270|binding|INFO|Setting lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b up in Southbound
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.700 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:29 compute-0 NetworkManager[55036]: <info>  [1763798129.7014] device (tapcc5ed7bc-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.703 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[24a0887c-3995-44d2-8917-30b5771294c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.704 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.704 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.706 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.706 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ab3917-a306-48b1-8f93-8362c0e34b80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.708 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[46c78c0b-f847-43d0-a152-b14f100ebf44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.721 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[433fd4ee-5b90-45d3-93d8-43ba1ed59731]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 systemd-machined[152872]: New machine qemu-33-instance-00000041.
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.734 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b54ddb9d-e210-478c-b483-e21b305209e1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 systemd[1]: Started Virtual Machine qemu-33-instance-00000041.
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.760 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[7b07aa08-fb2e-46fd-8d79-bbca8cb948cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.765 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3d7444-5bb7-4b9f-aedd-cb8e4353fa8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 NetworkManager[55036]: <info>  [1763798129.7679] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.793 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[caebbaf9-3e72-4f06-91e7-6263c31bf63c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.795 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e92907f6-05e5-4b28-9eac-e83cfcf25c4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 NetworkManager[55036]: <info>  [1763798129.8160] device (tapd54e232a-50): carrier: link connected
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.821 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf9e498-75ec-4ddd-a916-780bff050b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.836 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[93abe54e-8760-4be7-b615-edad622d0d08]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487045, 'reachable_time': 42199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224213, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.850 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[afbb27aa-5c37-4b2f-b079-02e0cb40542d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487045, 'tstamp': 487045}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224214, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.866 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[647b24e5-f657-4445-bd85-263e099acac2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487045, 'reachable_time': 42199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224215, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.896 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[df0eeb47-4540-417b-b373-7177ccaddac3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.964 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[461be0f7-953e-480a-8f25-635cc65f37e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.966 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.966 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.967 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:29 compute-0 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 07:55:29 compute-0 NetworkManager[55036]: <info>  [1763798129.9710] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.976 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:29 compute-0 ovn_controller[94843]: 2025-11-22T07:55:29Z|00271|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.977 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.980 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.981 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[31782d61-e689-4989-9f3d-0497b3252a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.981 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:55:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:29.982 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:55:29 compute-0 nova_compute[186544]: 2025-11-22 07:55:29.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.323 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for fde616bb-bfd9-477f-b2bf-6ce171da7c4c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.324 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798130.3225138, fde616bb-bfd9-477f-b2bf-6ce171da7c4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.324 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] VM Started (Lifecycle Event)
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.346 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:30 compute-0 podman[224255]: 2025-11-22 07:55:30.350020019 +0000 UTC m=+0.060659135 container create ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.350 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798130.3236766, fde616bb-bfd9-477f-b2bf-6ce171da7c4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.350 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] VM Paused (Lifecycle Event)
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.374 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.377 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:30 compute-0 systemd[1]: Started libpod-conmon-ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589.scope.
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.402 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 07:55:30 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:55:30 compute-0 podman[224255]: 2025-11-22 07:55:30.310996397 +0000 UTC m=+0.021635533 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:55:30 compute-0 nova_compute[186544]: 2025-11-22 07:55:30.408 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34f4707e8e4d1e7685e2832f24a15f8b42112650fafd0872b20aa818f97df8d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:55:30 compute-0 podman[224255]: 2025-11-22 07:55:30.431184648 +0000 UTC m=+0.141823784 container init ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:30 compute-0 podman[224255]: 2025-11-22 07:55:30.438801895 +0000 UTC m=+0.149441011 container start ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:55:30 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224271]: [NOTICE]   (224275) : New worker (224277) forked
Nov 22 07:55:30 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224271]: [NOTICE]   (224275) : Loading success.
Nov 22 07:55:31 compute-0 nova_compute[186544]: 2025-11-22 07:55:31.432 186548 DEBUG nova.compute.manager [req-c10297e9-0316-4367-8bbf-91b4be1ad4ed req-70197202-637c-406e-995a-e86fbf16211e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received event network-vif-unplugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:31 compute-0 nova_compute[186544]: 2025-11-22 07:55:31.432 186548 DEBUG oslo_concurrency.lockutils [req-c10297e9-0316-4367-8bbf-91b4be1ad4ed req-70197202-637c-406e-995a-e86fbf16211e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:31 compute-0 nova_compute[186544]: 2025-11-22 07:55:31.432 186548 DEBUG oslo_concurrency.lockutils [req-c10297e9-0316-4367-8bbf-91b4be1ad4ed req-70197202-637c-406e-995a-e86fbf16211e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:31 compute-0 nova_compute[186544]: 2025-11-22 07:55:31.432 186548 DEBUG oslo_concurrency.lockutils [req-c10297e9-0316-4367-8bbf-91b4be1ad4ed req-70197202-637c-406e-995a-e86fbf16211e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:31 compute-0 nova_compute[186544]: 2025-11-22 07:55:31.433 186548 DEBUG nova.compute.manager [req-c10297e9-0316-4367-8bbf-91b4be1ad4ed req-70197202-637c-406e-995a-e86fbf16211e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] No event matching network-vif-unplugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b in dict_keys([('network-vif-plugged', 'cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Nov 22 07:55:31 compute-0 nova_compute[186544]: 2025-11-22 07:55:31.433 186548 WARNING nova.compute.manager [req-c10297e9-0316-4367-8bbf-91b4be1ad4ed req-70197202-637c-406e-995a-e86fbf16211e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received unexpected event network-vif-unplugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b for instance with vm_state active and task_state rebuild_spawning.
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.396 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.555 186548 DEBUG nova.compute.manager [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.555 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.556 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.556 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.556 186548 DEBUG nova.compute.manager [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Processing event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.556 186548 DEBUG nova.compute.manager [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.556 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.556 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.557 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.557 186548 DEBUG nova.compute.manager [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] No waiting events found dispatching network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.557 186548 WARNING nova.compute.manager [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received unexpected event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b for instance with vm_state active and task_state rebuild_spawning.
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.557 186548 DEBUG nova.compute.manager [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.558 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.558 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.558 186548 DEBUG oslo_concurrency.lockutils [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.558 186548 DEBUG nova.compute.manager [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] No waiting events found dispatching network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.558 186548 WARNING nova.compute.manager [req-2c0c403a-17b6-4aac-8e50-7400b0967966 req-59df317c-bd97-4fda-8181-8d97711fc93e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received unexpected event network-vif-plugged-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b for instance with vm_state active and task_state rebuild_spawning.
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.559 186548 DEBUG nova.compute.manager [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.562 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798133.5621843, fde616bb-bfd9-477f-b2bf-6ce171da7c4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.563 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] VM Resumed (Lifecycle Event)
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.564 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.567 186548 INFO nova.virt.libvirt.driver [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance spawned successfully.
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.567 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.595 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.597 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.597 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.597 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.598 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.598 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.598 186548 DEBUG nova.virt.libvirt.driver [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.604 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.635 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.686 186548 DEBUG nova.compute.manager [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.751 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.751 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.751 186548 DEBUG nova.objects.instance [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 07:55:33 compute-0 nova_compute[186544]: 2025-11-22 07:55:33.873 186548 DEBUG oslo_concurrency.lockutils [None req-0925dd00-52db-401f-815c-8d47867e3ba0 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:35 compute-0 nova_compute[186544]: 2025-11-22 07:55:35.354 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798120.3530493, 47f049f6-6113-427d-ba1e-f849dfec302b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:35 compute-0 nova_compute[186544]: 2025-11-22 07:55:35.355 186548 INFO nova.compute.manager [-] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] VM Stopped (Lifecycle Event)
Nov 22 07:55:35 compute-0 nova_compute[186544]: 2025-11-22 07:55:35.375 186548 DEBUG nova.compute.manager [None req-d06d4edd-16b3-4850-8ae0-53377183eba3 - - - - - -] [instance: 47f049f6-6113-427d-ba1e-f849dfec302b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:35 compute-0 nova_compute[186544]: 2025-11-22 07:55:35.410 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.041 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.041 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.042 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.042 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.042 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.050 186548 INFO nova.compute.manager [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Terminating instance
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.056 186548 DEBUG nova.compute.manager [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:55:36 compute-0 kernel: tapcc5ed7bc-cf (unregistering): left promiscuous mode
Nov 22 07:55:36 compute-0 NetworkManager[55036]: <info>  [1763798136.0789] device (tapcc5ed7bc-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:55:36 compute-0 ovn_controller[94843]: 2025-11-22T07:55:36Z|00272|binding|INFO|Releasing lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b from this chassis (sb_readonly=0)
Nov 22 07:55:36 compute-0 ovn_controller[94843]: 2025-11-22T07:55:36Z|00273|binding|INFO|Setting lport cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b down in Southbound
Nov 22 07:55:36 compute-0 ovn_controller[94843]: 2025-11-22T07:55:36Z|00274|binding|INFO|Removing iface tapcc5ed7bc-cf ovn-installed in OVS
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.086 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.103 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.113 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:c1:18 10.100.0.4'], port_security=['fa:16:3e:d4:c1:18 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fde616bb-bfd9-477f-b2bf-6ce171da7c4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.114 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.116 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.117 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d0dc1e-f9ad-4e48-ab08-012ebe192ee1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.117 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore
Nov 22 07:55:36 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000041.scope: Deactivated successfully.
Nov 22 07:55:36 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000041.scope: Consumed 3.073s CPU time.
Nov 22 07:55:36 compute-0 systemd-machined[152872]: Machine qemu-33-instance-00000041 terminated.
Nov 22 07:55:36 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224271]: [NOTICE]   (224275) : haproxy version is 2.8.14-c23fe91
Nov 22 07:55:36 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224271]: [NOTICE]   (224275) : path to executable is /usr/sbin/haproxy
Nov 22 07:55:36 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224271]: [WARNING]  (224275) : Exiting Master process...
Nov 22 07:55:36 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224271]: [WARNING]  (224275) : Exiting Master process...
Nov 22 07:55:36 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224271]: [ALERT]    (224275) : Current worker (224277) exited with code 143 (Terminated)
Nov 22 07:55:36 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224271]: [WARNING]  (224275) : All workers exited. Exiting... (0)
Nov 22 07:55:36 compute-0 systemd[1]: libpod-ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589.scope: Deactivated successfully.
Nov 22 07:55:36 compute-0 podman[224310]: 2025-11-22 07:55:36.257174544 +0000 UTC m=+0.048297720 container died ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:55:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589-userdata-shm.mount: Deactivated successfully.
Nov 22 07:55:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-34f4707e8e4d1e7685e2832f24a15f8b42112650fafd0872b20aa818f97df8d1-merged.mount: Deactivated successfully.
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.313 186548 INFO nova.virt.libvirt.driver [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Instance destroyed successfully.
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.315 186548 DEBUG nova.objects.instance [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid fde616bb-bfd9-477f-b2bf-6ce171da7c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:36 compute-0 podman[224310]: 2025-11-22 07:55:36.319149251 +0000 UTC m=+0.110272437 container cleanup ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:36 compute-0 systemd[1]: libpod-conmon-ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589.scope: Deactivated successfully.
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.333 186548 DEBUG nova.virt.libvirt.vif [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T07:54:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1890324968',display_name='tempest-ServerDiskConfigTestJSON-server-1890324968',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1890324968',id=65,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-qfr0g3wu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:55:33Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=fde616bb-bfd9-477f-b2bf-6ce171da7c4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.334 186548 DEBUG nova.network.os_vif_util [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "address": "fa:16:3e:d4:c1:18", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc5ed7bc-cf", "ovs_interfaceid": "cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.334 186548 DEBUG nova.network.os_vif_util [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.335 186548 DEBUG os_vif [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.336 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.336 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc5ed7bc-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.338 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.341 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.343 186548 INFO os_vif [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:c1:18,bridge_name='br-int',has_traffic_filtering=True,id=cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc5ed7bc-cf')
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.343 186548 INFO nova.virt.libvirt.driver [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Deleting instance files /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c_del
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.344 186548 INFO nova.virt.libvirt.driver [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Deletion of /var/lib/nova/instances/fde616bb-bfd9-477f-b2bf-6ce171da7c4c_del complete
Nov 22 07:55:36 compute-0 podman[224356]: 2025-11-22 07:55:36.393547273 +0000 UTC m=+0.049456649 container remove ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.398 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[43c17c5d-62cc-425c-9e8d-6707bcf817c7]: (4, ('Sat Nov 22 07:55:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589)\nae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589\nSat Nov 22 07:55:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (ae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589)\nae86d07cafc7a25bbe7b6629579c108399daf90c69d41ea2af534e8d4cafc589\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.400 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f568d15b-edd7-4cf5-8ca9-493fff81bdee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.401 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:36 compute-0 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.408 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd1e181-e2e1-435e-b8a6-45077ba7d446]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:36 compute-0 nova_compute[186544]: 2025-11-22 07:55:36.416 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.424 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4d760858-1243-496e-b03e-2b342024fbde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.425 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[38339b4f-96c8-458a-bacf-0ac99829cf89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.441 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7a44b7c7-eb9b-4669-b5d4-a47f47f8bc0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487039, 'reachable_time': 34229, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224371, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.443 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:55:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:36.443 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[6e459e3c-08ed-4985-ad6c-15d5c8e32389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:36 compute-0 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 07:55:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:37.323 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:37.323 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:37.323 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:37 compute-0 nova_compute[186544]: 2025-11-22 07:55:37.641 186548 INFO nova.compute.manager [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Took 1.58 seconds to destroy the instance on the hypervisor.
Nov 22 07:55:37 compute-0 nova_compute[186544]: 2025-11-22 07:55:37.642 186548 DEBUG oslo.service.loopingcall [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:55:37 compute-0 nova_compute[186544]: 2025-11-22 07:55:37.642 186548 DEBUG nova.compute.manager [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:55:37 compute-0 nova_compute[186544]: 2025-11-22 07:55:37.643 186548 DEBUG nova.network.neutron [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:55:38 compute-0 nova_compute[186544]: 2025-11-22 07:55:38.822 186548 DEBUG nova.network.neutron [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:38 compute-0 nova_compute[186544]: 2025-11-22 07:55:38.848 186548 INFO nova.compute.manager [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Took 1.21 seconds to deallocate network for instance.
Nov 22 07:55:38 compute-0 nova_compute[186544]: 2025-11-22 07:55:38.949 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:38 compute-0 nova_compute[186544]: 2025-11-22 07:55:38.950 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:38 compute-0 nova_compute[186544]: 2025-11-22 07:55:38.981 186548 DEBUG nova.compute.manager [req-9f16dd28-74dc-4bd3-9aff-56d9ea9df099 req-461ac552-0e0e-406b-a2e6-23d6528bbace 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Received event network-vif-deleted-cc5ed7bc-cfcd-42a5-9a4b-f371d409c87b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:38 compute-0 nova_compute[186544]: 2025-11-22 07:55:38.987 186548 DEBUG nova.scheduler.client.report [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.020 186548 DEBUG nova.scheduler.client.report [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.021 186548 DEBUG nova.compute.provider_tree [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.045 186548 DEBUG nova.scheduler.client.report [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.073 186548 DEBUG nova.scheduler.client.report [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.124 186548 DEBUG nova.compute.provider_tree [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.137 186548 DEBUG nova.scheduler.client.report [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.165 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.219 186548 INFO nova.scheduler.client.report [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Deleted allocations for instance fde616bb-bfd9-477f-b2bf-6ce171da7c4c
Nov 22 07:55:39 compute-0 nova_compute[186544]: 2025-11-22 07:55:39.331 186548 DEBUG oslo_concurrency.lockutils [None req-9fbfc0e1-e143-4b76-a6c8-83dd14d62a82 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "fde616bb-bfd9-477f-b2bf-6ce171da7c4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:40 compute-0 nova_compute[186544]: 2025-11-22 07:55:40.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:40 compute-0 nova_compute[186544]: 2025-11-22 07:55:40.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:40 compute-0 nova_compute[186544]: 2025-11-22 07:55:40.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:55:40 compute-0 nova_compute[186544]: 2025-11-22 07:55:40.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:55:40 compute-0 nova_compute[186544]: 2025-11-22 07:55:40.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:55:40 compute-0 podman[224372]: 2025-11-22 07:55:40.409038657 +0000 UTC m=+0.060145482 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:40 compute-0 nova_compute[186544]: 2025-11-22 07:55:40.412 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:40 compute-0 podman[224373]: 2025-11-22 07:55:40.440199306 +0000 UTC m=+0.088489872 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.362 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.362 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.379 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.538 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.538 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.545 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.546 186548 INFO nova.compute.claims [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.712 186548 DEBUG nova.compute.provider_tree [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.729 186548 DEBUG nova.scheduler.client.report [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.754 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.755 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.817 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.818 186548 DEBUG nova.network.neutron [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.848 186548 INFO nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:55:41 compute-0 nova_compute[186544]: 2025-11-22 07:55:41.873 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.047 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.049 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.049 186548 INFO nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Creating image(s)
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.050 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.050 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.051 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.063 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.117 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.118 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.119 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.130 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.180 186548 DEBUG nova.policy [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e24c302b62fb470aa189b76d4676733b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '063bf16c91af408ca075c690797e09d8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.184 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.185 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.206 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.207 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.207 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.208 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.258 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk 1073741824" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.259 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.260 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.316 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.318 186548 DEBUG nova.virt.disk.api [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.318 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.380 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.381 186548 DEBUG nova.virt.disk.api [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.381 186548 DEBUG nova.objects.instance [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.403 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.404 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Ensure instance console log exists: /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.404 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.404 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.404 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.439 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.440 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5700MB free_disk=73.34916687011719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.441 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.441 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.554 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 5eafb037-41a2-463f-9d3a-1b4248cb00f2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.555 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.555 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.623 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.633 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.657 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:55:42 compute-0 nova_compute[186544]: 2025-11-22 07:55:42.657 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:43 compute-0 nova_compute[186544]: 2025-11-22 07:55:43.136 186548 DEBUG nova.network.neutron [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Successfully created port: 041d3fd2-9a77-48a1-b976-9dda05f01f7b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:55:44 compute-0 nova_compute[186544]: 2025-11-22 07:55:44.657 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:44 compute-0 nova_compute[186544]: 2025-11-22 07:55:44.657 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:45 compute-0 podman[224431]: 2025-11-22 07:55:45.392982199 +0000 UTC m=+0.045863241 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.414 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.570 186548 DEBUG nova.network.neutron [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Successfully updated port: 041d3fd2-9a77-48a1-b976-9dda05f01f7b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.591 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.592 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.592 186548 DEBUG nova.network.neutron [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.785 186548 DEBUG nova.compute.manager [req-17bdb0d2-2bab-4943-becc-96d13723af8c req-d3892f07-7c33-4396-8daa-81d48a4f0be6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-changed-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.786 186548 DEBUG nova.compute.manager [req-17bdb0d2-2bab-4943-becc-96d13723af8c req-d3892f07-7c33-4396-8daa-81d48a4f0be6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Refreshing instance network info cache due to event network-changed-041d3fd2-9a77-48a1-b976-9dda05f01f7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.786 186548 DEBUG oslo_concurrency.lockutils [req-17bdb0d2-2bab-4943-becc-96d13723af8c req-d3892f07-7c33-4396-8daa-81d48a4f0be6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:55:45 compute-0 nova_compute[186544]: 2025-11-22 07:55:45.962 186548 DEBUG nova.network.neutron [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:55:46 compute-0 nova_compute[186544]: 2025-11-22 07:55:46.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:46 compute-0 nova_compute[186544]: 2025-11-22 07:55:46.341 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.778 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "7d441089-3a23-460f-80f9-602e3500937b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.779 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.797 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "1da3a852-b732-48ac-886f-c57ced76a3d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.798 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.799 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.838 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.905 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.905 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.911 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.911 186548 INFO nova.compute.claims [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:55:47 compute-0 nova_compute[186544]: 2025-11-22 07:55:47.969 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.123 186548 DEBUG nova.compute.provider_tree [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.147 186548 DEBUG nova.scheduler.client.report [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.178 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.179 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.182 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.188 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.189 186548 INFO nova.compute.claims [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.278 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.278 186548 DEBUG nova.network.neutron [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.322 186548 DEBUG nova.network.neutron [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating instance_info_cache with network_info: [{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.387 186548 INFO nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:55:48 compute-0 podman[224456]: 2025-11-22 07:55:48.408953727 +0000 UTC m=+0.050043424 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.416 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.416 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance network_info: |[{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.417 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.419 186548 DEBUG oslo_concurrency.lockutils [req-17bdb0d2-2bab-4943-becc-96d13723af8c req-d3892f07-7c33-4396-8daa-81d48a4f0be6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.419 186548 DEBUG nova.network.neutron [req-17bdb0d2-2bab-4943-becc-96d13723af8c req-d3892f07-7c33-4396-8daa-81d48a4f0be6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Refreshing network info cache for port 041d3fd2-9a77-48a1-b976-9dda05f01f7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.421 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Start _get_guest_xml network_info=[{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.425 186548 WARNING nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.431 186548 DEBUG nova.virt.libvirt.host [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.432 186548 DEBUG nova.virt.libvirt.host [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.439 186548 DEBUG nova.virt.libvirt.host [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.440 186548 DEBUG nova.virt.libvirt.host [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.441 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.441 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.441 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.442 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.442 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.442 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.442 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.442 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.443 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.443 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.443 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.443 186548 DEBUG nova.virt.hardware [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.446 186548 DEBUG nova.virt.libvirt.vif [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663966303',display_name='tempest-ServerDiskConfigTestJSON-server-1663966303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663966303',id=70,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-9g9kcea9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:41Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=5eafb037-41a2-463f-9d3a-1b4248cb00f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.447 186548 DEBUG nova.network.os_vif_util [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.447 186548 DEBUG nova.network.os_vif_util [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.448 186548 DEBUG nova.objects.instance [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.475 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <uuid>5eafb037-41a2-463f-9d3a-1b4248cb00f2</uuid>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <name>instance-00000046</name>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1663966303</nova:name>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:55:48</nova:creationTime>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:55:48 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:55:48 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:55:48 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:55:48 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:55:48 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:55:48 compute-0 nova_compute[186544]:         <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 07:55:48 compute-0 nova_compute[186544]:         <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:55:48 compute-0 nova_compute[186544]:         <nova:port uuid="041d3fd2-9a77-48a1-b976-9dda05f01f7b">
Nov 22 07:55:48 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <system>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <entry name="serial">5eafb037-41a2-463f-9d3a-1b4248cb00f2</entry>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <entry name="uuid">5eafb037-41a2-463f-9d3a-1b4248cb00f2</entry>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </system>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <os>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   </os>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <features>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   </features>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:31:97:43"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <target dev="tap041d3fd2-9a"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/console.log" append="off"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <video>
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </video>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:55:48 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:55:48 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:55:48 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:55:48 compute-0 nova_compute[186544]: </domain>
Nov 22 07:55:48 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.476 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Preparing to wait for external event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.477 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.477 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.477 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.478 186548 DEBUG nova.virt.libvirt.vif [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663966303',display_name='tempest-ServerDiskConfigTestJSON-server-1663966303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663966303',id=70,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-9g9kcea9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:41Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=5eafb037-41a2-463f-9d3a-1b4248cb00f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.478 186548 DEBUG nova.network.os_vif_util [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.479 186548 DEBUG nova.network.os_vif_util [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.479 186548 DEBUG os_vif [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.480 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.480 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.480 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.485 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.485 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap041d3fd2-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.486 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap041d3fd2-9a, col_values=(('external_ids', {'iface-id': '041d3fd2-9a77-48a1-b976-9dda05f01f7b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:97:43', 'vm-uuid': '5eafb037-41a2-463f-9d3a-1b4248cb00f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.487 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:48 compute-0 NetworkManager[55036]: <info>  [1763798148.4886] manager: (tap041d3fd2-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.490 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.494 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.495 186548 INFO os_vif [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a')
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.555 186548 DEBUG nova.compute.provider_tree [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.561 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.562 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.562 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:31:97:43, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.562 186548 INFO nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Using config drive
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.614 186548 DEBUG nova.scheduler.client.report [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.632 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.633 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.634 186548 INFO nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Creating image(s)
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.634 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "/var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.635 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.635 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "/var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.635 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "8536cfb4a3f8d7a3046f2bf7b9787f58422494f1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.636 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8536cfb4a3f8d7a3046f2bf7b9787f58422494f1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.648 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.467s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.649 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.710 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.711 186548 DEBUG nova.network.neutron [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.740 186548 INFO nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.771 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.977 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.978 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.978 186548 INFO nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Creating image(s)
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.979 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "/var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.979 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "/var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.980 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "/var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:48 compute-0 nova_compute[186544]: 2025-11-22 07:55:48.993 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.051 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.052 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.053 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.065 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.130 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.131 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.181 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.182 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.183 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.240 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.241 186548 DEBUG nova.virt.disk.api [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Checking if we can resize image /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.242 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.290 186548 DEBUG nova.policy [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.301 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.301 186548 DEBUG nova.virt.disk.api [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Cannot resize image /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.302 186548 DEBUG nova.objects.instance [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'migration_context' on Instance uuid 1da3a852-b732-48ac-886f-c57ced76a3d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.316 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.317 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Ensure instance console log exists: /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.318 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.318 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.319 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:49 compute-0 nova_compute[186544]: 2025-11-22 07:55:49.344 186548 DEBUG nova.policy [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac2d2381d294c96aff369941185056a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.037 186548 INFO nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Creating config drive at /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.042 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsd2edrlm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.166 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.167 186548 DEBUG oslo_concurrency.processutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsd2edrlm" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:50 compute-0 kernel: tap041d3fd2-9a: entered promiscuous mode
Nov 22 07:55:50 compute-0 NetworkManager[55036]: <info>  [1763798150.2279] manager: (tap041d3fd2-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Nov 22 07:55:50 compute-0 ovn_controller[94843]: 2025-11-22T07:55:50Z|00275|binding|INFO|Claiming lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b for this chassis.
Nov 22 07:55:50 compute-0 ovn_controller[94843]: 2025-11-22T07:55:50Z|00276|binding|INFO|041d3fd2-9a77-48a1-b976-9dda05f01f7b: Claiming fa:16:3e:31:97:43 10.100.0.14
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.243 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:97:43 10.100.0.14'], port_security=['fa:16:3e:31:97:43 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5eafb037-41a2-463f-9d3a-1b4248cb00f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=041d3fd2-9a77-48a1-b976-9dda05f01f7b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:50 compute-0 ovn_controller[94843]: 2025-11-22T07:55:50Z|00277|binding|INFO|Setting lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b ovn-installed in OVS
Nov 22 07:55:50 compute-0 ovn_controller[94843]: 2025-11-22T07:55:50Z|00278|binding|INFO|Setting lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b up in Southbound
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.245 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 041d3fd2-9a77-48a1-b976-9dda05f01f7b in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.247 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:50 compute-0 systemd-udevd[224508]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.258 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5c06e4e0-58d6-498d-bcf2-d914630b5f4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.260 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.262 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.262 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e381f407-a9ec-4cbe-8a76-13c2f007b092]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.263 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0be1c1f8-8fbb-4009-85ff-ff5c898185d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 NetworkManager[55036]: <info>  [1763798150.2695] device (tap041d3fd2-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:55:50 compute-0 NetworkManager[55036]: <info>  [1763798150.2706] device (tap041d3fd2-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.276 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[60d2e7c9-609a-402a-be7a-9dd4d14509ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 systemd-machined[152872]: New machine qemu-34-instance-00000046.
Nov 22 07:55:50 compute-0 systemd[1]: Started Virtual Machine qemu-34-instance-00000046.
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.300 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[74f8014a-83a0-4a29-9ff5-ea1b62b51ab9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.311 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.328 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d0594b71-b63c-4a65-8707-4669049e836d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 NetworkManager[55036]: <info>  [1763798150.3356] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/131)
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.334 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e773f2-b378-47f9-9f29-0fbd06a4b04a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.364 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ebb9bd89-372d-4ca7-8317-4acdbeb72984]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.367 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[65182511-0171-433c-9e97-8da88fa19b21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 NetworkManager[55036]: <info>  [1763798150.3904] device (tapd54e232a-50): carrier: link connected
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.398 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc1c9bf-43c5-4faa-b074-50d59b226287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.415 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.415 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6210c2a1-420f-4fe1-8c6a-e81b5f9d7ee1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489102, 'reachable_time': 42638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224544, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.435 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0cca08-4954-4241-a2da-2b0496fcb078]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489102, 'tstamp': 489102}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224545, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.455 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[92874349-7faf-40db-a522-6751cc5684d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489102, 'reachable_time': 42638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224546, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.484 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[465a4865-a473-40e3-8c3d-0dddd4eedd74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.538 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[23cb27d5-9573-470d-9062-3d5bbd05ff84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.539 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.539 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.540 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:50 compute-0 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.541 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:50 compute-0 NetworkManager[55036]: <info>  [1763798150.5422] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.543 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.546 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.547 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:50 compute-0 ovn_controller[94843]: 2025-11-22T07:55:50Z|00279|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.551 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.552 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[676e70fb-df0b-4caf-ae34-9cc60d9d3141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.553 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:55:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:50.554 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.562 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.720 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.782 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.783 186548 DEBUG nova.virt.images [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] 953f8847-fb3b-4129-b572-ed7beb09fe07 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.784 186548 DEBUG nova.privsep.utils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.784 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1.part /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.917 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1.part /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1.converted" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.923 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:50 compute-0 podman[224587]: 2025-11-22 07:55:50.929628915 +0000 UTC m=+0.087835834 container create a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:55:50 compute-0 podman[224587]: 2025-11-22 07:55:50.864919381 +0000 UTC m=+0.023126320 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.960 186548 DEBUG nova.network.neutron [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Successfully created port: ab338966-6ece-477c-af70-298f992f0c8f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:55:50 compute-0 systemd[1]: Started libpod-conmon-a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec.scope.
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.983 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1.converted --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:50 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.984 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8536cfb4a3f8d7a3046f2bf7b9787f58422494f1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2deafbd7b2bf0a384883ce2fb9a6ef3ce78bfabd699745dc355b3855ff748a9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.998 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798150.998096, 5eafb037-41a2-463f-9d3a-1b4248cb00f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:50 compute-0 nova_compute[186544]: 2025-11-22 07:55:50.999 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] VM Started (Lifecycle Event)
Nov 22 07:55:51 compute-0 podman[224587]: 2025-11-22 07:55:51.000520792 +0000 UTC m=+0.158727711 container init a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.002 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:51 compute-0 podman[224587]: 2025-11-22 07:55:51.007375651 +0000 UTC m=+0.165582570 container start a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 07:55:51 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224609]: [NOTICE]   (224617) : New worker (224619) forked
Nov 22 07:55:51 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224609]: [NOTICE]   (224617) : Loading success.
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.032 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.036 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798150.998852, 5eafb037-41a2-463f-9d3a-1b4248cb00f2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.037 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] VM Paused (Lifecycle Event)
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.054 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.060 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.061 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "8536cfb4a3f8d7a3046f2bf7b9787f58422494f1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.061 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8536cfb4a3f8d7a3046f2bf7b9787f58422494f1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.074 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.093 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.128 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.129 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1,backing_fmt=raw /var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.147 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.158 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1,backing_fmt=raw /var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.159 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "8536cfb4a3f8d7a3046f2bf7b9787f58422494f1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.160 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.217 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.218 186548 DEBUG nova.objects.instance [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'migration_context' on Instance uuid 7d441089-3a23-460f-80f9-602e3500937b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.238 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.239 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Ensure instance console log exists: /var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.239 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.239 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.240 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.312 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798136.3116086, fde616bb-bfd9-477f-b2bf-6ce171da7c4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.312 186548 INFO nova.compute.manager [-] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] VM Stopped (Lifecycle Event)
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.334 186548 DEBUG nova.compute.manager [None req-cca14e08-136d-4d77-a983-66a5fd17fbe7 - - - - - -] [instance: fde616bb-bfd9-477f-b2bf-6ce171da7c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.426 186548 DEBUG nova.network.neutron [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Successfully created port: 030a5be9-2a7a-4bcf-a49f-000ccb253c8e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.463 186548 DEBUG nova.compute.manager [req-5ce33464-db09-4cf9-b0c9-09a3ecfb2859 req-1604b6ec-c961-4261-9364-424c796c4828 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.463 186548 DEBUG oslo_concurrency.lockutils [req-5ce33464-db09-4cf9-b0c9-09a3ecfb2859 req-1604b6ec-c961-4261-9364-424c796c4828 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.464 186548 DEBUG oslo_concurrency.lockutils [req-5ce33464-db09-4cf9-b0c9-09a3ecfb2859 req-1604b6ec-c961-4261-9364-424c796c4828 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.464 186548 DEBUG oslo_concurrency.lockutils [req-5ce33464-db09-4cf9-b0c9-09a3ecfb2859 req-1604b6ec-c961-4261-9364-424c796c4828 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.464 186548 DEBUG nova.compute.manager [req-5ce33464-db09-4cf9-b0c9-09a3ecfb2859 req-1604b6ec-c961-4261-9364-424c796c4828 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Processing event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.465 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.468 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798151.468148, 5eafb037-41a2-463f-9d3a-1b4248cb00f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.469 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] VM Resumed (Lifecycle Event)
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.470 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.474 186548 INFO nova.virt.libvirt.driver [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance spawned successfully.
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.474 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.493 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.499 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.503 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.503 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.504 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.504 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.505 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.505 186548 DEBUG nova.virt.libvirt.driver [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.540 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.584 186548 INFO nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Took 9.54 seconds to spawn the instance on the hypervisor.
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.585 186548 DEBUG nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.670 186548 INFO nova.compute.manager [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Took 10.19 seconds to build instance.
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.687 186548 DEBUG oslo_concurrency.lockutils [None req-8a374690-7dc2-4093-ba50-bf67484a80db e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.863 186548 DEBUG nova.network.neutron [req-17bdb0d2-2bab-4943-becc-96d13723af8c req-d3892f07-7c33-4396-8daa-81d48a4f0be6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updated VIF entry in instance network info cache for port 041d3fd2-9a77-48a1-b976-9dda05f01f7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.864 186548 DEBUG nova.network.neutron [req-17bdb0d2-2bab-4943-becc-96d13723af8c req-d3892f07-7c33-4396-8daa-81d48a4f0be6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating instance_info_cache with network_info: [{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:51 compute-0 nova_compute[186544]: 2025-11-22 07:55:51.910 186548 DEBUG oslo_concurrency.lockutils [req-17bdb0d2-2bab-4943-becc-96d13723af8c req-d3892f07-7c33-4396-8daa-81d48a4f0be6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.224 186548 DEBUG nova.network.neutron [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Successfully updated port: ab338966-6ece-477c-af70-298f992f0c8f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.274 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.274 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquired lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.275 186548 DEBUG nova.network.neutron [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.350 186548 DEBUG nova.compute.manager [req-9784fdcc-391a-468f-9957-052f6599a9e5 req-ffb99d70-ce6c-47d3-ac2d-e68c027883a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received event network-changed-ab338966-6ece-477c-af70-298f992f0c8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.350 186548 DEBUG nova.compute.manager [req-9784fdcc-391a-468f-9957-052f6599a9e5 req-ffb99d70-ce6c-47d3-ac2d-e68c027883a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Refreshing instance network info cache due to event network-changed-ab338966-6ece-477c-af70-298f992f0c8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.351 186548 DEBUG oslo_concurrency.lockutils [req-9784fdcc-391a-468f-9957-052f6599a9e5 req-ffb99d70-ce6c-47d3-ac2d-e68c027883a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.509 186548 DEBUG nova.network.neutron [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.652 186548 DEBUG nova.network.neutron [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Successfully updated port: 030a5be9-2a7a-4bcf-a49f-000ccb253c8e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.665 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "refresh_cache-7d441089-3a23-460f-80f9-602e3500937b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.665 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquired lock "refresh_cache-7d441089-3a23-460f-80f9-602e3500937b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:52 compute-0 nova_compute[186544]: 2025-11-22 07:55:52.666 186548 DEBUG nova.network.neutron [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.091 186548 DEBUG nova.network.neutron [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:55:53 compute-0 podman[224639]: 2025-11-22 07:55:53.448484998 +0000 UTC m=+0.095871582 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.470 186548 DEBUG nova.compute.manager [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received event network-changed-030a5be9-2a7a-4bcf-a49f-000ccb253c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.471 186548 DEBUG nova.compute.manager [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Refreshing instance network info cache due to event network-changed-030a5be9-2a7a-4bcf-a49f-000ccb253c8e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.471 186548 DEBUG oslo_concurrency.lockutils [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-7d441089-3a23-460f-80f9-602e3500937b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.489 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.549 186548 DEBUG nova.compute.manager [req-d3a0bb19-844d-482a-9b54-38f93d0558ef req-e2ee3062-1493-42cd-ac5a-7e41296f54f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.550 186548 DEBUG oslo_concurrency.lockutils [req-d3a0bb19-844d-482a-9b54-38f93d0558ef req-e2ee3062-1493-42cd-ac5a-7e41296f54f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.550 186548 DEBUG oslo_concurrency.lockutils [req-d3a0bb19-844d-482a-9b54-38f93d0558ef req-e2ee3062-1493-42cd-ac5a-7e41296f54f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.550 186548 DEBUG oslo_concurrency.lockutils [req-d3a0bb19-844d-482a-9b54-38f93d0558ef req-e2ee3062-1493-42cd-ac5a-7e41296f54f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.551 186548 DEBUG nova.compute.manager [req-d3a0bb19-844d-482a-9b54-38f93d0558ef req-e2ee3062-1493-42cd-ac5a-7e41296f54f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.551 186548 WARNING nova.compute.manager [req-d3a0bb19-844d-482a-9b54-38f93d0558ef req-e2ee3062-1493-42cd-ac5a-7e41296f54f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state active and task_state None.
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.582 186548 DEBUG nova.network.neutron [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updating instance_info_cache with network_info: [{"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.601 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Releasing lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.602 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Instance network_info: |[{"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.602 186548 DEBUG oslo_concurrency.lockutils [req-9784fdcc-391a-468f-9957-052f6599a9e5 req-ffb99d70-ce6c-47d3-ac2d-e68c027883a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.603 186548 DEBUG nova.network.neutron [req-9784fdcc-391a-468f-9957-052f6599a9e5 req-ffb99d70-ce6c-47d3-ac2d-e68c027883a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Refreshing network info cache for port ab338966-6ece-477c-af70-298f992f0c8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.605 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Start _get_guest_xml network_info=[{"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.611 186548 WARNING nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.618 186548 DEBUG nova.virt.libvirt.host [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.619 186548 DEBUG nova.virt.libvirt.host [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.622 186548 DEBUG nova.virt.libvirt.host [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.623 186548 DEBUG nova.virt.libvirt.host [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.624 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.624 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.624 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.625 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.625 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.625 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.625 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.626 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.626 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.626 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.626 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.626 186548 DEBUG nova.virt.hardware [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.629 186548 DEBUG nova.virt.libvirt.vif [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-824804646',display_name='tempest-ServerActionsTestOtherA-server-824804646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-824804646',id=72,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAZKpzQAL6ILshnC6rB6cFyDuv4pMips+ZeoETDONp6x88d0LohWezNpcKrkoSs9Vgu2Y9x9mhLipzvPvYb8Sku39yyvCNCWO3Bk20MRd537J3g4W+WKdsA5nca0tYn05g==',key_name='tempest-keypair-135608332',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-p4lnnnqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerActionsTestOtherA-1599563713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a5a623606e647c183360572aab20b70',uuid=1da3a852-b732-48ac-886f-c57ced76a3d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.630 186548 DEBUG nova.network.os_vif_util [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.630 186548 DEBUG nova.network.os_vif_util [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:15:d6,bridge_name='br-int',has_traffic_filtering=True,id=ab338966-6ece-477c-af70-298f992f0c8f,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab338966-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.631 186548 DEBUG nova.objects.instance [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1da3a852-b732-48ac-886f-c57ced76a3d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.644 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <uuid>1da3a852-b732-48ac-886f-c57ced76a3d5</uuid>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <name>instance-00000048</name>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestOtherA-server-824804646</nova:name>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:55:53</nova:creationTime>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:55:53 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:55:53 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:55:53 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:55:53 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:55:53 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:55:53 compute-0 nova_compute[186544]:         <nova:user uuid="5a5a623606e647c183360572aab20b70">tempest-ServerActionsTestOtherA-1599563713-project-member</nova:user>
Nov 22 07:55:53 compute-0 nova_compute[186544]:         <nova:project uuid="af3a536766704caaad94e5da2e3b88e2">tempest-ServerActionsTestOtherA-1599563713</nova:project>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:55:53 compute-0 nova_compute[186544]:         <nova:port uuid="ab338966-6ece-477c-af70-298f992f0c8f">
Nov 22 07:55:53 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <system>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <entry name="serial">1da3a852-b732-48ac-886f-c57ced76a3d5</entry>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <entry name="uuid">1da3a852-b732-48ac-886f-c57ced76a3d5</entry>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </system>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <os>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   </os>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <features>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   </features>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk.config"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:63:15:d6"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <target dev="tapab338966-6e"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/console.log" append="off"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <video>
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </video>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:55:53 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:55:53 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:55:53 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:55:53 compute-0 nova_compute[186544]: </domain>
Nov 22 07:55:53 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.645 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Preparing to wait for external event network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.645 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.646 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.646 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.646 186548 DEBUG nova.virt.libvirt.vif [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-824804646',display_name='tempest-ServerActionsTestOtherA-server-824804646',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-824804646',id=72,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAZKpzQAL6ILshnC6rB6cFyDuv4pMips+ZeoETDONp6x88d0LohWezNpcKrkoSs9Vgu2Y9x9mhLipzvPvYb8Sku39yyvCNCWO3Bk20MRd537J3g4W+WKdsA5nca0tYn05g==',key_name='tempest-keypair-135608332',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-p4lnnnqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerActionsTestOtherA-1599563713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a5a623606e647c183360572aab20b70',uuid=1da3a852-b732-48ac-886f-c57ced76a3d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.647 186548 DEBUG nova.network.os_vif_util [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.647 186548 DEBUG nova.network.os_vif_util [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:15:d6,bridge_name='br-int',has_traffic_filtering=True,id=ab338966-6ece-477c-af70-298f992f0c8f,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab338966-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.648 186548 DEBUG os_vif [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:15:d6,bridge_name='br-int',has_traffic_filtering=True,id=ab338966-6ece-477c-af70-298f992f0c8f,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab338966-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.649 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.649 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.649 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.652 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.652 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab338966-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.653 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab338966-6e, col_values=(('external_ids', {'iface-id': 'ab338966-6ece-477c-af70-298f992f0c8f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:15:d6', 'vm-uuid': '1da3a852-b732-48ac-886f-c57ced76a3d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.654 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:53 compute-0 NetworkManager[55036]: <info>  [1763798153.6568] manager: (tapab338966-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.657 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.665 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.666 186548 INFO os_vif [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:15:d6,bridge_name='br-int',has_traffic_filtering=True,id=ab338966-6ece-477c-af70-298f992f0c8f,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab338966-6e')
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.725 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.725 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.725 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No VIF found with MAC fa:16:3e:63:15:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:55:53 compute-0 nova_compute[186544]: 2025-11-22 07:55:53.726 186548 INFO nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Using config drive
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.157 186548 INFO nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Creating config drive at /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk.config
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.164 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm6x3zmoy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.289 186548 DEBUG oslo_concurrency.processutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm6x3zmoy" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.308 186548 DEBUG nova.network.neutron [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Updating instance_info_cache with network_info: [{"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:54 compute-0 kernel: tapab338966-6e: entered promiscuous mode
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.341 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Releasing lock "refresh_cache-7d441089-3a23-460f-80f9-602e3500937b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.341 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Instance network_info: |[{"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.342 186548 DEBUG oslo_concurrency.lockutils [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-7d441089-3a23-460f-80f9-602e3500937b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.342 186548 DEBUG nova.network.neutron [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Refreshing network info cache for port 030a5be9-2a7a-4bcf-a49f-000ccb253c8e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.346 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Start _get_guest_xml network_info=[{"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-11-22T07:55:33Z,direct_url=<?>,disk_format='qcow2',id=953f8847-fb3b-4129-b572-ed7beb09fe07,min_disk=1,min_ram=0,name='tempest-test-snap-1016324395',owner='7ec4007dc8214caab4e2eb40f11fb3cd',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-11-22T07:55:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': '953f8847-fb3b-4129-b572-ed7beb09fe07'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:55:54 compute-0 NetworkManager[55036]: <info>  [1763798154.3473] manager: (tapab338966-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/134)
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.347 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 ovn_controller[94843]: 2025-11-22T07:55:54Z|00280|binding|INFO|Claiming lport ab338966-6ece-477c-af70-298f992f0c8f for this chassis.
Nov 22 07:55:54 compute-0 ovn_controller[94843]: 2025-11-22T07:55:54Z|00281|binding|INFO|ab338966-6ece-477c-af70-298f992f0c8f: Claiming fa:16:3e:63:15:d6 10.100.0.3
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.350 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.364 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:15:d6 10.100.0.3'], port_security=['fa:16:3e:63:15:d6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af3a536766704caaad94e5da2e3b88e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '55e3254c-8a80-4bbd-950d-1b4f6cb50116', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cd2a902-e9cb-4e2e-893e-0a2e3b043ce7, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=ab338966-6ece-477c-af70-298f992f0c8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.365 103805 INFO neutron.agent.ovn.metadata.agent [-] Port ab338966-6ece-477c-af70-298f992f0c8f in datapath a2b438ab-8fa8-4627-8c04-99bed701c19e bound to our chassis
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.366 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.377 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ef816ce5-b123-4279-af78-5284635dc87b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.380 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2b438ab-81 in ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.382 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2b438ab-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.382 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a4191be9-8db5-4f4b-8b5d-a72c81cf9baa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 systemd-udevd[224678]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.389 186548 WARNING nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.387 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[68b78ebb-dfb1-4db2-851f-b740f6e3d4c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 NetworkManager[55036]: <info>  [1763798154.4029] device (tapab338966-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:55:54 compute-0 NetworkManager[55036]: <info>  [1763798154.4037] device (tapab338966-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.408 186548 DEBUG nova.virt.libvirt.host [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.409 186548 DEBUG nova.virt.libvirt.host [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:55:54 compute-0 ovn_controller[94843]: 2025-11-22T07:55:54Z|00282|binding|INFO|Setting lport ab338966-6ece-477c-af70-298f992f0c8f ovn-installed in OVS
Nov 22 07:55:54 compute-0 ovn_controller[94843]: 2025-11-22T07:55:54Z|00283|binding|INFO|Setting lport ab338966-6ece-477c-af70-298f992f0c8f up in Southbound
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.412 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[282ca61b-759c-4e91-a47d-31f37323e4a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 systemd-machined[152872]: New machine qemu-35-instance-00000048.
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.417 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.419 186548 DEBUG nova.virt.libvirt.host [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.419 186548 DEBUG nova.virt.libvirt.host [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.421 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.421 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-11-22T07:55:33Z,direct_url=<?>,disk_format='qcow2',id=953f8847-fb3b-4129-b572-ed7beb09fe07,min_disk=1,min_ram=0,name='tempest-test-snap-1016324395',owner='7ec4007dc8214caab4e2eb40f11fb3cd',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-11-22T07:55:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.421 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.422 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.422 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.422 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.422 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.422 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:55:54 compute-0 systemd[1]: Started Virtual Machine qemu-35-instance-00000048.
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.422 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.423 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.423 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.423 186548 DEBUG nova.virt.hardware [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.427 186548 DEBUG nova.virt.libvirt.vif [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-965620588',display_name='tempest-ImagesTestJSON-server-965620588',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-965620588',id=71,image_ref='953f8847-fb3b-4129-b572-ed7beb09fe07',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-5bqas6qx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='dd6876d4-cab4-413b-9d67-10e2ba45a220',image_min_disk='1',image_min_ram='0',image_owner_id='7ec4007dc8214caab4e2eb40f11fb3cd',image_owner_project_name='tempest-ImagesTestJSON-117614339',image_owner_user_name='tempest-ImagesTestJSON-117614339-project-member',image_user_id='1ac2d2381d294c96aff369941185056a',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:48Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=7d441089-3a23-460f-80f9-602e3500937b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.427 186548 DEBUG nova.network.os_vif_util [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.428 186548 DEBUG nova.network.os_vif_util [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:6d:35,bridge_name='br-int',has_traffic_filtering=True,id=030a5be9-2a7a-4bcf-a49f-000ccb253c8e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap030a5be9-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.429 186548 DEBUG nova.objects.instance [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'pci_devices' on Instance uuid 7d441089-3a23-460f-80f9-602e3500937b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.441 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[167dd128-ac05-4e76-ac38-664ef2d34a9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.446 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <uuid>7d441089-3a23-460f-80f9-602e3500937b</uuid>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <name>instance-00000047</name>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <nova:name>tempest-ImagesTestJSON-server-965620588</nova:name>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:55:54</nova:creationTime>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:55:54 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:55:54 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:55:54 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:55:54 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:55:54 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:55:54 compute-0 nova_compute[186544]:         <nova:user uuid="1ac2d2381d294c96aff369941185056a">tempest-ImagesTestJSON-117614339-project-member</nova:user>
Nov 22 07:55:54 compute-0 nova_compute[186544]:         <nova:project uuid="7ec4007dc8214caab4e2eb40f11fb3cd">tempest-ImagesTestJSON-117614339</nova:project>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="953f8847-fb3b-4129-b572-ed7beb09fe07"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:55:54 compute-0 nova_compute[186544]:         <nova:port uuid="030a5be9-2a7a-4bcf-a49f-000ccb253c8e">
Nov 22 07:55:54 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <system>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <entry name="serial">7d441089-3a23-460f-80f9-602e3500937b</entry>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <entry name="uuid">7d441089-3a23-460f-80f9-602e3500937b</entry>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </system>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <os>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   </os>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <features>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   </features>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk.config"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:35:6d:35"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <target dev="tap030a5be9-2a"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/console.log" append="off"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <video>
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </video>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:55:54 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:55:54 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:55:54 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:55:54 compute-0 nova_compute[186544]: </domain>
Nov 22 07:55:54 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.447 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Preparing to wait for external event network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.447 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "7d441089-3a23-460f-80f9-602e3500937b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.447 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.447 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.448 186548 DEBUG nova.virt.libvirt.vif [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:55:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-965620588',display_name='tempest-ImagesTestJSON-server-965620588',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-965620588',id=71,image_ref='953f8847-fb3b-4129-b572-ed7beb09fe07',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-5bqas6qx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='dd6876d4-cab4-413b-9d67-10e2ba45a220',image_min_disk='1',image_min_ram='0',image_owner_id='7ec4007dc8214caab4e2eb40f11fb3cd',image_owner_project_name='tempest-ImagesTestJSON-117614339',image_owner_user_name='tempest-ImagesTestJSON-117614339-project-member',image_user_id='1ac2d2381d294c96aff369941185056a',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:55:48Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=7d441089-3a23-460f-80f9-602e3500937b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.448 186548 DEBUG nova.network.os_vif_util [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.449 186548 DEBUG nova.network.os_vif_util [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:6d:35,bridge_name='br-int',has_traffic_filtering=True,id=030a5be9-2a7a-4bcf-a49f-000ccb253c8e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap030a5be9-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.449 186548 DEBUG os_vif [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:6d:35,bridge_name='br-int',has_traffic_filtering=True,id=030a5be9-2a7a-4bcf-a49f-000ccb253c8e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap030a5be9-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.449 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.450 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.450 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.453 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.453 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap030a5be9-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.454 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap030a5be9-2a, col_values=(('external_ids', {'iface-id': '030a5be9-2a7a-4bcf-a49f-000ccb253c8e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:6d:35', 'vm-uuid': '7d441089-3a23-460f-80f9-602e3500937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.455 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 NetworkManager[55036]: <info>  [1763798154.4558] manager: (tap030a5be9-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.459 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.463 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.465 186548 INFO os_vif [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:6d:35,bridge_name='br-int',has_traffic_filtering=True,id=030a5be9-2a7a-4bcf-a49f-000ccb253c8e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap030a5be9-2a')
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.483 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3dc39f-6651-4771-8bd0-ba8993fac279]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 systemd-udevd[224684]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:55:54 compute-0 NetworkManager[55036]: <info>  [1763798154.4919] manager: (tapa2b438ab-80): new Veth device (/org/freedesktop/NetworkManager/Devices/136)
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.491 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[91ce1f92-bc86-4ca3-b24f-d470f51a0fda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.530 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.530 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.531 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] No VIF found with MAC fa:16:3e:35:6d:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.532 186548 INFO nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Using config drive
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.532 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bb7910-c68f-41fb-94b8-199a9aa3fb6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.535 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6d1f751f-5f52-4713-ace9-1cf3fc4d173e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 NetworkManager[55036]: <info>  [1763798154.5602] device (tapa2b438ab-80): carrier: link connected
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.566 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4e945cee-a9de-487e-9099-df8ea3dfd8d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.581 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9d06fb2f-aeb9-408f-9726-d5be6c0c6c9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b438ab-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f4:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489519, 'reachable_time': 35697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224717, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.594 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2966c5bf-2626-4616-88b5-feeb60015bbc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:f49d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489519, 'tstamp': 489519}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224718, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.610 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5036ec8c-a92f-477b-8e3d-d3c86de2d7e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b438ab-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f4:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489519, 'reachable_time': 35697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224719, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.641 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0cadcd6f-a6c4-41d3-81af-a87c7b661c80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.714 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[db559d74-d963-4f0b-92c7-5a65f5f04879]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.716 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b438ab-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.716 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.717 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2b438ab-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.720 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 NetworkManager[55036]: <info>  [1763798154.7221] manager: (tapa2b438ab-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Nov 22 07:55:54 compute-0 kernel: tapa2b438ab-80: entered promiscuous mode
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.726 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.727 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2b438ab-80, col_values=(('external_ids', {'iface-id': '1f7bc015-fb2f-41a5-82bb-16526b7a95f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:54 compute-0 ovn_controller[94843]: 2025-11-22T07:55:54Z|00284|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.733 186548 DEBUG nova.compute.manager [req-fd822a2b-5096-456e-b829-53a9c22971b1 req-6cc3c892-63a2-4373-9464-dbdb8036aa2a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received event network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.733 186548 DEBUG oslo_concurrency.lockutils [req-fd822a2b-5096-456e-b829-53a9c22971b1 req-6cc3c892-63a2-4373-9464-dbdb8036aa2a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.733 186548 DEBUG oslo_concurrency.lockutils [req-fd822a2b-5096-456e-b829-53a9c22971b1 req-6cc3c892-63a2-4373-9464-dbdb8036aa2a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.734 186548 DEBUG oslo_concurrency.lockutils [req-fd822a2b-5096-456e-b829-53a9c22971b1 req-6cc3c892-63a2-4373-9464-dbdb8036aa2a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.734 186548 DEBUG nova.compute.manager [req-fd822a2b-5096-456e-b829-53a9c22971b1 req-6cc3c892-63a2-4373-9464-dbdb8036aa2a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Processing event network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.734 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.742 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.746 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.747 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.749 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8c310f-2b80-4a23-9f0e-c2b42bb63bd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.750 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/a2b438ab-8fa8-4627-8c04-99bed701c19e.pid.haproxy
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:55:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:54.752 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'env', 'PROCESS_TAG=haproxy-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2b438ab-8fa8-4627-8c04-99bed701c19e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.762 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798154.7618604, 1da3a852-b732-48ac-886f-c57ced76a3d5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.762 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] VM Started (Lifecycle Event)
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.764 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.767 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.776 186548 INFO nova.virt.libvirt.driver [-] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Instance spawned successfully.
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.777 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.779 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.782 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.796 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.796 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.797 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.797 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.797 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.798 186548 DEBUG nova.virt.libvirt.driver [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.819 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.819 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798154.7641447, 1da3a852-b732-48ac-886f-c57ced76a3d5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.819 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] VM Paused (Lifecycle Event)
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.853 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.860 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798154.7668843, 1da3a852-b732-48ac-886f-c57ced76a3d5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.861 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] VM Resumed (Lifecycle Event)
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.886 186548 INFO nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Took 5.91 seconds to spawn the instance on the hypervisor.
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.887 186548 DEBUG nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.897 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.899 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:54 compute-0 nova_compute[186544]: 2025-11-22 07:55:54.936 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.042 186548 INFO nova.compute.manager [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Took 7.11 seconds to build instance.
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.096 186548 DEBUG oslo_concurrency.lockutils [None req-d68b3c5d-1757-4a18-8405-b26ff22bbec4 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.130 186548 INFO nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Creating config drive at /var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk.config
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.134 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyqpaijli execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:55 compute-0 podman[224763]: 2025-11-22 07:55:55.144375028 +0000 UTC m=+0.056510544 container create 9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.150 186548 DEBUG nova.network.neutron [req-9784fdcc-391a-468f-9957-052f6599a9e5 req-ffb99d70-ce6c-47d3-ac2d-e68c027883a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updated VIF entry in instance network info cache for port ab338966-6ece-477c-af70-298f992f0c8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.151 186548 DEBUG nova.network.neutron [req-9784fdcc-391a-468f-9957-052f6599a9e5 req-ffb99d70-ce6c-47d3-ac2d-e68c027883a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updating instance_info_cache with network_info: [{"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.168 186548 DEBUG oslo_concurrency.lockutils [req-9784fdcc-391a-468f-9957-052f6599a9e5 req-ffb99d70-ce6c-47d3-ac2d-e68c027883a3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:55 compute-0 systemd[1]: Started libpod-conmon-9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663.scope.
Nov 22 07:55:55 compute-0 podman[224763]: 2025-11-22 07:55:55.113062706 +0000 UTC m=+0.025198242 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:55:55 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:55:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3b6474a9e1ac5da797529878cfaedf082acb57f3328a0fe3a0a8a2f6d596f7d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:55:55 compute-0 podman[224763]: 2025-11-22 07:55:55.236860936 +0000 UTC m=+0.148996472 container init 9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 07:55:55 compute-0 podman[224763]: 2025-11-22 07:55:55.243247823 +0000 UTC m=+0.155383339 container start 9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.257 186548 DEBUG oslo_concurrency.processutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyqpaijli" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:55 compute-0 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224781]: [NOTICE]   (224785) : New worker (224788) forked
Nov 22 07:55:55 compute-0 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224781]: [NOTICE]   (224785) : Loading success.
Nov 22 07:55:55 compute-0 kernel: tap030a5be9-2a: entered promiscuous mode
Nov 22 07:55:55 compute-0 NetworkManager[55036]: <info>  [1763798155.3307] manager: (tap030a5be9-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/138)
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.332 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:55 compute-0 systemd-udevd[224707]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:55:55 compute-0 ovn_controller[94843]: 2025-11-22T07:55:55Z|00285|binding|INFO|Claiming lport 030a5be9-2a7a-4bcf-a49f-000ccb253c8e for this chassis.
Nov 22 07:55:55 compute-0 ovn_controller[94843]: 2025-11-22T07:55:55Z|00286|binding|INFO|030a5be9-2a7a-4bcf-a49f-000ccb253c8e: Claiming fa:16:3e:35:6d:35 10.100.0.14
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.337 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:55 compute-0 NetworkManager[55036]: <info>  [1763798155.3474] device (tap030a5be9-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:55:55 compute-0 NetworkManager[55036]: <info>  [1763798155.3482] device (tap030a5be9-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.353 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:6d:35 10.100.0.14'], port_security=['fa:16:3e:35:6d:35 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7d441089-3a23-460f-80f9-602e3500937b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=030a5be9-2a7a-4bcf-a49f-000ccb253c8e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.357 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 030a5be9-2a7a-4bcf-a49f-000ccb253c8e in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a bound to our chassis
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.359 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.369 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[38693531-d043-4418-8f89-19102d9191b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.371 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdc6b9ee8-e1 in ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:55:55 compute-0 systemd-machined[152872]: New machine qemu-36-instance-00000047.
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.375 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdc6b9ee8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.375 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f6b8b8-0294-46d3-b005-610cf5e8045b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.380 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[426e34fb-81f5-4f7e-b4ef-39d832671c16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.392 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[51b746a3-f952-40d9-b93c-650559d8e0e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_controller[94843]: 2025-11-22T07:55:55Z|00287|binding|INFO|Setting lport 030a5be9-2a7a-4bcf-a49f-000ccb253c8e ovn-installed in OVS
Nov 22 07:55:55 compute-0 systemd[1]: Started Virtual Machine qemu-36-instance-00000047.
Nov 22 07:55:55 compute-0 ovn_controller[94843]: 2025-11-22T07:55:55Z|00288|binding|INFO|Setting lport 030a5be9-2a7a-4bcf-a49f-000ccb253c8e up in Southbound
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.398 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.406 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e79fffb1-5649-4a3f-a991-5b2e86d35bcd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.418 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.437 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a76b2acf-1fc2-4adf-9709-ac6b52eece0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 NetworkManager[55036]: <info>  [1763798155.4496] manager: (tapdc6b9ee8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/139)
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.449 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[861d8f0e-eeeb-471b-aecb-a7b57218ac03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.482 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[49e7a5ab-3d1e-4611-a6ad-423d7cb670ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.487 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[3d592d79-5929-4b95-99dd-eb0a666498e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 NetworkManager[55036]: <info>  [1763798155.5117] device (tapdc6b9ee8-e0): carrier: link connected
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.524 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b531da87-39a0-458a-874c-58427a552eb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.544 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c0779f8a-b619-49cf-a6e0-4fbb20a61a76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489614, 'reachable_time': 21663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224828, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.561 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[74e322c9-56fe-4456-8a71-343481a81c95]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecc:d89c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489614, 'tstamp': 489614}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224829, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.589 186548 DEBUG nova.compute.manager [req-006ed0f0-3cef-4c5f-8e75-009ce0519156 req-5c6b44bd-29f6-46f1-9b52-b8dc8a181bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received event network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.589 186548 DEBUG oslo_concurrency.lockutils [req-006ed0f0-3cef-4c5f-8e75-009ce0519156 req-5c6b44bd-29f6-46f1-9b52-b8dc8a181bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7d441089-3a23-460f-80f9-602e3500937b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.590 186548 DEBUG oslo_concurrency.lockutils [req-006ed0f0-3cef-4c5f-8e75-009ce0519156 req-5c6b44bd-29f6-46f1-9b52-b8dc8a181bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.591 186548 DEBUG oslo_concurrency.lockutils [req-006ed0f0-3cef-4c5f-8e75-009ce0519156 req-5c6b44bd-29f6-46f1-9b52-b8dc8a181bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.591 186548 DEBUG nova.compute.manager [req-006ed0f0-3cef-4c5f-8e75-009ce0519156 req-5c6b44bd-29f6-46f1-9b52-b8dc8a181bfd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Processing event network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.586 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[70de1666-f413-4f05-a11e-d6931969b9cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdc6b9ee8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cc:d8:9c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489614, 'reachable_time': 21663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224830, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.622 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b4760d21-d51a-41ba-9d85-1801b70871df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.689 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[62eaa674-641f-469c-91e3-e3a47f18e998]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.691 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.691 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.692 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc6b9ee8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:55 compute-0 NetworkManager[55036]: <info>  [1763798155.6946] manager: (tapdc6b9ee8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Nov 22 07:55:55 compute-0 kernel: tapdc6b9ee8-e0: entered promiscuous mode
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.694 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.700 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdc6b9ee8-e0, col_values=(('external_ids', {'iface-id': '99cae854-daa9-4d08-8152-257a15e21bf8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.701 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:55 compute-0 ovn_controller[94843]: 2025-11-22T07:55:55Z|00289|binding|INFO|Releasing lport 99cae854-daa9-4d08-8152-257a15e21bf8 from this chassis (sb_readonly=0)
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.703 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.705 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.706 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa32f5b-7f10-47ad-b545-1830d03af82f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.707 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.pid.haproxy
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:55:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:55.708 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'env', 'PROCESS_TAG=haproxy-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:55:55 compute-0 nova_compute[186544]: 2025-11-22 07:55:55.715 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:56 compute-0 podman[224867]: 2025-11-22 07:55:56.11584381 +0000 UTC m=+0.064389107 container create 5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.120 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798156.120224, 7d441089-3a23-460f-80f9-602e3500937b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.121 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] VM Started (Lifecycle Event)
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.128 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.132 186548 DEBUG nova.virt.libvirt.driver [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.138 186548 INFO nova.virt.libvirt.driver [-] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Instance spawned successfully.
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.138 186548 INFO nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Took 7.51 seconds to spawn the instance on the hypervisor.
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.139 186548 DEBUG nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.147 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.151 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.171 186548 DEBUG nova.network.neutron [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Updated VIF entry in instance network info cache for port 030a5be9-2a7a-4bcf-a49f-000ccb253c8e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.171 186548 DEBUG nova.network.neutron [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Updating instance_info_cache with network_info: [{"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:56 compute-0 podman[224867]: 2025-11-22 07:55:56.076244465 +0000 UTC m=+0.024789772 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:55:56 compute-0 systemd[1]: Started libpod-conmon-5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c.scope.
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.196 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.197 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798156.1203673, 7d441089-3a23-460f-80f9-602e3500937b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.197 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] VM Paused (Lifecycle Event)
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.199 186548 DEBUG oslo_concurrency.lockutils [req-41a2b70a-4cf7-4b19-9256-7319a9ef489b req-aa1d824a-d4f5-433a-be3e-d673c25b005a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-7d441089-3a23-460f-80f9-602e3500937b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:56 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:55:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cfa74fea33e8f6445049c4b73278c2a8ace1bc9ef55eed756dfdd03ad483bcb9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.224 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:56 compute-0 podman[224867]: 2025-11-22 07:55:56.225671646 +0000 UTC m=+0.174216953 container init 5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.230 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798156.1323144, 7d441089-3a23-460f-80f9-602e3500937b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.230 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] VM Resumed (Lifecycle Event)
Nov 22 07:55:56 compute-0 podman[224867]: 2025-11-22 07:55:56.233023787 +0000 UTC m=+0.181569084 container start 5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 07:55:56 compute-0 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[224883]: [NOTICE]   (224889) : New worker (224891) forked
Nov 22 07:55:56 compute-0 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[224883]: [NOTICE]   (224889) : Loading success.
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.276 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.277 186548 INFO nova.compute.manager [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Took 8.40 seconds to build instance.
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.280 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.312 186548 DEBUG oslo_concurrency.lockutils [None req-311c3131-596e-405b-877b-a61efb4c30f8 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.858 186548 DEBUG nova.compute.manager [req-4f6666eb-85c6-4477-9fbe-5a1f3e78abac req-5df8a3cb-c413-4718-9b12-b5ce92895890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received event network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.858 186548 DEBUG oslo_concurrency.lockutils [req-4f6666eb-85c6-4477-9fbe-5a1f3e78abac req-5df8a3cb-c413-4718-9b12-b5ce92895890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.859 186548 DEBUG oslo_concurrency.lockutils [req-4f6666eb-85c6-4477-9fbe-5a1f3e78abac req-5df8a3cb-c413-4718-9b12-b5ce92895890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.859 186548 DEBUG oslo_concurrency.lockutils [req-4f6666eb-85c6-4477-9fbe-5a1f3e78abac req-5df8a3cb-c413-4718-9b12-b5ce92895890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.859 186548 DEBUG nova.compute.manager [req-4f6666eb-85c6-4477-9fbe-5a1f3e78abac req-5df8a3cb-c413-4718-9b12-b5ce92895890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] No waiting events found dispatching network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:56 compute-0 nova_compute[186544]: 2025-11-22 07:55:56.859 186548 WARNING nova.compute.manager [req-4f6666eb-85c6-4477-9fbe-5a1f3e78abac req-5df8a3cb-c413-4718-9b12-b5ce92895890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received unexpected event network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f for instance with vm_state active and task_state None.
Nov 22 07:55:57 compute-0 NetworkManager[55036]: <info>  [1763798157.1838] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Nov 22 07:55:57 compute-0 NetworkManager[55036]: <info>  [1763798157.1845] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Nov 22 07:55:57 compute-0 ovn_controller[94843]: 2025-11-22T07:55:57Z|00290|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 07:55:57 compute-0 ovn_controller[94843]: 2025-11-22T07:55:57Z|00291|binding|INFO|Releasing lport 99cae854-daa9-4d08-8152-257a15e21bf8 from this chassis (sb_readonly=0)
Nov 22 07:55:57 compute-0 ovn_controller[94843]: 2025-11-22T07:55:57Z|00292|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.194 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.217 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:57 compute-0 ovn_controller[94843]: 2025-11-22T07:55:57Z|00293|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 07:55:57 compute-0 ovn_controller[94843]: 2025-11-22T07:55:57Z|00294|binding|INFO|Releasing lport 99cae854-daa9-4d08-8152-257a15e21bf8 from this chassis (sb_readonly=0)
Nov 22 07:55:57 compute-0 ovn_controller[94843]: 2025-11-22T07:55:57Z|00295|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.230 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:57 compute-0 podman[224901]: 2025-11-22 07:55:57.390520293 +0000 UTC m=+0.043727028 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.534 186548 DEBUG nova.compute.manager [req-f1bfe954-9798-4518-9379-e0c0032d3999 req-abeb3aac-3017-4d1c-a700-f5c4e353399d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received event network-changed-ab338966-6ece-477c-af70-298f992f0c8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.534 186548 DEBUG nova.compute.manager [req-f1bfe954-9798-4518-9379-e0c0032d3999 req-abeb3aac-3017-4d1c-a700-f5c4e353399d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Refreshing instance network info cache due to event network-changed-ab338966-6ece-477c-af70-298f992f0c8f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.534 186548 DEBUG oslo_concurrency.lockutils [req-f1bfe954-9798-4518-9379-e0c0032d3999 req-abeb3aac-3017-4d1c-a700-f5c4e353399d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.535 186548 DEBUG oslo_concurrency.lockutils [req-f1bfe954-9798-4518-9379-e0c0032d3999 req-abeb3aac-3017-4d1c-a700-f5c4e353399d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.535 186548 DEBUG nova.network.neutron [req-f1bfe954-9798-4518-9379-e0c0032d3999 req-abeb3aac-3017-4d1c-a700-f5c4e353399d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Refreshing network info cache for port ab338966-6ece-477c-af70-298f992f0c8f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.691 186548 DEBUG nova.compute.manager [req-4c21227b-0ee7-424b-b5ad-65fd281e541b req-df20ff22-ef96-4d23-b369-8db1503bc91f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received event network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.691 186548 DEBUG oslo_concurrency.lockutils [req-4c21227b-0ee7-424b-b5ad-65fd281e541b req-df20ff22-ef96-4d23-b369-8db1503bc91f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7d441089-3a23-460f-80f9-602e3500937b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.692 186548 DEBUG oslo_concurrency.lockutils [req-4c21227b-0ee7-424b-b5ad-65fd281e541b req-df20ff22-ef96-4d23-b369-8db1503bc91f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.692 186548 DEBUG oslo_concurrency.lockutils [req-4c21227b-0ee7-424b-b5ad-65fd281e541b req-df20ff22-ef96-4d23-b369-8db1503bc91f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.693 186548 DEBUG nova.compute.manager [req-4c21227b-0ee7-424b-b5ad-65fd281e541b req-df20ff22-ef96-4d23-b369-8db1503bc91f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] No waiting events found dispatching network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.693 186548 WARNING nova.compute.manager [req-4c21227b-0ee7-424b-b5ad-65fd281e541b req-df20ff22-ef96-4d23-b369-8db1503bc91f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received unexpected event network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e for instance with vm_state active and task_state None.
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.725 186548 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.726 186548 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:55:57 compute-0 nova_compute[186544]: 2025-11-22 07:55:57.726 186548 DEBUG nova.network.neutron [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.812 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "7d441089-3a23-460f-80f9-602e3500937b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.813 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.813 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "7d441089-3a23-460f-80f9-602e3500937b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.813 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.813 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.821 186548 INFO nova.compute.manager [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Terminating instance
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.827 186548 DEBUG nova.compute.manager [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:55:58 compute-0 kernel: tap030a5be9-2a (unregistering): left promiscuous mode
Nov 22 07:55:58 compute-0 NetworkManager[55036]: <info>  [1763798158.8494] device (tap030a5be9-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:55:58 compute-0 ovn_controller[94843]: 2025-11-22T07:55:58Z|00296|binding|INFO|Releasing lport 030a5be9-2a7a-4bcf-a49f-000ccb253c8e from this chassis (sb_readonly=0)
Nov 22 07:55:58 compute-0 ovn_controller[94843]: 2025-11-22T07:55:58Z|00297|binding|INFO|Setting lport 030a5be9-2a7a-4bcf-a49f-000ccb253c8e down in Southbound
Nov 22 07:55:58 compute-0 ovn_controller[94843]: 2025-11-22T07:55:58Z|00298|binding|INFO|Removing iface tap030a5be9-2a ovn-installed in OVS
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.869 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:58 compute-0 nova_compute[186544]: 2025-11-22 07:55:58.879 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:58.884 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:6d:35 10.100.0.14'], port_security=['fa:16:3e:35:6d:35 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7d441089-3a23-460f-80f9-602e3500937b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec4007dc8214caab4e2eb40f11fb3cd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b5409ad-68b6-4279-a5b6-4f93a6b83cf7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b3e65854-82c8-492a-b0f0-e6e843e59756, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=030a5be9-2a7a-4bcf-a49f-000ccb253c8e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:55:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:58.886 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 030a5be9-2a7a-4bcf-a49f-000ccb253c8e in datapath dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a unbound from our chassis
Nov 22 07:55:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:58.887 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:55:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:58.889 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[28234d9f-b1ec-4209-a6c1-4ea047576322]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:58.890 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a namespace which is not needed anymore
Nov 22 07:55:58 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000047.scope: Deactivated successfully.
Nov 22 07:55:58 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000047.scope: Consumed 3.499s CPU time.
Nov 22 07:55:58 compute-0 systemd-machined[152872]: Machine qemu-36-instance-00000047 terminated.
Nov 22 07:55:59 compute-0 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[224883]: [NOTICE]   (224889) : haproxy version is 2.8.14-c23fe91
Nov 22 07:55:59 compute-0 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[224883]: [NOTICE]   (224889) : path to executable is /usr/sbin/haproxy
Nov 22 07:55:59 compute-0 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[224883]: [WARNING]  (224889) : Exiting Master process...
Nov 22 07:55:59 compute-0 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[224883]: [WARNING]  (224889) : Exiting Master process...
Nov 22 07:55:59 compute-0 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[224883]: [ALERT]    (224889) : Current worker (224891) exited with code 143 (Terminated)
Nov 22 07:55:59 compute-0 neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a[224883]: [WARNING]  (224889) : All workers exited. Exiting... (0)
Nov 22 07:55:59 compute-0 systemd[1]: libpod-5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c.scope: Deactivated successfully.
Nov 22 07:55:59 compute-0 podman[224948]: 2025-11-22 07:55:59.026184608 +0000 UTC m=+0.046027675 container died 5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.046 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c-userdata-shm.mount: Deactivated successfully.
Nov 22 07:55:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-cfa74fea33e8f6445049c4b73278c2a8ace1bc9ef55eed756dfdd03ad483bcb9-merged.mount: Deactivated successfully.
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.060 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:59 compute-0 podman[224948]: 2025-11-22 07:55:59.07419922 +0000 UTC m=+0.094042287 container cleanup 5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 07:55:59 compute-0 systemd[1]: libpod-conmon-5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c.scope: Deactivated successfully.
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.099 186548 INFO nova.virt.libvirt.driver [-] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Instance destroyed successfully.
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.099 186548 DEBUG nova.objects.instance [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lazy-loading 'resources' on Instance uuid 7d441089-3a23-460f-80f9-602e3500937b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.113 186548 DEBUG nova.virt.libvirt.vif [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-965620588',display_name='tempest-ImagesTestJSON-server-965620588',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-965620588',id=71,image_ref='953f8847-fb3b-4129-b572-ed7beb09fe07',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec4007dc8214caab4e2eb40f11fb3cd',ramdisk_id='',reservation_id='r-5bqas6qx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='dd6876d4-cab4-413b-9d67-10e2ba45a220',image_min_disk='1',image_min_ram='0',image_owner_id='7ec4007dc8214caab4e2eb40f11fb3cd',image_owner_project_name='tempest-ImagesTestJSON-117614339',image_owner_user_name='tempest-ImagesTestJSON-117614339-project-member',image_user_id='1ac2d2381d294c96aff369941185056a',owner_project_name='tempest-ImagesTestJSON-117614339',owner_user_name='tempest-ImagesTestJSON-117614339-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:55:56Z,user_data=None,user_id='1ac2d2381d294c96aff369941185056a',uuid=7d441089-3a23-460f-80f9-602e3500937b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.114 186548 DEBUG nova.network.os_vif_util [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converting VIF {"id": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "address": "fa:16:3e:35:6d:35", "network": {"id": "dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1729458911-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec4007dc8214caab4e2eb40f11fb3cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap030a5be9-2a", "ovs_interfaceid": "030a5be9-2a7a-4bcf-a49f-000ccb253c8e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.114 186548 DEBUG nova.network.os_vif_util [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:6d:35,bridge_name='br-int',has_traffic_filtering=True,id=030a5be9-2a7a-4bcf-a49f-000ccb253c8e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap030a5be9-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.115 186548 DEBUG os_vif [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:6d:35,bridge_name='br-int',has_traffic_filtering=True,id=030a5be9-2a7a-4bcf-a49f-000ccb253c8e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap030a5be9-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.117 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.117 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap030a5be9-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.119 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.120 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.122 186548 INFO os_vif [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:6d:35,bridge_name='br-int',has_traffic_filtering=True,id=030a5be9-2a7a-4bcf-a49f-000ccb253c8e,network=Network(dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap030a5be9-2a')
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.123 186548 INFO nova.virt.libvirt.driver [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Deleting instance files /var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b_del
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.123 186548 INFO nova.virt.libvirt.driver [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Deletion of /var/lib/nova/instances/7d441089-3a23-460f-80f9-602e3500937b_del complete
Nov 22 07:55:59 compute-0 podman[224988]: 2025-11-22 07:55:59.148960413 +0000 UTC m=+0.047126102 container remove 5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.154 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[533c7da2-997f-47e1-8602-a274ac4cef11]: (4, ('Sat Nov 22 07:55:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c)\n5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c\nSat Nov 22 07:55:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a (5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c)\n5a7428065119aeb136029548478f2849df0febd13d41aa8c171e74839547bd1c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.155 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[92296ad1-4c89-41ed-b373-441e8db4047b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.156 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc6b9ee8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.157 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:59 compute-0 kernel: tapdc6b9ee8-e0: left promiscuous mode
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.173 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.176 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5bbf728c-3624-4031-8d0c-2d1769dd08b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.196 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[00348323-48c7-49b6-9e78-22fc5861e4c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.198 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f416b1-0286-4c17-a03c-5a4362e319eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.201 186548 DEBUG nova.network.neutron [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating instance_info_cache with network_info: [{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.215 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ba627f-9119-4ea3-a331-0db77a8e7f98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489607, 'reachable_time': 25551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225003, 'error': None, 'target': 'ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.219 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dc6b9ee8-e824-42ea-be5e-5b3c4e48e46a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:55:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:55:59.219 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[84c666a4-c87c-4046-a2e6-cee26dc14d38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:55:59 compute-0 systemd[1]: run-netns-ovnmeta\x2ddc6b9ee8\x2de824\x2d42ea\x2dbe5e\x2d5b3c4e48e46a.mount: Deactivated successfully.
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.229 186548 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.230 186548 INFO nova.compute.manager [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.231 186548 DEBUG oslo.service.loopingcall [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.231 186548 DEBUG nova.compute.manager [-] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.232 186548 DEBUG nova.network.neutron [-] [instance: 7d441089-3a23-460f-80f9-602e3500937b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.353 186548 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.354 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Creating file /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/834d093681bc4de1b0f010af12939e54.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.355 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/834d093681bc4de1b0f010af12939e54.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.620 186548 DEBUG nova.network.neutron [req-f1bfe954-9798-4518-9379-e0c0032d3999 req-abeb3aac-3017-4d1c-a700-f5c4e353399d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updated VIF entry in instance network info cache for port ab338966-6ece-477c-af70-298f992f0c8f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.621 186548 DEBUG nova.network.neutron [req-f1bfe954-9798-4518-9379-e0c0032d3999 req-abeb3aac-3017-4d1c-a700-f5c4e353399d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updating instance_info_cache with network_info: [{"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.662 186548 DEBUG oslo_concurrency.lockutils [req-f1bfe954-9798-4518-9379-e0c0032d3999 req-abeb3aac-3017-4d1c-a700-f5c4e353399d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.779 186548 DEBUG nova.compute.manager [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received event network-vif-unplugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.780 186548 DEBUG oslo_concurrency.lockutils [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7d441089-3a23-460f-80f9-602e3500937b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.780 186548 DEBUG oslo_concurrency.lockutils [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.780 186548 DEBUG oslo_concurrency.lockutils [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.780 186548 DEBUG nova.compute.manager [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] No waiting events found dispatching network-vif-unplugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.781 186548 DEBUG nova.compute.manager [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received event network-vif-unplugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.781 186548 DEBUG nova.compute.manager [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received event network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.781 186548 DEBUG oslo_concurrency.lockutils [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "7d441089-3a23-460f-80f9-602e3500937b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.781 186548 DEBUG oslo_concurrency.lockutils [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.782 186548 DEBUG oslo_concurrency.lockutils [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.782 186548 DEBUG nova.compute.manager [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] No waiting events found dispatching network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.782 186548 WARNING nova.compute.manager [req-20469d9e-2d84-42e2-ad04-9463706290ab req-13069139-3af2-4918-bbcc-a978c8bc7175 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received unexpected event network-vif-plugged-030a5be9-2a7a-4bcf-a49f-000ccb253c8e for instance with vm_state active and task_state deleting.
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.790 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/834d093681bc4de1b0f010af12939e54.tmp" returned: 1 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.790 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/834d093681bc4de1b0f010af12939e54.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.791 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Creating directory /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 22 07:55:59 compute-0 nova_compute[186544]: 2025-11-22 07:55:59.791 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.004 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.012 186548 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.220 186548 DEBUG nova.network.neutron [-] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.233 186548 INFO nova.compute.manager [-] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Took 1.00 seconds to deallocate network for instance.
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.294 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.295 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:00 compute-0 podman[225006]: 2025-11-22 07:56:00.410225994 +0000 UTC m=+0.057409365 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.419 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.484 186548 DEBUG nova.compute.provider_tree [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.496 186548 DEBUG nova.scheduler.client.report [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.524 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.549 186548 INFO nova.scheduler.client.report [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Deleted allocations for instance 7d441089-3a23-460f-80f9-602e3500937b
Nov 22 07:56:00 compute-0 nova_compute[186544]: 2025-11-22 07:56:00.610 186548 DEBUG oslo_concurrency.lockutils [None req-9833cfb3-b1b5-4be4-8b07-62df75de6498 1ac2d2381d294c96aff369941185056a 7ec4007dc8214caab4e2eb40f11fb3cd - - default default] Lock "7d441089-3a23-460f-80f9-602e3500937b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:01 compute-0 anacron[30858]: Job `cron.weekly' started
Nov 22 07:56:01 compute-0 anacron[30858]: Job `cron.weekly' terminated
Nov 22 07:56:01 compute-0 nova_compute[186544]: 2025-11-22 07:56:01.916 186548 DEBUG nova.compute.manager [req-3ba675a3-56c1-4ece-a2d7-bd445cdc1f24 req-d5fb5f4a-71ed-418a-bbd8-71c454d1854c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Received event network-vif-deleted-030a5be9-2a7a-4bcf-a49f-000ccb253c8e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:56:02 compute-0 nova_compute[186544]: 2025-11-22 07:56:02.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:04 compute-0 nova_compute[186544]: 2025-11-22 07:56:04.121 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:05 compute-0 nova_compute[186544]: 2025-11-22 07:56:05.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:05 compute-0 nova_compute[186544]: 2025-11-22 07:56:05.421 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:05 compute-0 ovn_controller[94843]: 2025-11-22T07:56:05Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:31:97:43 10.100.0.14
Nov 22 07:56:05 compute-0 ovn_controller[94843]: 2025-11-22T07:56:05Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:31:97:43 10.100.0.14
Nov 22 07:56:09 compute-0 nova_compute[186544]: 2025-11-22 07:56:09.126 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:09 compute-0 ovn_controller[94843]: 2025-11-22T07:56:09Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:63:15:d6 10.100.0.3
Nov 22 07:56:09 compute-0 ovn_controller[94843]: 2025-11-22T07:56:09Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:63:15:d6 10.100.0.3
Nov 22 07:56:10 compute-0 nova_compute[186544]: 2025-11-22 07:56:10.056 186548 DEBUG nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 07:56:10 compute-0 nova_compute[186544]: 2025-11-22 07:56:10.424 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:11 compute-0 nova_compute[186544]: 2025-11-22 07:56:11.418 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:11 compute-0 podman[225065]: 2025-11-22 07:56:11.427475579 +0000 UTC m=+0.072543678 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 07:56:11 compute-0 podman[225066]: 2025-11-22 07:56:11.446369215 +0000 UTC m=+0.088284646 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:56:12 compute-0 kernel: tap041d3fd2-9a (unregistering): left promiscuous mode
Nov 22 07:56:12 compute-0 NetworkManager[55036]: <info>  [1763798172.7809] device (tap041d3fd2-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:56:12 compute-0 nova_compute[186544]: 2025-11-22 07:56:12.791 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:12 compute-0 ovn_controller[94843]: 2025-11-22T07:56:12Z|00299|binding|INFO|Releasing lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b from this chassis (sb_readonly=0)
Nov 22 07:56:12 compute-0 ovn_controller[94843]: 2025-11-22T07:56:12Z|00300|binding|INFO|Setting lport 041d3fd2-9a77-48a1-b976-9dda05f01f7b down in Southbound
Nov 22 07:56:12 compute-0 ovn_controller[94843]: 2025-11-22T07:56:12Z|00301|binding|INFO|Removing iface tap041d3fd2-9a ovn-installed in OVS
Nov 22 07:56:12 compute-0 nova_compute[186544]: 2025-11-22 07:56:12.793 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:12.799 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:97:43 10.100.0.14'], port_security=['fa:16:3e:31:97:43 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '5eafb037-41a2-463f-9d3a-1b4248cb00f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=041d3fd2-9a77-48a1-b976-9dda05f01f7b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:56:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:12.800 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 041d3fd2-9a77-48a1-b976-9dda05f01f7b in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis
Nov 22 07:56:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:12.801 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:56:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:12.802 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dc1e27ad-f36d-47aa-87f7-2f16b9d0f40a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:56:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:12.803 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore
Nov 22 07:56:12 compute-0 nova_compute[186544]: 2025-11-22 07:56:12.805 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:12 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000046.scope: Deactivated successfully.
Nov 22 07:56:12 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000046.scope: Consumed 14.254s CPU time.
Nov 22 07:56:12 compute-0 systemd-machined[152872]: Machine qemu-34-instance-00000046 terminated.
Nov 22 07:56:12 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224609]: [NOTICE]   (224617) : haproxy version is 2.8.14-c23fe91
Nov 22 07:56:12 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224609]: [NOTICE]   (224617) : path to executable is /usr/sbin/haproxy
Nov 22 07:56:12 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224609]: [WARNING]  (224617) : Exiting Master process...
Nov 22 07:56:12 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224609]: [ALERT]    (224617) : Current worker (224619) exited with code 143 (Terminated)
Nov 22 07:56:12 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[224609]: [WARNING]  (224617) : All workers exited. Exiting... (0)
Nov 22 07:56:12 compute-0 systemd[1]: libpod-a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec.scope: Deactivated successfully.
Nov 22 07:56:12 compute-0 podman[225131]: 2025-11-22 07:56:12.94196758 +0000 UTC m=+0.050816173 container died a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:56:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec-userdata-shm.mount: Deactivated successfully.
Nov 22 07:56:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-d2deafbd7b2bf0a384883ce2fb9a6ef3ce78bfabd699745dc355b3855ff748a9-merged.mount: Deactivated successfully.
Nov 22 07:56:12 compute-0 podman[225131]: 2025-11-22 07:56:12.992052344 +0000 UTC m=+0.100900927 container cleanup a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 07:56:12 compute-0 systemd[1]: libpod-conmon-a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec.scope: Deactivated successfully.
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.068 186548 INFO nova.virt.libvirt.driver [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance shutdown successfully after 13 seconds.
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.075 186548 INFO nova.virt.libvirt.driver [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Instance destroyed successfully.
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.076 186548 DEBUG nova.virt.libvirt.vif [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663966303',display_name='tempest-ServerDiskConfigTestJSON-server-1663966303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663966303',id=70,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-9g9kcea9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_
hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:55:57Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=5eafb037-41a2-463f-9d3a-1b4248cb00f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:31:97:43"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.076 186548 DEBUG nova.network.os_vif_util [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:31:97:43"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.077 186548 DEBUG nova.network.os_vif_util [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.078 186548 DEBUG os_vif [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.079 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.080 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap041d3fd2-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.081 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.083 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.086 186548 INFO os_vif [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a')
Nov 22 07:56:13 compute-0 podman[225160]: 2025-11-22 07:56:13.089103464 +0000 UTC m=+0.068582940 container remove a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.091 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.095 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8b17acbf-f49c-4c5f-98f8-9fa0cb4552ff]: (4, ('Sat Nov 22 07:56:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec)\na3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec\nSat Nov 22 07:56:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (a3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec)\na3381c5ddac2b45cc0023742b68be08471a0531da15cf4ace3d95a62464024ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.097 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[309309c6-2b14-4b66-be94-f00da730414a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.099 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:56:13 compute-0 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.116 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.118 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[26450ed9-42e4-4541-acd1-f287d5dd9797]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.133 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8adeeca5-77cb-492c-938c-e6339183f1a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.135 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb1a97f-1ff5-4d27-9f7f-3dcae370a284]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.149 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[030a3407-3c0d-46a3-82ff-474d1789b3c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489096, 'reachable_time': 30671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225193, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.151 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:56:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:13.151 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[c0dd963b-9394-4cb4-b492-a24b00782cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:56:13 compute-0 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.157 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.158 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.214 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.215 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Copying file /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk to 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:56:13 compute-0 nova_compute[186544]: 2025-11-22 07:56:13.216 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.077 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "scp -r /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk" returned: 0 in 0.861s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.078 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Copying file /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.078 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk.config 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.097 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798159.0969436, 7d441089-3a23-460f-80f9-602e3500937b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.098 186548 INFO nova.compute.manager [-] [instance: 7d441089-3a23-460f-80f9-602e3500937b] VM Stopped (Lifecycle Event)
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.117 186548 DEBUG nova.compute.manager [None req-a16562d2-e201-45c8-b794-458a7536828c - - - - - -] [instance: 7d441089-3a23-460f-80f9-602e3500937b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.308 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "scp -C -r /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk.config 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.config" returned: 0 in 0.230s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.309 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Copying file /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.310 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk.info 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:56:14 compute-0 nova_compute[186544]: 2025-11-22 07:56:14.523 186548 DEBUG oslo_concurrency.processutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "scp -C -r /var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2_resize/disk.info 192.168.122.101:/var/lib/nova/instances/5eafb037-41a2-463f-9d3a-1b4248cb00f2/disk.info" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.119 186548 DEBUG neutronclient.v2_0.client [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 041d3fd2-9a77-48a1-b976-9dda05f01f7b for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.264 186548 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.264 186548 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.264 186548 DEBUG oslo_concurrency.lockutils [None req-3f8a17ac-48ed-4ae1-8fae-36a8e4250a08 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.426 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.974 186548 DEBUG nova.compute.manager [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.974 186548 DEBUG oslo_concurrency.lockutils [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.974 186548 DEBUG oslo_concurrency.lockutils [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.975 186548 DEBUG oslo_concurrency.lockutils [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.975 186548 DEBUG nova.compute.manager [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:56:15 compute-0 nova_compute[186544]: 2025-11-22 07:56:15.975 186548 WARNING nova.compute.manager [req-06c1788b-7377-4690-ade4-cc6996e193e1 req-d4b8c9f0-4754-4106-a8c4-bc1f39def99c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-unplugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state active and task_state resize_migrated.
Nov 22 07:56:16 compute-0 nova_compute[186544]: 2025-11-22 07:56:16.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:16.195 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:56:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:16.196 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:56:16 compute-0 podman[225205]: 2025-11-22 07:56:16.407094435 +0000 UTC m=+0.046620441 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:56:17 compute-0 nova_compute[186544]: 2025-11-22 07:56:17.481 186548 DEBUG nova.compute.manager [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-changed-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:56:17 compute-0 nova_compute[186544]: 2025-11-22 07:56:17.481 186548 DEBUG nova.compute.manager [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Refreshing instance network info cache due to event network-changed-041d3fd2-9a77-48a1-b976-9dda05f01f7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:56:17 compute-0 nova_compute[186544]: 2025-11-22 07:56:17.481 186548 DEBUG oslo_concurrency.lockutils [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:56:17 compute-0 nova_compute[186544]: 2025-11-22 07:56:17.481 186548 DEBUG oslo_concurrency.lockutils [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:56:17 compute-0 nova_compute[186544]: 2025-11-22 07:56:17.482 186548 DEBUG nova.network.neutron [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Refreshing network info cache for port 041d3fd2-9a77-48a1-b976-9dda05f01f7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:56:18 compute-0 nova_compute[186544]: 2025-11-22 07:56:18.068 186548 DEBUG nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:56:18 compute-0 nova_compute[186544]: 2025-11-22 07:56:18.069 186548 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:18 compute-0 nova_compute[186544]: 2025-11-22 07:56:18.069 186548 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:18 compute-0 nova_compute[186544]: 2025-11-22 07:56:18.069 186548 DEBUG oslo_concurrency.lockutils [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:18 compute-0 nova_compute[186544]: 2025-11-22 07:56:18.069 186548 DEBUG nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:56:18 compute-0 nova_compute[186544]: 2025-11-22 07:56:18.070 186548 WARNING nova.compute.manager [req-e9a90f4f-98c6-4c99-9cc5-3701527bf8d6 req-6998114a-bce5-4548-89ad-ae48fcc2f500 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state active and task_state resize_migrated.
Nov 22 07:56:18 compute-0 nova_compute[186544]: 2025-11-22 07:56:18.083 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:19 compute-0 nova_compute[186544]: 2025-11-22 07:56:19.039 186548 DEBUG nova.network.neutron [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updated VIF entry in instance network info cache for port 041d3fd2-9a77-48a1-b976-9dda05f01f7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:56:19 compute-0 nova_compute[186544]: 2025-11-22 07:56:19.039 186548 DEBUG nova.network.neutron [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating instance_info_cache with network_info: [{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:56:19 compute-0 nova_compute[186544]: 2025-11-22 07:56:19.258 186548 DEBUG oslo_concurrency.lockutils [req-03f3e226-dbb0-427f-b4ed-dd0fb5c9a636 req-3708b7c0-c7ef-4094-8595-db5a4c82bc1f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:56:19 compute-0 podman[225229]: 2025-11-22 07:56:19.402248511 +0000 UTC m=+0.047924472 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:56:20 compute-0 nova_compute[186544]: 2025-11-22 07:56:20.427 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.526 186548 DEBUG nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.527 186548 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.527 186548 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.527 186548 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.527 186548 DEBUG nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.528 186548 WARNING nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state resized and task_state None.
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.528 186548 DEBUG nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.528 186548 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.528 186548 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.528 186548 DEBUG oslo_concurrency.lockutils [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.529 186548 DEBUG nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] No waiting events found dispatching network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.529 186548 WARNING nova.compute.manager [req-055304b8-6b38-465b-8f99-ea19ae5973ed req-d3827b56-f734-4294-8448-2b6a80424cef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Received unexpected event network-vif-plugged-041d3fd2-9a77-48a1-b976-9dda05f01f7b for instance with vm_state resized and task_state None.
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.800 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.800 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.801 186548 DEBUG nova.compute.manager [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Going to confirm migration 12 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Nov 22 07:56:22 compute-0 nova_compute[186544]: 2025-11-22 07:56:22.842 186548 DEBUG nova.objects.instance [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'info_cache' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:56:23 compute-0 nova_compute[186544]: 2025-11-22 07:56:23.086 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:23 compute-0 nova_compute[186544]: 2025-11-22 07:56:23.300 186548 DEBUG neutronclient.v2_0.client [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 041d3fd2-9a77-48a1-b976-9dda05f01f7b for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 22 07:56:23 compute-0 nova_compute[186544]: 2025-11-22 07:56:23.301 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:56:23 compute-0 nova_compute[186544]: 2025-11-22 07:56:23.301 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:56:23 compute-0 nova_compute[186544]: 2025-11-22 07:56:23.302 186548 DEBUG nova.network.neutron [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:56:24 compute-0 podman[225248]: 2025-11-22 07:56:24.401228153 +0000 UTC m=+0.050307211 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.540 186548 DEBUG nova.network.neutron [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Updating instance_info_cache with network_info: [{"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.567 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-5eafb037-41a2-463f-9d3a-1b4248cb00f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.567 186548 DEBUG nova.objects.instance [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 5eafb037-41a2-463f-9d3a-1b4248cb00f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.591 186548 DEBUG nova.virt.libvirt.vif [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1663966303',display_name='tempest-ServerDiskConfigTestJSON-server-1663966303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1663966303',id=70,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-9g9kcea9',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:56:20Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=5eafb037-41a2-463f-9d3a-1b4248cb00f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.592 186548 DEBUG nova.network.os_vif_util [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "address": "fa:16:3e:31:97:43", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap041d3fd2-9a", "ovs_interfaceid": "041d3fd2-9a77-48a1-b976-9dda05f01f7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.592 186548 DEBUG nova.network.os_vif_util [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.593 186548 DEBUG os_vif [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.595 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.596 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap041d3fd2-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.596 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.598 186548 INFO os_vif [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:97:43,bridge_name='br-int',has_traffic_filtering=True,id=041d3fd2-9a77-48a1-b976-9dda05f01f7b,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap041d3fd2-9a')
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.598 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.599 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.684 186548 DEBUG nova.compute.provider_tree [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.700 186548 DEBUG nova.scheduler.client.report [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.747 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.902 186548 INFO nova.scheduler.client.report [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Deleted allocation for migration 7911db20-f973-4a7b-bbc8-bc7a93d3bceb
Nov 22 07:56:24 compute-0 nova_compute[186544]: 2025-11-22 07:56:24.966 186548 DEBUG oslo_concurrency.lockutils [None req-332824fb-edf8-4624-9eec-d1bab57dcba8 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "5eafb037-41a2-463f-9d3a-1b4248cb00f2" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:25 compute-0 nova_compute[186544]: 2025-11-22 07:56:25.428 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:26.198 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:56:28 compute-0 nova_compute[186544]: 2025-11-22 07:56:28.064 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798173.0624835, 5eafb037-41a2-463f-9d3a-1b4248cb00f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:56:28 compute-0 nova_compute[186544]: 2025-11-22 07:56:28.065 186548 INFO nova.compute.manager [-] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] VM Stopped (Lifecycle Event)
Nov 22 07:56:28 compute-0 nova_compute[186544]: 2025-11-22 07:56:28.089 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:28 compute-0 nova_compute[186544]: 2025-11-22 07:56:28.091 186548 DEBUG nova.compute.manager [None req-0c492fb2-c577-4907-8db3-24015f36376c - - - - - -] [instance: 5eafb037-41a2-463f-9d3a-1b4248cb00f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:56:28 compute-0 podman[225269]: 2025-11-22 07:56:28.419493983 +0000 UTC m=+0.063548126 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:56:30 compute-0 nova_compute[186544]: 2025-11-22 07:56:30.431 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:31 compute-0 podman[225294]: 2025-11-22 07:56:31.422856551 +0000 UTC m=+0.072691162 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Nov 22 07:56:33 compute-0 nova_compute[186544]: 2025-11-22 07:56:33.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:35 compute-0 ovn_controller[94843]: 2025-11-22T07:56:35Z|00302|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 07:56:35 compute-0 nova_compute[186544]: 2025-11-22 07:56:35.345 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:35 compute-0 nova_compute[186544]: 2025-11-22 07:56:35.433 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.594 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'name': 'tempest-ServerActionsTestOtherA-server-824804646', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000048', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'af3a536766704caaad94e5da2e3b88e2', 'user_id': '5a5a623606e647c183360572aab20b70', 'hostId': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.595 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.597 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1da3a852-b732-48ac-886f-c57ced76a3d5 / tapab338966-6e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.598 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2b43493-5324-4fe9-8881-05451cd56a2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.595761', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c524677c-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': '2d675a9ccce1a7260d2e537999248c14ef76737f9ea510507d7aca398b10f65d'}]}, 'timestamp': '2025-11-22 07:56:36.598782', '_unique_id': 'fc4c96fe9f9b459aaecbc30a1a73f1d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.599 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.601 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac19baec-dbab-4840-bb51-0edb86275735', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.601647', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c524e97c-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': '32574e6a6048acceeed60b65b424a85f4ad476afb0fa5d02e943e484bd259052'}]}, 'timestamp': '2025-11-22 07:56:36.602076', '_unique_id': 'cb875acd5e804070ade3cef48af50b1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.602 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.604 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.604 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-824804646>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-824804646>]
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.627 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.628 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfe24fac-0c71-48c2-8ce5-0618477a9f2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.604919', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c528f2e2-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': 'a68bc6873089d342679d112808d4a5b1ca34e2de80ff8dbb7bcb62e78ebf6e43'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.604919', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c52902c8-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': '255901bdf1df03b16edc979943f74db33de0933890e80ee60f44e62ebf58a320'}]}, 'timestamp': '2025-11-22 07:56:36.628886', '_unique_id': '83cf32c2b61744169d23717cb9a81adb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.629 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.631 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.631 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.read.requests volume: 1097 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.631 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a0efc88-6973-4810-9ad3-4b48e975db64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1097, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.631222', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c5296b46-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': '55aba6c42518a51cdc4efca13ac23fae316a34f814d28ed8d93185ee660931a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.631222', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c52976ea-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': '9a51f0b77d66c049fed1674378315d2d651acef9b39b6a87882880d5b80dee77'}]}, 'timestamp': '2025-11-22 07:56:36.631874', '_unique_id': 'b5ab16a2c4d24661b50f8cd7919f1884'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.632 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.633 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.633 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.outgoing.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcf9ff59-1d28-4be8-a353-2db2982de3eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.633609', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c529c6fe-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': 'e08057c57195c9140d30bd351fef4abeff9429c004e58b8b6394141a88db85be'}]}, 'timestamp': '2025-11-22 07:56:36.633915', '_unique_id': '371cf744d33b4383b9606bad239c5dc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.635 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.635 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.635 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-824804646>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-824804646>]
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.635 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.write.bytes volume: 72970240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '711f3401-140c-4fbc-a845-97afee7c3d8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72970240, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.635875', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c52a2022-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': 'f25e5d079d046d160a1da5349699266a24221b91dff29c65681573de09281e08'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.635875', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c52a2b58-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': '5c0fdf1035286f6b32549a30f8dea62441ec34cc9f49eaea44b01d10e81710f7'}]}, 'timestamp': '2025-11-22 07:56:36.636455', '_unique_id': '9281d78802ed4281acb56b659937748f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.636 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.637 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.650 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/memory.usage volume: 42.66796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '206112c5-8cd7-4482-8762-d08b7012f491', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.66796875, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'timestamp': '2025-11-22T07:56:36.638051', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c52c5806-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.341996733, 'message_signature': '1b5ceb1be18ef9b435e625e7879222c4cec77dfa970ab7f5590b1ac07eb196ee'}]}, 'timestamp': '2025-11-22 07:56:36.650745', '_unique_id': '83b6604f820b44f0ad06620e6e6e7671'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.652 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.652 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.read.bytes volume: 30562816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.652 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7f5e225-b3f5-4379-af5e-55a1cc3a006a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30562816, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.652575', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c52cab8a-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': 'a0ded401b100dbe10df6654ac2c789e6224a341abde96ca9f6ccc67628f5d030'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.652575', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c52cb5c6-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': '6cdfb484cdc46cf6c3e51917960a83e3a397014879a368aa915aa18a647440e5'}]}, 'timestamp': '2025-11-22 07:56:36.653104', '_unique_id': 'c6b2f57bffe24d188025ee16e03eafb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.653 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.654 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.654 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da559ad1-0278-4ee9-88c5-ad316275c756', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.654755', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c52d0116-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': 'c905c0714a6d1ddd4b72e65e9650ab39ef58d0df471475b4073b0c2bfcd65e15'}]}, 'timestamp': '2025-11-22 07:56:36.655062', '_unique_id': 'fe815a3c55074f0a94b6b20865fca5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.655 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.656 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.656 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.656 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-824804646>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-824804646>]
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.incoming.bytes volume: 4179 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b323c98-0a82-413e-b9cc-337508ed60db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4179, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.657143', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c52d6070-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': '8f078e4f5bd4ceb2cd4e896ce65bbb59aaefe272b983972227a6375604794abc'}]}, 'timestamp': '2025-11-22 07:56:36.657490', '_unique_id': '252d8d49d7e143f3af279415649504a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.658 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.659 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.write.latency volume: 6903326178 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.659 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68465cd9-fa5b-4395-b864-585e4b883786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6903326178, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.659007', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c52da6de-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': '12d26f737d0ccee856499e03884d8d044f0f4e50a5341916144fa8da4a747147'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.659007', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c52db200-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': 'c8b4538246cf9ce265a7f1fb39ea252b670c0bb94272451eecdb821a96d627dd'}]}, 'timestamp': '2025-11-22 07:56:36.659582', '_unique_id': '2bdf1e76cad7439680c11213786cc9e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.661 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.661 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.read.latency volume: 980261942 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.661 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.read.latency volume: 55873374 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eefe7624-c1de-4a7c-81d6-f3fa81740e7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 980261942, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.661431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c52e070a-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': 'db9d4ff36de097574b55339a3be224ffca520fcd8155e25f56e8137da888e6cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55873374, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.661431', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c52e12a4-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.296725917, 'message_signature': '9a972d0b7f48670550370ed7ee23b705b231deb634cb3ad8b24cd258b5e55c53'}]}, 'timestamp': '2025-11-22 07:56:36.662054', '_unique_id': '45a3e2ad757640c084014e35c02ffcda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.664 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.676 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.677 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'abbfbcfa-b66a-4d16-bc5e-33a24e4e76ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.664419', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c53065e0-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.356230344, 'message_signature': 'b4e38674e8555e7aa77af777dd70b25896ff5cfd41411127c6a389c55640e4f0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.664419', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c53074fe-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.356230344, 'message_signature': 'a6aeb193f2be2acb52e3427748ea721edd3277629df5443d2ea6448387829e4b'}]}, 'timestamp': '2025-11-22 07:56:36.677671', '_unique_id': 'c1f02c9ee7464134b3cf48cf018f9d25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.679 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.679 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '545bbf65-168e-4a17-b1dc-5a54dbdd73c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.679529', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c530c922-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.356230344, 'message_signature': '7c788630f1fecb8b0ede26acb081eb3b8890208e2e8900c4dd548b4314c80900'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.679529', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c530d336-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.356230344, 'message_signature': 'e6d91405264cb818c331febdc1d7f2918547a2b1b013751cdccb6e773091fe85'}]}, 'timestamp': '2025-11-22 07:56:36.680068', '_unique_id': 'd8aba06a77e14c0bb9f200f813a5b656'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.681 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.682 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f06bc041-9c49-47df-a4bf-df04410c0a98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.682189', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c531331c-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': '08d33123c3885bc799a492bffc7e4a19586d1ab0006c571de3fb4eb684fcebd4'}]}, 'timestamp': '2025-11-22 07:56:36.682537', '_unique_id': 'd5c456394a9541a7ae186673cea61fa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.683 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '539ca45b-011c-4592-bcbc-d50a3b9fe88d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.684046', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c53178e0-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': 'a8e35f24cdfd53c04c6bfd0b5f0c3450a61bcb1129b6b8a5e524d15de1614116'}]}, 'timestamp': '2025-11-22 07:56:36.684378', '_unique_id': '602e5a951b984b55aae934e2b6e3fd92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.685 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df2ed695-c3a0-484e-a86d-94cbbe4ef4a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.685819', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c531be0e-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': 'f2e69a53bb21370a3b579ab4cc40f87e1e1b925be7781bb04be45cc8e9503f8f'}]}, 'timestamp': '2025-11-22 07:56:36.686092', '_unique_id': '5c107f78a1d44364ad85ebd375d03571'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.687 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.outgoing.bytes volume: 3456 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b65b423c-4b81-4cc4-98f8-e78b087aa418', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3456, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.687556', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c53201f2-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': 'f4473dad412ecbed418d2d71e02bb7fff2eb9e4709e73ad07ebbcc70f67cf9f2'}]}, 'timestamp': '2025-11-22 07:56:36.687829', '_unique_id': 'ba7a616a44194b4b845229b4a616ad7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.689 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.689 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.689 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66b58078-5325-4dd4-97e3-f8bac677930f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-vda', 'timestamp': '2025-11-22T07:56:36.689286', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c5324734-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.356230344, 'message_signature': '3c1e42a77b07e3f8cd3c8d5f94843f1a29ae07075c02a54cf6e07bdc1e5f93c3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5-sda', 'timestamp': '2025-11-22T07:56:36.689286', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c53250c6-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.356230344, 'message_signature': '5b128509c62242b063cee7ae4b0c6661d06148a36cc21c7dc28670dc8fb7c8b4'}]}, 'timestamp': '2025-11-22 07:56:36.689832', '_unique_id': '82c66ef7b9074959aa8f9eb144cf40b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.691 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.691 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a85b8d69-932c-4b37-9bd7-9a268f3e2caa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': 'instance-00000048-1da3a852-b732-48ac-886f-c57ced76a3d5-tapab338966-6e', 'timestamp': '2025-11-22T07:56:36.691326', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'tapab338966-6e', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:15:d6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab338966-6e'}, 'message_id': 'c53295a4-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.287548822, 'message_signature': '93b2d28673de022d7c2670b5656eab9cb57fe87e02f66c85724e2fe91c35bd62'}]}, 'timestamp': '2025-11-22 07:56:36.691635', '_unique_id': '2290c2a6eed74259b717b98b02a272af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.693 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.693 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.693 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-824804646>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestOtherA-server-824804646>]
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.693 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.694 12 DEBUG ceilometer.compute.pollsters [-] 1da3a852-b732-48ac-886f-c57ced76a3d5/cpu volume: 13000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5963dbe-294d-49fb-bb08-da332a55bddf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13000000000, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_name': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_name': None, 'resource_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'timestamp': '2025-11-22T07:56:36.694039', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherA-server-824804646', 'name': 'instance-00000048', 'instance_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'instance_type': 'm1.nano', 'host': '25ec3804a05b28b4253bcb0dfae9f6bbd1d187768cc78d26ffafeb71', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c5330124-c778-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 4937.341996733, 'message_signature': '7b88fb5c894c14d98f3b58794033cf314f304e69970aae3e93f33dc7bd093470'}]}, 'timestamp': '2025-11-22 07:56:36.694440', '_unique_id': '6941471f8bc9408580014f233ddc1fcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:56:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:56:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:37.324 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:37.324 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:37.325 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:37 compute-0 ovn_controller[94843]: 2025-11-22T07:56:37Z|00303|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 07:56:38 compute-0 nova_compute[186544]: 2025-11-22 07:56:38.008 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:38 compute-0 nova_compute[186544]: 2025-11-22 07:56:38.093 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:40 compute-0 nova_compute[186544]: 2025-11-22 07:56:40.434 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:42 compute-0 nova_compute[186544]: 2025-11-22 07:56:42.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:42 compute-0 nova_compute[186544]: 2025-11-22 07:56:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:42 compute-0 nova_compute[186544]: 2025-11-22 07:56:42.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:56:42 compute-0 nova_compute[186544]: 2025-11-22 07:56:42.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:56:42 compute-0 podman[225316]: 2025-11-22 07:56:42.406654073 +0000 UTC m=+0.057913427 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:56:42 compute-0 podman[225317]: 2025-11-22 07:56:42.429090736 +0000 UTC m=+0.078377322 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:56:42 compute-0 nova_compute[186544]: 2025-11-22 07:56:42.505 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:56:42 compute-0 nova_compute[186544]: 2025-11-22 07:56:42.505 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:56:42 compute-0 nova_compute[186544]: 2025-11-22 07:56:42.505 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:56:42 compute-0 nova_compute[186544]: 2025-11-22 07:56:42.506 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1da3a852-b732-48ac-886f-c57ced76a3d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:56:43 compute-0 nova_compute[186544]: 2025-11-22 07:56:43.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.427 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updating instance_info_cache with network_info: [{"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.435 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.452 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.452 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.453 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.453 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.453 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.453 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.480 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.480 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.481 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.481 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.561 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.622 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.623 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.687 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.855 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.856 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5556MB free_disk=73.2861557006836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.856 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.857 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.958 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 1da3a852-b732-48ac-886f-c57ced76a3d5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.959 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:56:45 compute-0 nova_compute[186544]: 2025-11-22 07:56:45.959 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:56:46 compute-0 nova_compute[186544]: 2025-11-22 07:56:46.006 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:56:46 compute-0 nova_compute[186544]: 2025-11-22 07:56:46.023 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:56:46 compute-0 nova_compute[186544]: 2025-11-22 07:56:46.055 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:56:46 compute-0 nova_compute[186544]: 2025-11-22 07:56:46.055 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:47 compute-0 podman[225364]: 2025-11-22 07:56:47.407396219 +0000 UTC m=+0.055643881 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:56:47 compute-0 nova_compute[186544]: 2025-11-22 07:56:47.768 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:48 compute-0 nova_compute[186544]: 2025-11-22 07:56:48.098 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:48 compute-0 nova_compute[186544]: 2025-11-22 07:56:48.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:48 compute-0 nova_compute[186544]: 2025-11-22 07:56:48.499 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:56:50 compute-0 podman[225390]: 2025-11-22 07:56:50.404147496 +0000 UTC m=+0.054645327 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.438 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.814 186548 DEBUG nova.compute.manager [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.910 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.911 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.949 186548 DEBUG nova.objects.instance [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_requests' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.963 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.963 186548 INFO nova.compute.claims [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.964 186548 DEBUG nova.objects.instance [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:56:50 compute-0 nova_compute[186544]: 2025-11-22 07:56:50.975 186548 DEBUG nova.objects.instance [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:56:51 compute-0 nova_compute[186544]: 2025-11-22 07:56:51.019 186548 INFO nova.compute.resource_tracker [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating resource usage from migration 263622db-30b9-4ef7-a87a-380190b6fd0e
Nov 22 07:56:51 compute-0 nova_compute[186544]: 2025-11-22 07:56:51.019 186548 DEBUG nova.compute.resource_tracker [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Starting to track incoming migration 263622db-30b9-4ef7-a87a-380190b6fd0e with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 22 07:56:51 compute-0 nova_compute[186544]: 2025-11-22 07:56:51.120 186548 DEBUG nova.compute.provider_tree [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:56:51 compute-0 nova_compute[186544]: 2025-11-22 07:56:51.138 186548 DEBUG nova.scheduler.client.report [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:56:51 compute-0 nova_compute[186544]: 2025-11-22 07:56:51.172 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:56:51 compute-0 nova_compute[186544]: 2025-11-22 07:56:51.172 186548 INFO nova.compute.manager [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Migrating
Nov 22 07:56:51 compute-0 nova_compute[186544]: 2025-11-22 07:56:51.626 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:53 compute-0 nova_compute[186544]: 2025-11-22 07:56:53.101 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:55 compute-0 sshd-session[225409]: Accepted publickey for nova from 192.168.122.101 port 44354 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:56:55 compute-0 podman[225411]: 2025-11-22 07:56:55.424785053 +0000 UTC m=+0.063083285 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 07:56:55 compute-0 systemd-logind[821]: New session 38 of user nova.
Nov 22 07:56:55 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 07:56:55 compute-0 nova_compute[186544]: 2025-11-22 07:56:55.439 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:55 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 07:56:55 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 07:56:55 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 07:56:55 compute-0 systemd[225434]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:56:55 compute-0 systemd[225434]: Queued start job for default target Main User Target.
Nov 22 07:56:55 compute-0 systemd[225434]: Created slice User Application Slice.
Nov 22 07:56:55 compute-0 systemd[225434]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:56:55 compute-0 systemd[225434]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 07:56:55 compute-0 systemd[225434]: Reached target Paths.
Nov 22 07:56:55 compute-0 systemd[225434]: Reached target Timers.
Nov 22 07:56:55 compute-0 systemd[225434]: Starting D-Bus User Message Bus Socket...
Nov 22 07:56:55 compute-0 systemd[225434]: Starting Create User's Volatile Files and Directories...
Nov 22 07:56:55 compute-0 systemd[225434]: Finished Create User's Volatile Files and Directories.
Nov 22 07:56:55 compute-0 systemd[225434]: Listening on D-Bus User Message Bus Socket.
Nov 22 07:56:55 compute-0 systemd[225434]: Reached target Sockets.
Nov 22 07:56:55 compute-0 systemd[225434]: Reached target Basic System.
Nov 22 07:56:55 compute-0 systemd[225434]: Reached target Main User Target.
Nov 22 07:56:55 compute-0 systemd[225434]: Startup finished in 144ms.
Nov 22 07:56:55 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 07:56:55 compute-0 systemd[1]: Started Session 38 of User nova.
Nov 22 07:56:55 compute-0 sshd-session[225409]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:56:55 compute-0 sshd-session[225450]: Received disconnect from 192.168.122.101 port 44354:11: disconnected by user
Nov 22 07:56:55 compute-0 sshd-session[225450]: Disconnected from user nova 192.168.122.101 port 44354
Nov 22 07:56:55 compute-0 sshd-session[225409]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:56:55 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Nov 22 07:56:55 compute-0 systemd-logind[821]: Session 38 logged out. Waiting for processes to exit.
Nov 22 07:56:55 compute-0 systemd-logind[821]: Removed session 38.
Nov 22 07:56:55 compute-0 sshd-session[225452]: Accepted publickey for nova from 192.168.122.101 port 44356 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:56:55 compute-0 systemd-logind[821]: New session 40 of user nova.
Nov 22 07:56:55 compute-0 systemd[1]: Started Session 40 of User nova.
Nov 22 07:56:55 compute-0 sshd-session[225452]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:56:55 compute-0 sshd-session[225455]: Received disconnect from 192.168.122.101 port 44356:11: disconnected by user
Nov 22 07:56:55 compute-0 sshd-session[225455]: Disconnected from user nova 192.168.122.101 port 44356
Nov 22 07:56:55 compute-0 sshd-session[225452]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:56:55 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Nov 22 07:56:55 compute-0 systemd-logind[821]: Session 40 logged out. Waiting for processes to exit.
Nov 22 07:56:55 compute-0 systemd-logind[821]: Removed session 40.
Nov 22 07:56:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:57.614 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:56:57 compute-0 nova_compute[186544]: 2025-11-22 07:56:57.614 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:57.618 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:56:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:56:57.619 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:56:58 compute-0 nova_compute[186544]: 2025-11-22 07:56:58.104 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:56:59 compute-0 podman[225457]: 2025-11-22 07:56:59.399297945 +0000 UTC m=+0.050547257 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:57:00 compute-0 nova_compute[186544]: 2025-11-22 07:57:00.442 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:02 compute-0 podman[225482]: 2025-11-22 07:57:02.409229357 +0000 UTC m=+0.053695804 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9)
Nov 22 07:57:03 compute-0 nova_compute[186544]: 2025-11-22 07:57:03.107 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:05 compute-0 nova_compute[186544]: 2025-11-22 07:57:05.444 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:06 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 07:57:06 compute-0 systemd[225434]: Activating special unit Exit the Session...
Nov 22 07:57:06 compute-0 systemd[225434]: Stopped target Main User Target.
Nov 22 07:57:06 compute-0 systemd[225434]: Stopped target Basic System.
Nov 22 07:57:06 compute-0 systemd[225434]: Stopped target Paths.
Nov 22 07:57:06 compute-0 systemd[225434]: Stopped target Sockets.
Nov 22 07:57:06 compute-0 systemd[225434]: Stopped target Timers.
Nov 22 07:57:06 compute-0 systemd[225434]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:57:06 compute-0 systemd[225434]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 07:57:06 compute-0 systemd[225434]: Closed D-Bus User Message Bus Socket.
Nov 22 07:57:06 compute-0 systemd[225434]: Stopped Create User's Volatile Files and Directories.
Nov 22 07:57:06 compute-0 systemd[225434]: Removed slice User Application Slice.
Nov 22 07:57:06 compute-0 systemd[225434]: Reached target Shutdown.
Nov 22 07:57:06 compute-0 systemd[225434]: Finished Exit the Session.
Nov 22 07:57:06 compute-0 systemd[225434]: Reached target Exit the Session.
Nov 22 07:57:06 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 07:57:06 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 07:57:06 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 07:57:06 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 07:57:06 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 07:57:06 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 07:57:06 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 07:57:08 compute-0 nova_compute[186544]: 2025-11-22 07:57:08.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:09 compute-0 sshd-session[225504]: Accepted publickey for nova from 192.168.122.101 port 56488 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:57:09 compute-0 systemd-logind[821]: New session 41 of user nova.
Nov 22 07:57:09 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 07:57:09 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 07:57:09 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 07:57:09 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 07:57:09 compute-0 systemd[225508]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:57:09 compute-0 systemd[225508]: Queued start job for default target Main User Target.
Nov 22 07:57:09 compute-0 systemd[225508]: Created slice User Application Slice.
Nov 22 07:57:09 compute-0 systemd[225508]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:57:09 compute-0 systemd[225508]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 07:57:09 compute-0 systemd[225508]: Reached target Paths.
Nov 22 07:57:09 compute-0 systemd[225508]: Reached target Timers.
Nov 22 07:57:09 compute-0 systemd[225508]: Starting D-Bus User Message Bus Socket...
Nov 22 07:57:09 compute-0 systemd[225508]: Starting Create User's Volatile Files and Directories...
Nov 22 07:57:09 compute-0 systemd[225508]: Finished Create User's Volatile Files and Directories.
Nov 22 07:57:09 compute-0 systemd[225508]: Listening on D-Bus User Message Bus Socket.
Nov 22 07:57:09 compute-0 systemd[225508]: Reached target Sockets.
Nov 22 07:57:09 compute-0 systemd[225508]: Reached target Basic System.
Nov 22 07:57:09 compute-0 systemd[225508]: Reached target Main User Target.
Nov 22 07:57:09 compute-0 systemd[225508]: Startup finished in 153ms.
Nov 22 07:57:09 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 07:57:09 compute-0 systemd[1]: Started Session 41 of User nova.
Nov 22 07:57:09 compute-0 sshd-session[225504]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:57:10 compute-0 sshd-session[225523]: Received disconnect from 192.168.122.101 port 56488:11: disconnected by user
Nov 22 07:57:10 compute-0 sshd-session[225523]: Disconnected from user nova 192.168.122.101 port 56488
Nov 22 07:57:10 compute-0 sshd-session[225504]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:57:10 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Nov 22 07:57:10 compute-0 systemd-logind[821]: Session 41 logged out. Waiting for processes to exit.
Nov 22 07:57:10 compute-0 systemd-logind[821]: Removed session 41.
Nov 22 07:57:10 compute-0 sshd-session[225525]: Accepted publickey for nova from 192.168.122.101 port 51604 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:57:10 compute-0 systemd-logind[821]: New session 43 of user nova.
Nov 22 07:57:10 compute-0 systemd[1]: Started Session 43 of User nova.
Nov 22 07:57:10 compute-0 sshd-session[225525]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:57:10 compute-0 sshd-session[225530]: Received disconnect from 192.168.122.101 port 51604:11: disconnected by user
Nov 22 07:57:10 compute-0 sshd-session[225530]: Disconnected from user nova 192.168.122.101 port 51604
Nov 22 07:57:10 compute-0 sshd-session[225525]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:57:10 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Nov 22 07:57:10 compute-0 systemd-logind[821]: Session 43 logged out. Waiting for processes to exit.
Nov 22 07:57:10 compute-0 systemd-logind[821]: Removed session 43.
Nov 22 07:57:10 compute-0 nova_compute[186544]: 2025-11-22 07:57:10.446 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:10 compute-0 nova_compute[186544]: 2025-11-22 07:57:10.540 186548 DEBUG nova.compute.manager [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:10 compute-0 nova_compute[186544]: 2025-11-22 07:57:10.541 186548 DEBUG oslo_concurrency.lockutils [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:10 compute-0 nova_compute[186544]: 2025-11-22 07:57:10.541 186548 DEBUG oslo_concurrency.lockutils [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:10 compute-0 nova_compute[186544]: 2025-11-22 07:57:10.541 186548 DEBUG oslo_concurrency.lockutils [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:10 compute-0 nova_compute[186544]: 2025-11-22 07:57:10.541 186548 DEBUG nova.compute.manager [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:57:10 compute-0 nova_compute[186544]: 2025-11-22 07:57:10.542 186548 WARNING nova.compute.manager [req-345250c6-d3f8-40d1-b166-f9e6f5c34617 req-e63eb320-811a-4529-8b35-434a421d399c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state active and task_state resize_migrating.
Nov 22 07:57:10 compute-0 sshd-session[225532]: Accepted publickey for nova from 192.168.122.101 port 51610 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 07:57:10 compute-0 systemd-logind[821]: New session 44 of user nova.
Nov 22 07:57:10 compute-0 systemd[1]: Started Session 44 of User nova.
Nov 22 07:57:10 compute-0 sshd-session[225532]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 07:57:10 compute-0 sshd-session[225535]: Received disconnect from 192.168.122.101 port 51610:11: disconnected by user
Nov 22 07:57:10 compute-0 sshd-session[225535]: Disconnected from user nova 192.168.122.101 port 51610
Nov 22 07:57:10 compute-0 sshd-session[225532]: pam_unix(sshd:session): session closed for user nova
Nov 22 07:57:10 compute-0 systemd[1]: session-44.scope: Deactivated successfully.
Nov 22 07:57:10 compute-0 systemd-logind[821]: Session 44 logged out. Waiting for processes to exit.
Nov 22 07:57:10 compute-0 systemd-logind[821]: Removed session 44.
Nov 22 07:57:12 compute-0 nova_compute[186544]: 2025-11-22 07:57:12.408 186548 INFO nova.network.neutron [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 22 07:57:12 compute-0 nova_compute[186544]: 2025-11-22 07:57:12.760 186548 DEBUG nova.compute.manager [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:12 compute-0 nova_compute[186544]: 2025-11-22 07:57:12.760 186548 DEBUG oslo_concurrency.lockutils [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:12 compute-0 nova_compute[186544]: 2025-11-22 07:57:12.761 186548 DEBUG oslo_concurrency.lockutils [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:12 compute-0 nova_compute[186544]: 2025-11-22 07:57:12.761 186548 DEBUG oslo_concurrency.lockutils [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:12 compute-0 nova_compute[186544]: 2025-11-22 07:57:12.762 186548 DEBUG nova.compute.manager [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:57:12 compute-0 nova_compute[186544]: 2025-11-22 07:57:12.763 186548 WARNING nova.compute.manager [req-f1e28a8b-d898-412a-8033-a8901f54e2a0 req-52afde83-588c-46f5-8e8f-ca17df998619 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state active and task_state resize_migrated.
Nov 22 07:57:12 compute-0 nova_compute[186544]: 2025-11-22 07:57:12.791 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:13 compute-0 nova_compute[186544]: 2025-11-22 07:57:13.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:13 compute-0 podman[225537]: 2025-11-22 07:57:13.421782912 +0000 UTC m=+0.068062647 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 07:57:13 compute-0 podman[225538]: 2025-11-22 07:57:13.448945082 +0000 UTC m=+0.092011958 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 07:57:14 compute-0 nova_compute[186544]: 2025-11-22 07:57:14.526 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:57:14 compute-0 nova_compute[186544]: 2025-11-22 07:57:14.527 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquired lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:57:14 compute-0 nova_compute[186544]: 2025-11-22 07:57:14.527 186548 DEBUG nova.network.neutron [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:57:15 compute-0 nova_compute[186544]: 2025-11-22 07:57:15.218 186548 DEBUG nova.compute.manager [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-changed-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:15 compute-0 nova_compute[186544]: 2025-11-22 07:57:15.219 186548 DEBUG nova.compute.manager [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Refreshing instance network info cache due to event network-changed-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:57:15 compute-0 nova_compute[186544]: 2025-11-22 07:57:15.220 186548 DEBUG oslo_concurrency.lockutils [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:57:15 compute-0 nova_compute[186544]: 2025-11-22 07:57:15.448 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:18 compute-0 nova_compute[186544]: 2025-11-22 07:57:18.115 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:18 compute-0 podman[225582]: 2025-11-22 07:57:18.406344837 +0000 UTC m=+0.056012231 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.419 186548 DEBUG nova.network.neutron [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.470 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Releasing lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.474 186548 DEBUG oslo_concurrency.lockutils [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.475 186548 DEBUG nova.network.neutron [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Refreshing network info cache for port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.749 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.750 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.751 186548 INFO nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Creating image(s)
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.752 186548 DEBUG nova.objects.instance [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.810 186548 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.879 186548 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.880 186548 DEBUG nova.virt.disk.api [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Checking if we can resize image /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.881 186548 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.947 186548 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:19 compute-0 nova_compute[186544]: 2025-11-22 07:57:19.949 186548 DEBUG nova.virt.disk.api [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Cannot resize image /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.184 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.185 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Ensure instance console log exists: /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.185 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.186 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.186 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.190 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Start _get_guest_xml network_info=[{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:27:44:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.200 186548 WARNING nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.213 186548 DEBUG nova.virt.libvirt.host [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.214 186548 DEBUG nova.virt.libvirt.host [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.224 186548 DEBUG nova.virt.libvirt.host [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.225 186548 DEBUG nova.virt.libvirt.host [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.227 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.227 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.228 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.228 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.228 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.228 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.229 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.230 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.230 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.230 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.230 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.231 186548 DEBUG nova.virt.hardware [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.232 186548 DEBUG nova.objects.instance [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.275 186548 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.346 186548 DEBUG oslo_concurrency.processutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.347 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.347 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.348 186548 DEBUG oslo_concurrency.lockutils [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.350 186548 DEBUG nova.virt.libvirt.vif [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772508279',display_name='tempest-ServerDiskConfigTestJSON-server-772508279',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772508279',id=76,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-0n3qm4qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:12Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=c6cd5fec-f214-4bbc-b854-9e16c9a7577a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:27:44:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.350 186548 DEBUG nova.network.os_vif_util [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:27:44:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.351 186548 DEBUG nova.network.os_vif_util [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.354 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <uuid>c6cd5fec-f214-4bbc-b854-9e16c9a7577a</uuid>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <name>instance-0000004c</name>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <memory>196608</memory>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-772508279</nova:name>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:57:20</nova:creationTime>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <nova:flavor name="m1.micro">
Nov 22 07:57:20 compute-0 nova_compute[186544]:         <nova:memory>192</nova:memory>
Nov 22 07:57:20 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:57:20 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:57:20 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:57:20 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:57:20 compute-0 nova_compute[186544]:         <nova:user uuid="e24c302b62fb470aa189b76d4676733b">tempest-ServerDiskConfigTestJSON-592691466-project-member</nova:user>
Nov 22 07:57:20 compute-0 nova_compute[186544]:         <nova:project uuid="063bf16c91af408ca075c690797e09d8">tempest-ServerDiskConfigTestJSON-592691466</nova:project>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:57:20 compute-0 nova_compute[186544]:         <nova:port uuid="3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2">
Nov 22 07:57:20 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <system>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <entry name="serial">c6cd5fec-f214-4bbc-b854-9e16c9a7577a</entry>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <entry name="uuid">c6cd5fec-f214-4bbc-b854-9e16c9a7577a</entry>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </system>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <os>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   </os>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <features>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   </features>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/disk.config"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:27:44:b9"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <target dev="tap3640f80e-71"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a/console.log" append="off"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <video>
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </video>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:57:20 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:57:20 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:57:20 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:57:20 compute-0 nova_compute[186544]: </domain>
Nov 22 07:57:20 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.355 186548 DEBUG nova.virt.libvirt.vif [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772508279',display_name='tempest-ServerDiskConfigTestJSON-server-772508279',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772508279',id=76,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:56:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-0n3qm4qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image
_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:12Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=c6cd5fec-f214-4bbc-b854-9e16c9a7577a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:27:44:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.355 186548 DEBUG nova.network.os_vif_util [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "vif_mac": "fa:16:3e:27:44:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.356 186548 DEBUG nova.network.os_vif_util [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.359 186548 DEBUG os_vif [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.360 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.361 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.361 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.365 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.365 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3640f80e-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.365 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3640f80e-71, col_values=(('external_ids', {'iface-id': '3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:44:b9', 'vm-uuid': 'c6cd5fec-f214-4bbc-b854-9e16c9a7577a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.368 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:20 compute-0 NetworkManager[55036]: <info>  [1763798240.3691] manager: (tap3640f80e-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.371 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.376 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.377 186548 INFO os_vif [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71')
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.450 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.560 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.561 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.561 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] No VIF found with MAC fa:16:3e:27:44:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.562 186548 INFO nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Using config drive
Nov 22 07:57:20 compute-0 kernel: tap3640f80e-71: entered promiscuous mode
Nov 22 07:57:20 compute-0 NetworkManager[55036]: <info>  [1763798240.6447] manager: (tap3640f80e-71): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.645 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:20 compute-0 ovn_controller[94843]: 2025-11-22T07:57:20Z|00304|binding|INFO|Claiming lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for this chassis.
Nov 22 07:57:20 compute-0 ovn_controller[94843]: 2025-11-22T07:57:20Z|00305|binding|INFO|3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2: Claiming fa:16:3e:27:44:b9 10.100.0.12
Nov 22 07:57:20 compute-0 ovn_controller[94843]: 2025-11-22T07:57:20Z|00306|binding|INFO|Setting lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 ovn-installed in OVS
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.662 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:20 compute-0 nova_compute[186544]: 2025-11-22 07:57:20.666 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:20 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 07:57:20 compute-0 systemd-udevd[225641]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:57:20 compute-0 systemd[225508]: Activating special unit Exit the Session...
Nov 22 07:57:20 compute-0 systemd[225508]: Stopped target Main User Target.
Nov 22 07:57:20 compute-0 systemd[225508]: Stopped target Basic System.
Nov 22 07:57:20 compute-0 systemd[225508]: Stopped target Paths.
Nov 22 07:57:20 compute-0 systemd[225508]: Stopped target Sockets.
Nov 22 07:57:20 compute-0 systemd[225508]: Stopped target Timers.
Nov 22 07:57:20 compute-0 systemd[225508]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 07:57:20 compute-0 systemd[225508]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 07:57:20 compute-0 systemd[225508]: Closed D-Bus User Message Bus Socket.
Nov 22 07:57:20 compute-0 systemd[225508]: Stopped Create User's Volatile Files and Directories.
Nov 22 07:57:20 compute-0 systemd[225508]: Removed slice User Application Slice.
Nov 22 07:57:20 compute-0 systemd[225508]: Reached target Shutdown.
Nov 22 07:57:20 compute-0 systemd[225508]: Finished Exit the Session.
Nov 22 07:57:20 compute-0 systemd[225508]: Reached target Exit the Session.
Nov 22 07:57:20 compute-0 systemd-machined[152872]: New machine qemu-37-instance-0000004c.
Nov 22 07:57:20 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 07:57:20 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 07:57:20 compute-0 NetworkManager[55036]: <info>  [1763798240.7005] device (tap3640f80e-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:57:20 compute-0 NetworkManager[55036]: <info>  [1763798240.7015] device (tap3640f80e-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:57:20 compute-0 ovn_controller[94843]: 2025-11-22T07:57:20Z|00307|binding|INFO|Setting lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 up in Southbound
Nov 22 07:57:20 compute-0 systemd[1]: Started Virtual Machine qemu-37-instance-0000004c.
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.707 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:44:b9 10.100.0.12'], port_security=['fa:16:3e:27:44:b9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c6cd5fec-f214-4bbc-b854-9e16c9a7577a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.708 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f bound to our chassis
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.710 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:57:20 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.724 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[62746fcf-2b58-4a97-80b3-e0898d697372]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.726 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd54e232a-51 in ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.728 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd54e232a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.728 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[522ae352-170b-4e32-a91f-67ad24e48be3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.729 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fad623bd-6d40-42ad-8817-612b89c69c91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 07:57:20 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 07:57:20 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 07:57:20 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.743 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[9b00144a-5c73-445e-8acf-b5ebfa739035]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 podman[225622]: 2025-11-22 07:57:20.74742942 +0000 UTC m=+0.108342670 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.756 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f5a032-1f71-4e66-9e17-77b5299beb1a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.791 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6579287c-57ec-43a9-a753-d02c71ac3217]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 systemd-udevd[225648]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.799 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c3476cfc-e3fe-483e-92d5-d7e511a0e961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 NetworkManager[55036]: <info>  [1763798240.8002] manager: (tapd54e232a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/145)
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.832 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f3ebfdbe-1827-4593-984f-07619610623e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.837 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8df36aa6-0942-436f-9e5f-561b3c091a35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 NetworkManager[55036]: <info>  [1763798240.8574] device (tapd54e232a-50): carrier: link connected
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.864 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[04d10a7f-dc9c-4b93-8495-6eb89501ed4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.881 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0900a7d9-e5e4-42d0-9a3e-58120869fa74]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498149, 'reachable_time': 42430, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225682, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.904 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[617914c5-3892-4b0d-91ef-20a1eb89608b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498149, 'tstamp': 498149}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225683, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.924 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae3c1ab-0296-4dc3-b493-02f047c7ac33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd54e232a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498149, 'reachable_time': 42430, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225684, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:20.953 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[715beb9a-b174-4e06-86a2-6b9995a12777]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.007 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[38bc5c8f-5a16-48dd-a1b8-ec8ca93992c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.009 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.009 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.010 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd54e232a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.011 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:21 compute-0 NetworkManager[55036]: <info>  [1763798241.0141] manager: (tapd54e232a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Nov 22 07:57:21 compute-0 kernel: tapd54e232a-50: entered promiscuous mode
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.015 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.021 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd54e232a-50, col_values=(('external_ids', {'iface-id': 'bab7bafe-e92a-4e88-a16b-e3bd78ab8944'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.023 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:21 compute-0 ovn_controller[94843]: 2025-11-22T07:57:21Z|00308|binding|INFO|Releasing lport bab7bafe-e92a-4e88-a16b-e3bd78ab8944 from this chassis (sb_readonly=0)
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.024 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.032 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.033 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3593aac9-49dc-4af2-8c39-943ca262a7dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.034 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/d54e232a-5c68-4cc7-b58c-054da9c4646f.pid.haproxy
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID d54e232a-5c68-4cc7-b58c-054da9c4646f
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:57:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:21.035 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'env', 'PROCESS_TAG=haproxy-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d54e232a-5c68-4cc7-b58c-054da9c4646f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.041 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.326 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798241.3247967, c6cd5fec-f214-4bbc-b854-9e16c9a7577a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.326 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] VM Resumed (Lifecycle Event)
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.329 186548 DEBUG nova.compute.manager [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.333 186548 INFO nova.virt.libvirt.driver [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance running successfully.
Nov 22 07:57:21 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.335 186548 DEBUG nova.virt.libvirt.guest [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.336 186548 DEBUG nova.virt.libvirt.driver [None req-b4bb0d73-81d5-4bea-a8c0-79d57034b125 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.366 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.369 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.393 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.393 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798241.3294346, c6cd5fec-f214-4bbc-b854-9e16c9a7577a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.393 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] VM Started (Lifecycle Event)
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.418 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.422 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:57:21 compute-0 nova_compute[186544]: 2025-11-22 07:57:21.443 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 22 07:57:21 compute-0 podman[225723]: 2025-11-22 07:57:21.376562749 +0000 UTC m=+0.023722335 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:57:22 compute-0 podman[225723]: 2025-11-22 07:57:22.052121952 +0000 UTC m=+0.699281498 container create 2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 07:57:22 compute-0 systemd[1]: Started libpod-conmon-2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97.scope.
Nov 22 07:57:22 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:57:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ae59c9d4de7519c963a04486a7dc14433b7cafb9f52ef523f435ad9e697f387/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:57:22 compute-0 podman[225723]: 2025-11-22 07:57:22.327035555 +0000 UTC m=+0.974195101 container init 2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 07:57:22 compute-0 podman[225723]: 2025-11-22 07:57:22.334438177 +0000 UTC m=+0.981597723 container start 2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 07:57:22 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[225739]: [NOTICE]   (225743) : New worker (225745) forked
Nov 22 07:57:22 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[225739]: [NOTICE]   (225743) : Loading success.
Nov 22 07:57:22 compute-0 nova_compute[186544]: 2025-11-22 07:57:22.840 186548 DEBUG nova.compute.manager [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:22 compute-0 nova_compute[186544]: 2025-11-22 07:57:22.840 186548 DEBUG oslo_concurrency.lockutils [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:22 compute-0 nova_compute[186544]: 2025-11-22 07:57:22.841 186548 DEBUG oslo_concurrency.lockutils [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:22 compute-0 nova_compute[186544]: 2025-11-22 07:57:22.841 186548 DEBUG oslo_concurrency.lockutils [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:22 compute-0 nova_compute[186544]: 2025-11-22 07:57:22.841 186548 DEBUG nova.compute.manager [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:57:22 compute-0 nova_compute[186544]: 2025-11-22 07:57:22.841 186548 WARNING nova.compute.manager [req-b3b183d2-a7d5-4012-88ae-84f9db16fc68 req-29c0aaa5-3ec8-44ba-b89e-47616e3e72fe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state resized and task_state None.
Nov 22 07:57:24 compute-0 nova_compute[186544]: 2025-11-22 07:57:24.970 186548 DEBUG nova.compute.manager [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:24 compute-0 nova_compute[186544]: 2025-11-22 07:57:24.970 186548 DEBUG oslo_concurrency.lockutils [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:24 compute-0 nova_compute[186544]: 2025-11-22 07:57:24.970 186548 DEBUG oslo_concurrency.lockutils [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:24 compute-0 nova_compute[186544]: 2025-11-22 07:57:24.970 186548 DEBUG oslo_concurrency.lockutils [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:24 compute-0 nova_compute[186544]: 2025-11-22 07:57:24.971 186548 DEBUG nova.compute.manager [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:57:24 compute-0 nova_compute[186544]: 2025-11-22 07:57:24.971 186548 WARNING nova.compute.manager [req-92925060-f0e6-4e75-9afc-abd6be7dc557 req-79dd736e-65a7-46d0-82b3-fcbfaec71001 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state resized and task_state None.
Nov 22 07:57:25 compute-0 nova_compute[186544]: 2025-11-22 07:57:25.368 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:25 compute-0 nova_compute[186544]: 2025-11-22 07:57:25.456 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:26 compute-0 nova_compute[186544]: 2025-11-22 07:57:26.057 186548 DEBUG nova.network.neutron [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updated VIF entry in instance network info cache for port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:57:26 compute-0 nova_compute[186544]: 2025-11-22 07:57:26.057 186548 DEBUG nova.network.neutron [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [{"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:57:26 compute-0 nova_compute[186544]: 2025-11-22 07:57:26.071 186548 DEBUG oslo_concurrency.lockutils [req-d5e6e6d0-0099-413e-98b6-0f8c1c274975 req-fdb09782-6fb5-47bd-ae2a-2d7a44d8e7ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c6cd5fec-f214-4bbc-b854-9e16c9a7577a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:57:26 compute-0 podman[225755]: 2025-11-22 07:57:26.413310042 +0000 UTC m=+0.061531987 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 07:57:27 compute-0 nova_compute[186544]: 2025-11-22 07:57:27.376 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:30 compute-0 nova_compute[186544]: 2025-11-22 07:57:30.370 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:30 compute-0 podman[225776]: 2025-11-22 07:57:30.407029118 +0000 UTC m=+0.051351545 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:57:30 compute-0 nova_compute[186544]: 2025-11-22 07:57:30.456 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.143 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.144 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.145 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.145 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.145 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.153 186548 INFO nova.compute.manager [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Terminating instance
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.162 186548 DEBUG nova.compute.manager [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:57:33 compute-0 kernel: tap3640f80e-71 (unregistering): left promiscuous mode
Nov 22 07:57:33 compute-0 NetworkManager[55036]: <info>  [1763798253.1814] device (tap3640f80e-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.191 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:33 compute-0 ovn_controller[94843]: 2025-11-22T07:57:33Z|00309|binding|INFO|Releasing lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 from this chassis (sb_readonly=0)
Nov 22 07:57:33 compute-0 ovn_controller[94843]: 2025-11-22T07:57:33Z|00310|binding|INFO|Setting lport 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 down in Southbound
Nov 22 07:57:33 compute-0 ovn_controller[94843]: 2025-11-22T07:57:33Z|00311|binding|INFO|Removing iface tap3640f80e-71 ovn-installed in OVS
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.204 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:33.239 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:44:b9 10.100.0.12'], port_security=['fa:16:3e:27:44:b9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c6cd5fec-f214-4bbc-b854-9e16c9a7577a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '063bf16c91af408ca075c690797e09d8', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f7dee984-8c58-404b-bb82-9a414aef709d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3245992a-6d76-4250-aff6-2ef09c2bac10, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:33.241 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 in datapath d54e232a-5c68-4cc7-b58c-054da9c4646f unbound from our chassis
Nov 22 07:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:33.242 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d54e232a-5c68-4cc7-b58c-054da9c4646f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:33.243 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[019ca202-7583-43fc-93d5-479e54989ee1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:33.244 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f namespace which is not needed anymore
Nov 22 07:57:33 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Nov 22 07:57:33 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004c.scope: Consumed 12.564s CPU time.
Nov 22 07:57:33 compute-0 systemd-machined[152872]: Machine qemu-37-instance-0000004c terminated.
Nov 22 07:57:33 compute-0 podman[225801]: 2025-11-22 07:57:33.263309274 +0000 UTC m=+0.054982915 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.419 186548 INFO nova.virt.libvirt.driver [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Instance destroyed successfully.
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.420 186548 DEBUG nova.objects.instance [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lazy-loading 'resources' on Instance uuid c6cd5fec-f214-4bbc-b854-9e16c9a7577a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.431 186548 DEBUG nova.virt.libvirt.vif [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:56:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-772508279',display_name='tempest-ServerDiskConfigTestJSON-server-772508279',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-772508279',id=76,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:57:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='063bf16c91af408ca075c690797e09d8',ramdisk_id='',reservation_id='r-0n3qm4qh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-592691466',owner_user_name='tempest-ServerDiskConfigTestJSON-592691466-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:57:30Z,user_data=None,user_id='e24c302b62fb470aa189b76d4676733b',uuid=c6cd5fec-f214-4bbc-b854-9e16c9a7577a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.432 186548 DEBUG nova.network.os_vif_util [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converting VIF {"id": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "address": "fa:16:3e:27:44:b9", "network": {"id": "d54e232a-5c68-4cc7-b58c-054da9c4646f", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-405999941-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "063bf16c91af408ca075c690797e09d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3640f80e-71", "ovs_interfaceid": "3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.432 186548 DEBUG nova.network.os_vif_util [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.433 186548 DEBUG os_vif [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.434 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.434 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3640f80e-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.436 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.441 186548 INFO os_vif [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:44:b9,bridge_name='br-int',has_traffic_filtering=True,id=3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2,network=Network(d54e232a-5c68-4cc7-b58c-054da9c4646f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3640f80e-71')
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.442 186548 INFO nova.virt.libvirt.driver [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Deleting instance files /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_del
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.447 186548 INFO nova.virt.libvirt.driver [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Deletion of /var/lib/nova/instances/c6cd5fec-f214-4bbc-b854-9e16c9a7577a_del complete
Nov 22 07:57:33 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[225739]: [NOTICE]   (225743) : haproxy version is 2.8.14-c23fe91
Nov 22 07:57:33 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[225739]: [NOTICE]   (225743) : path to executable is /usr/sbin/haproxy
Nov 22 07:57:33 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[225739]: [ALERT]    (225743) : Current worker (225745) exited with code 143 (Terminated)
Nov 22 07:57:33 compute-0 neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f[225739]: [WARNING]  (225743) : All workers exited. Exiting... (0)
Nov 22 07:57:33 compute-0 systemd[1]: libpod-2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97.scope: Deactivated successfully.
Nov 22 07:57:33 compute-0 podman[225846]: 2025-11-22 07:57:33.659981248 +0000 UTC m=+0.336355429 container died 2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.704 186548 INFO nova.compute.manager [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Took 0.54 seconds to destroy the instance on the hypervisor.
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.705 186548 DEBUG oslo.service.loopingcall [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.705 186548 DEBUG nova.compute.manager [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:57:33 compute-0 nova_compute[186544]: 2025-11-22 07:57:33.705 186548 DEBUG nova.network.neutron [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:57:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97-userdata-shm.mount: Deactivated successfully.
Nov 22 07:57:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-5ae59c9d4de7519c963a04486a7dc14433b7cafb9f52ef523f435ad9e697f387-merged.mount: Deactivated successfully.
Nov 22 07:57:34 compute-0 nova_compute[186544]: 2025-11-22 07:57:34.056 186548 DEBUG nova.compute.manager [req-98a659e2-e1fe-48ab-934c-87a3b828b3a8 req-2da2d737-a5bc-4278-95f5-f33f14dad7f7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:34 compute-0 nova_compute[186544]: 2025-11-22 07:57:34.057 186548 DEBUG oslo_concurrency.lockutils [req-98a659e2-e1fe-48ab-934c-87a3b828b3a8 req-2da2d737-a5bc-4278-95f5-f33f14dad7f7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:34 compute-0 nova_compute[186544]: 2025-11-22 07:57:34.058 186548 DEBUG oslo_concurrency.lockutils [req-98a659e2-e1fe-48ab-934c-87a3b828b3a8 req-2da2d737-a5bc-4278-95f5-f33f14dad7f7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:34 compute-0 nova_compute[186544]: 2025-11-22 07:57:34.058 186548 DEBUG oslo_concurrency.lockutils [req-98a659e2-e1fe-48ab-934c-87a3b828b3a8 req-2da2d737-a5bc-4278-95f5-f33f14dad7f7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:34 compute-0 nova_compute[186544]: 2025-11-22 07:57:34.058 186548 DEBUG nova.compute.manager [req-98a659e2-e1fe-48ab-934c-87a3b828b3a8 req-2da2d737-a5bc-4278-95f5-f33f14dad7f7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:57:34 compute-0 nova_compute[186544]: 2025-11-22 07:57:34.059 186548 DEBUG nova.compute.manager [req-98a659e2-e1fe-48ab-934c-87a3b828b3a8 req-2da2d737-a5bc-4278-95f5-f33f14dad7f7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-unplugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:57:34 compute-0 podman[225846]: 2025-11-22 07:57:34.305013868 +0000 UTC m=+0.981388049 container cleanup 2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:57:34 compute-0 systemd[1]: libpod-conmon-2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97.scope: Deactivated successfully.
Nov 22 07:57:35 compute-0 podman[225894]: 2025-11-22 07:57:35.225224658 +0000 UTC m=+0.899221194 container remove 2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.230 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[12049319-118c-46f7-b158-2a8394a4e725]: (4, ('Sat Nov 22 07:57:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97)\n2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97\nSat Nov 22 07:57:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f (2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97)\n2e657c96616f58842d26780b1f57b866a3adb96d8f809f28a43cf4040b3b1f97\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.232 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8b95bd7a-d6ee-449d-8fca-096a60c98ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.233 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd54e232a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:35 compute-0 nova_compute[186544]: 2025-11-22 07:57:35.234 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:35 compute-0 kernel: tapd54e232a-50: left promiscuous mode
Nov 22 07:57:35 compute-0 nova_compute[186544]: 2025-11-22 07:57:35.247 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.250 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[46870815-faee-4b25-a52a-82ea73c157a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.272 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b8fe7f02-7b35-4ef0-a815-0e4bb5ae1d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.274 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba9a223-7ef2-4512-a879-a7e51865153f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.288 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ab211d07-85d4-45e9-8477-52ef6eb4cced]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498142, 'reachable_time': 38524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225910, 'error': None, 'target': 'ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:35 compute-0 systemd[1]: run-netns-ovnmeta\x2dd54e232a\x2d5c68\x2d4cc7\x2db58c\x2d054da9c4646f.mount: Deactivated successfully.
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.292 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d54e232a-5c68-4cc7-b58c-054da9c4646f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:35.292 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9f1e98-312b-49b7-b0df-44f0339e064f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:35 compute-0 nova_compute[186544]: 2025-11-22 07:57:35.458 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:37.325 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:37.326 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:37.326 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:38 compute-0 nova_compute[186544]: 2025-11-22 07:57:38.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:39 compute-0 nova_compute[186544]: 2025-11-22 07:57:39.823 186548 DEBUG nova.compute.manager [req-ea554f53-e972-44bd-a1bc-c72745750fa0 req-60911b41-f627-4d68-b1e8-c73a535cd8f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:39 compute-0 nova_compute[186544]: 2025-11-22 07:57:39.823 186548 DEBUG oslo_concurrency.lockutils [req-ea554f53-e972-44bd-a1bc-c72745750fa0 req-60911b41-f627-4d68-b1e8-c73a535cd8f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:39 compute-0 nova_compute[186544]: 2025-11-22 07:57:39.823 186548 DEBUG oslo_concurrency.lockutils [req-ea554f53-e972-44bd-a1bc-c72745750fa0 req-60911b41-f627-4d68-b1e8-c73a535cd8f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:39 compute-0 nova_compute[186544]: 2025-11-22 07:57:39.824 186548 DEBUG oslo_concurrency.lockutils [req-ea554f53-e972-44bd-a1bc-c72745750fa0 req-60911b41-f627-4d68-b1e8-c73a535cd8f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:39 compute-0 nova_compute[186544]: 2025-11-22 07:57:39.824 186548 DEBUG nova.compute.manager [req-ea554f53-e972-44bd-a1bc-c72745750fa0 req-60911b41-f627-4d68-b1e8-c73a535cd8f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] No waiting events found dispatching network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:57:39 compute-0 nova_compute[186544]: 2025-11-22 07:57:39.824 186548 WARNING nova.compute.manager [req-ea554f53-e972-44bd-a1bc-c72745750fa0 req-60911b41-f627-4d68-b1e8-c73a535cd8f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received unexpected event network-vif-plugged-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 for instance with vm_state active and task_state deleting.
Nov 22 07:57:40 compute-0 nova_compute[186544]: 2025-11-22 07:57:40.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:41 compute-0 nova_compute[186544]: 2025-11-22 07:57:41.178 186548 DEBUG nova.network.neutron [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:57:42 compute-0 nova_compute[186544]: 2025-11-22 07:57:42.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:42 compute-0 nova_compute[186544]: 2025-11-22 07:57:42.373 186548 INFO nova.compute.manager [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Took 8.67 seconds to deallocate network for instance.
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.152 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.152 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.159 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.441 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.558 186548 INFO nova.scheduler.client.report [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Deleted allocations for instance c6cd5fec-f214-4bbc-b854-9e16c9a7577a
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.597 186548 DEBUG nova.compute.manager [req-dbaf82ba-149a-4bbd-a1b8-8eedd4af6ca2 req-521668fc-4238-4328-8c0e-06e11f9121ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Received event network-vif-deleted-3640f80e-7151-4f7e-a9c0-aef7d9ba1cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.705 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.705 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.705 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.706 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1da3a852-b732-48ac-886f-c57ced76a3d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:57:43 compute-0 nova_compute[186544]: 2025-11-22 07:57:43.774 186548 DEBUG oslo_concurrency.lockutils [None req-f8aa9cfd-656c-44b7-bf77-9aba6a906ef9 e24c302b62fb470aa189b76d4676733b 063bf16c91af408ca075c690797e09d8 - - default default] Lock "c6cd5fec-f214-4bbc-b854-9e16c9a7577a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:44 compute-0 podman[225911]: 2025-11-22 07:57:44.414363558 +0000 UTC m=+0.060811960 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:57:44 compute-0 podman[225912]: 2025-11-22 07:57:44.436516173 +0000 UTC m=+0.079001587 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 07:57:45 compute-0 nova_compute[186544]: 2025-11-22 07:57:45.464 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.784 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updating instance_info_cache with network_info: [{"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.835 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-1da3a852-b732-48ac-886f-c57ced76a3d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.836 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.836 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.837 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.863 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.863 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.863 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.864 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.924 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.988 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:46 compute-0 nova_compute[186544]: 2025-11-22 07:57:46.990 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.055 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:47.115 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:57:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:47.116 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.249 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.250 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5495MB free_disk=73.2861557006836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.251 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.251 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.321 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 1da3a852-b732-48ac-886f-c57ced76a3d5 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.321 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.322 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.370 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.387 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.405 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.406 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.733 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.734 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.734 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.843 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.844 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.863 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.949 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.950 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.957 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:57:47 compute-0 nova_compute[186544]: 2025-11-22 07:57:47.958 186548 INFO nova.compute.claims [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.106 186548 DEBUG nova.compute.provider_tree [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.130 186548 DEBUG nova.scheduler.client.report [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.175 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.176 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.231 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.232 186548 DEBUG nova.network.neutron [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.275 186548 INFO nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.312 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.418 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798253.416804, c6cd5fec-f214-4bbc-b854-9e16c9a7577a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.418 186548 INFO nova.compute.manager [-] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] VM Stopped (Lifecycle Event)
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.444 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.464 186548 DEBUG nova.compute.manager [None req-de011e30-015a-4035-aa89-44a19ae5d73a - - - - - -] [instance: c6cd5fec-f214-4bbc-b854-9e16c9a7577a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.469 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.470 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.471 186548 INFO nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Creating image(s)
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.471 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "/var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.472 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "/var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.472 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "/var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.488 186548 DEBUG nova.policy [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a5a623606e647c183360572aab20b70', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'af3a536766704caaad94e5da2e3b88e2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.491 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.545 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.546 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.547 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.562 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.616 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.617 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.760 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk 1073741824" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.763 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.764 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.826 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.828 186548 DEBUG nova.virt.disk.api [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Checking if we can resize image /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.828 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.880 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.881 186548 DEBUG nova.virt.disk.api [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Cannot resize image /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.882 186548 DEBUG nova.objects.instance [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'migration_context' on Instance uuid a1a9cc11-55d5-4fcd-92a9-12d52d5713d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.897 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.897 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Ensure instance console log exists: /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.898 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.899 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:48 compute-0 nova_compute[186544]: 2025-11-22 07:57:48.899 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:49 compute-0 nova_compute[186544]: 2025-11-22 07:57:49.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:49 compute-0 podman[225976]: 2025-11-22 07:57:49.394869996 +0000 UTC m=+0.047193044 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 07:57:50 compute-0 nova_compute[186544]: 2025-11-22 07:57:50.465 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:51 compute-0 nova_compute[186544]: 2025-11-22 07:57:51.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:51 compute-0 podman[226000]: 2025-11-22 07:57:51.400277689 +0000 UTC m=+0.050301521 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 07:57:51 compute-0 nova_compute[186544]: 2025-11-22 07:57:51.628 186548 DEBUG nova.network.neutron [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Successfully created port: c039cb7c-d1db-493e-acc5-8fc510f48206 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:57:52 compute-0 nova_compute[186544]: 2025-11-22 07:57:52.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:57:53 compute-0 nova_compute[186544]: 2025-11-22 07:57:53.446 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:54 compute-0 nova_compute[186544]: 2025-11-22 07:57:54.403 186548 DEBUG nova.network.neutron [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Successfully updated port: c039cb7c-d1db-493e-acc5-8fc510f48206 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:57:54 compute-0 nova_compute[186544]: 2025-11-22 07:57:54.417 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:57:54 compute-0 nova_compute[186544]: 2025-11-22 07:57:54.417 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquired lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:57:54 compute-0 nova_compute[186544]: 2025-11-22 07:57:54.417 186548 DEBUG nova.network.neutron [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:57:54 compute-0 nova_compute[186544]: 2025-11-22 07:57:54.641 186548 DEBUG nova.network.neutron [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:57:54 compute-0 nova_compute[186544]: 2025-11-22 07:57:54.825 186548 DEBUG nova.compute.manager [req-dc374b72-8979-4c58-9045-cac70655da68 req-f3903cfd-99c9-4dd9-b5fe-5eaa284b6eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received event network-changed-c039cb7c-d1db-493e-acc5-8fc510f48206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:54 compute-0 nova_compute[186544]: 2025-11-22 07:57:54.825 186548 DEBUG nova.compute.manager [req-dc374b72-8979-4c58-9045-cac70655da68 req-f3903cfd-99c9-4dd9-b5fe-5eaa284b6eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Refreshing instance network info cache due to event network-changed-c039cb7c-d1db-493e-acc5-8fc510f48206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:57:54 compute-0 nova_compute[186544]: 2025-11-22 07:57:54.826 186548 DEBUG oslo_concurrency.lockutils [req-dc374b72-8979-4c58-9045-cac70655da68 req-f3903cfd-99c9-4dd9-b5fe-5eaa284b6eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:57:55 compute-0 nova_compute[186544]: 2025-11-22 07:57:55.466 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:57.119 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.135 186548 DEBUG nova.network.neutron [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Updating instance_info_cache with network_info: [{"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.156 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Releasing lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.157 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Instance network_info: |[{"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.157 186548 DEBUG oslo_concurrency.lockutils [req-dc374b72-8979-4c58-9045-cac70655da68 req-f3903cfd-99c9-4dd9-b5fe-5eaa284b6eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.157 186548 DEBUG nova.network.neutron [req-dc374b72-8979-4c58-9045-cac70655da68 req-f3903cfd-99c9-4dd9-b5fe-5eaa284b6eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Refreshing network info cache for port c039cb7c-d1db-493e-acc5-8fc510f48206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.161 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Start _get_guest_xml network_info=[{"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:57:57 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.165 186548 WARNING nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.176 186548 DEBUG nova.virt.libvirt.host [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.177 186548 DEBUG nova.virt.libvirt.host [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.182 186548 DEBUG nova.virt.libvirt.host [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.183 186548 DEBUG nova.virt.libvirt.host [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.184 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.184 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.185 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.185 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.185 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.185 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.186 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.186 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.186 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.186 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.186 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.187 186548 DEBUG nova.virt.hardware [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.190 186548 DEBUG nova.virt.libvirt.vif [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:57:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-985044045',display_name='tempest-ServerActionsTestOtherA-server-985044045',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-985044045',id=79,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-fl4d5pn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerActionsTest
OtherA-1599563713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:48Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=a1a9cc11-55d5-4fcd-92a9-12d52d5713d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.191 186548 DEBUG nova.network.os_vif_util [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.192 186548 DEBUG nova.network.os_vif_util [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:7a:a8,bridge_name='br-int',has_traffic_filtering=True,id=c039cb7c-d1db-493e-acc5-8fc510f48206,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc039cb7c-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.193 186548 DEBUG nova.objects.instance [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid a1a9cc11-55d5-4fcd-92a9-12d52d5713d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.203 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <uuid>a1a9cc11-55d5-4fcd-92a9-12d52d5713d6</uuid>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <name>instance-0000004f</name>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestOtherA-server-985044045</nova:name>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:57:57</nova:creationTime>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:57:57 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:57:57 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:57:57 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:57:57 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:57:57 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:57:57 compute-0 nova_compute[186544]:         <nova:user uuid="5a5a623606e647c183360572aab20b70">tempest-ServerActionsTestOtherA-1599563713-project-member</nova:user>
Nov 22 07:57:57 compute-0 nova_compute[186544]:         <nova:project uuid="af3a536766704caaad94e5da2e3b88e2">tempest-ServerActionsTestOtherA-1599563713</nova:project>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:57:57 compute-0 nova_compute[186544]:         <nova:port uuid="c039cb7c-d1db-493e-acc5-8fc510f48206">
Nov 22 07:57:57 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <system>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <entry name="serial">a1a9cc11-55d5-4fcd-92a9-12d52d5713d6</entry>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <entry name="uuid">a1a9cc11-55d5-4fcd-92a9-12d52d5713d6</entry>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </system>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <os>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   </os>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <features>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   </features>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk.config"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:34:7a:a8"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <target dev="tapc039cb7c-d1"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/console.log" append="off"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <video>
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </video>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:57:57 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:57:57 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:57:57 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:57:57 compute-0 nova_compute[186544]: </domain>
Nov 22 07:57:57 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.205 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Preparing to wait for external event network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.205 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.205 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.206 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.206 186548 DEBUG nova.virt.libvirt.vif [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:57:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-985044045',display_name='tempest-ServerActionsTestOtherA-server-985044045',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-985044045',id=79,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-fl4d5pn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerA
ctionsTestOtherA-1599563713-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:57:48Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=a1a9cc11-55d5-4fcd-92a9-12d52d5713d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.207 186548 DEBUG nova.network.os_vif_util [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.207 186548 DEBUG nova.network.os_vif_util [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:7a:a8,bridge_name='br-int',has_traffic_filtering=True,id=c039cb7c-d1db-493e-acc5-8fc510f48206,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc039cb7c-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.208 186548 DEBUG os_vif [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:7a:a8,bridge_name='br-int',has_traffic_filtering=True,id=c039cb7c-d1db-493e-acc5-8fc510f48206,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc039cb7c-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.208 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.209 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.209 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.213 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc039cb7c-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.213 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc039cb7c-d1, col_values=(('external_ids', {'iface-id': 'c039cb7c-d1db-493e-acc5-8fc510f48206', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:7a:a8', 'vm-uuid': 'a1a9cc11-55d5-4fcd-92a9-12d52d5713d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.214 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:57 compute-0 NetworkManager[55036]: <info>  [1763798277.2160] manager: (tapc039cb7c-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.217 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.221 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.222 186548 INFO os_vif [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:7a:a8,bridge_name='br-int',has_traffic_filtering=True,id=c039cb7c-d1db-493e-acc5-8fc510f48206,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc039cb7c-d1')
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.403 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.404 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.404 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] No VIF found with MAC fa:16:3e:34:7a:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.404 186548 INFO nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Using config drive
Nov 22 07:57:57 compute-0 podman[226023]: 2025-11-22 07:57:57.422901708 +0000 UTC m=+0.073847430 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.828 186548 INFO nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Creating config drive at /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk.config
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.833 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphrqoq9wz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:57:57 compute-0 nova_compute[186544]: 2025-11-22 07:57:57.956 186548 DEBUG oslo_concurrency.processutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphrqoq9wz" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:57:58 compute-0 kernel: tapc039cb7c-d1: entered promiscuous mode
Nov 22 07:57:58 compute-0 NetworkManager[55036]: <info>  [1763798278.0141] manager: (tapc039cb7c-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/148)
Nov 22 07:57:58 compute-0 ovn_controller[94843]: 2025-11-22T07:57:58Z|00312|binding|INFO|Claiming lport c039cb7c-d1db-493e-acc5-8fc510f48206 for this chassis.
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.015 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:58 compute-0 ovn_controller[94843]: 2025-11-22T07:57:58Z|00313|binding|INFO|c039cb7c-d1db-493e-acc5-8fc510f48206: Claiming fa:16:3e:34:7a:a8 10.100.0.12
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.024 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:7a:a8 10.100.0.12'], port_security=['fa:16:3e:34:7a:a8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a1a9cc11-55d5-4fcd-92a9-12d52d5713d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af3a536766704caaad94e5da2e3b88e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80d022ea-fcc6-47bf-8d54-551da59f082d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cd2a902-e9cb-4e2e-893e-0a2e3b043ce7, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=c039cb7c-d1db-493e-acc5-8fc510f48206) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.025 103805 INFO neutron.agent.ovn.metadata.agent [-] Port c039cb7c-d1db-493e-acc5-8fc510f48206 in datapath a2b438ab-8fa8-4627-8c04-99bed701c19e bound to our chassis
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.026 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 07:57:58 compute-0 ovn_controller[94843]: 2025-11-22T07:57:58Z|00314|binding|INFO|Setting lport c039cb7c-d1db-493e-acc5-8fc510f48206 ovn-installed in OVS
Nov 22 07:57:58 compute-0 ovn_controller[94843]: 2025-11-22T07:57:58Z|00315|binding|INFO|Setting lport c039cb7c-d1db-493e-acc5-8fc510f48206 up in Southbound
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.032 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.045 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fdbdcfa7-d802-44eb-86d2-99b8f2ee9bd3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:58 compute-0 systemd-udevd[226060]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:57:58 compute-0 systemd-machined[152872]: New machine qemu-38-instance-0000004f.
Nov 22 07:57:58 compute-0 NetworkManager[55036]: <info>  [1763798278.0647] device (tapc039cb7c-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:57:58 compute-0 NetworkManager[55036]: <info>  [1763798278.0661] device (tapc039cb7c-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:57:58 compute-0 systemd[1]: Started Virtual Machine qemu-38-instance-0000004f.
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.076 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[414e4b0d-28d5-4fe1-84df-15a968c08ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.080 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9e6a15b9-2863-4ab3-b224-985bd457f0e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.108 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[533290aa-479a-406d-8cac-08c1d1d91d20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.125 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[030f1178-6deb-4885-8631-b3223d209e94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b438ab-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f4:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489519, 'reachable_time': 35697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226072, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.142 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[be1d778e-3fee-4627-ae19-8b9d5a6b5686]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa2b438ab-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489530, 'tstamp': 489530}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226074, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa2b438ab-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489534, 'tstamp': 489534}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226074, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.145 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b438ab-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.147 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.149 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.149 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2b438ab-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.149 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.149 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2b438ab-80, col_values=(('external_ids', {'iface-id': '1f7bc015-fb2f-41a5-82bb-16526b7a95f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:57:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:57:58.150 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:57:58 compute-0 ovn_controller[94843]: 2025-11-22T07:57:58Z|00316|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.585 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.680 186548 DEBUG nova.compute.manager [req-eba16706-f07e-4df5-ba65-780b364d858e req-86f5033c-3795-40f3-a54e-4a333527a5cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received event network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.681 186548 DEBUG oslo_concurrency.lockutils [req-eba16706-f07e-4df5-ba65-780b364d858e req-86f5033c-3795-40f3-a54e-4a333527a5cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.682 186548 DEBUG oslo_concurrency.lockutils [req-eba16706-f07e-4df5-ba65-780b364d858e req-86f5033c-3795-40f3-a54e-4a333527a5cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.682 186548 DEBUG oslo_concurrency.lockutils [req-eba16706-f07e-4df5-ba65-780b364d858e req-86f5033c-3795-40f3-a54e-4a333527a5cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.683 186548 DEBUG nova.compute.manager [req-eba16706-f07e-4df5-ba65-780b364d858e req-86f5033c-3795-40f3-a54e-4a333527a5cd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Processing event network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.756 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798278.7564125, a1a9cc11-55d5-4fcd-92a9-12d52d5713d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.757 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] VM Started (Lifecycle Event)
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.759 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.764 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.769 186548 INFO nova.virt.libvirt.driver [-] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Instance spawned successfully.
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.771 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.776 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.781 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.791 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.792 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.792 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.792 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.793 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.793 186548 DEBUG nova.virt.libvirt.driver [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.822 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.823 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798278.7565672, a1a9cc11-55d5-4fcd-92a9-12d52d5713d6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.823 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] VM Paused (Lifecycle Event)
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.850 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.854 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798278.762594, a1a9cc11-55d5-4fcd-92a9-12d52d5713d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.855 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] VM Resumed (Lifecycle Event)
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.875 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.876 186548 INFO nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Took 10.41 seconds to spawn the instance on the hypervisor.
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.877 186548 DEBUG nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.879 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.901 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.954 186548 INFO nova.compute.manager [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Took 11.04 seconds to build instance.
Nov 22 07:57:58 compute-0 nova_compute[186544]: 2025-11-22 07:57:58.979 186548 DEBUG oslo_concurrency.lockutils [None req-5c5c33c1-6807-478b-8462-575d3ca35e4a 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:57:59 compute-0 nova_compute[186544]: 2025-11-22 07:57:59.106 186548 DEBUG nova.network.neutron [req-dc374b72-8979-4c58-9045-cac70655da68 req-f3903cfd-99c9-4dd9-b5fe-5eaa284b6eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Updated VIF entry in instance network info cache for port c039cb7c-d1db-493e-acc5-8fc510f48206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:57:59 compute-0 nova_compute[186544]: 2025-11-22 07:57:59.106 186548 DEBUG nova.network.neutron [req-dc374b72-8979-4c58-9045-cac70655da68 req-f3903cfd-99c9-4dd9-b5fe-5eaa284b6eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Updating instance_info_cache with network_info: [{"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:57:59 compute-0 nova_compute[186544]: 2025-11-22 07:57:59.128 186548 DEBUG oslo_concurrency.lockutils [req-dc374b72-8979-4c58-9045-cac70655da68 req-f3903cfd-99c9-4dd9-b5fe-5eaa284b6eb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:00 compute-0 nova_compute[186544]: 2025-11-22 07:58:00.468 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:00 compute-0 nova_compute[186544]: 2025-11-22 07:58:00.805 186548 DEBUG nova.compute.manager [req-998c29af-fca7-4092-bfa0-b0b6bb97f272 req-c9d09857-b480-4721-a07f-1f38b13f4b16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received event network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:00 compute-0 nova_compute[186544]: 2025-11-22 07:58:00.807 186548 DEBUG oslo_concurrency.lockutils [req-998c29af-fca7-4092-bfa0-b0b6bb97f272 req-c9d09857-b480-4721-a07f-1f38b13f4b16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:00 compute-0 nova_compute[186544]: 2025-11-22 07:58:00.808 186548 DEBUG oslo_concurrency.lockutils [req-998c29af-fca7-4092-bfa0-b0b6bb97f272 req-c9d09857-b480-4721-a07f-1f38b13f4b16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:00 compute-0 nova_compute[186544]: 2025-11-22 07:58:00.808 186548 DEBUG oslo_concurrency.lockutils [req-998c29af-fca7-4092-bfa0-b0b6bb97f272 req-c9d09857-b480-4721-a07f-1f38b13f4b16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:00 compute-0 nova_compute[186544]: 2025-11-22 07:58:00.809 186548 DEBUG nova.compute.manager [req-998c29af-fca7-4092-bfa0-b0b6bb97f272 req-c9d09857-b480-4721-a07f-1f38b13f4b16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] No waiting events found dispatching network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:00 compute-0 nova_compute[186544]: 2025-11-22 07:58:00.809 186548 WARNING nova.compute.manager [req-998c29af-fca7-4092-bfa0-b0b6bb97f272 req-c9d09857-b480-4721-a07f-1f38b13f4b16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received unexpected event network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 for instance with vm_state active and task_state None.
Nov 22 07:58:01 compute-0 podman[226083]: 2025-11-22 07:58:01.412248877 +0000 UTC m=+0.059381925 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:58:02 compute-0 nova_compute[186544]: 2025-11-22 07:58:02.216 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:02 compute-0 nova_compute[186544]: 2025-11-22 07:58:02.985 186548 DEBUG nova.compute.manager [req-be3daf5d-e40e-4e3c-946e-24bd57eb0b46 req-d5f17664-0976-4d72-9dbe-07ce1a96d0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received event network-changed-c039cb7c-d1db-493e-acc5-8fc510f48206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:02 compute-0 nova_compute[186544]: 2025-11-22 07:58:02.986 186548 DEBUG nova.compute.manager [req-be3daf5d-e40e-4e3c-946e-24bd57eb0b46 req-d5f17664-0976-4d72-9dbe-07ce1a96d0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Refreshing instance network info cache due to event network-changed-c039cb7c-d1db-493e-acc5-8fc510f48206. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:58:02 compute-0 nova_compute[186544]: 2025-11-22 07:58:02.986 186548 DEBUG oslo_concurrency.lockutils [req-be3daf5d-e40e-4e3c-946e-24bd57eb0b46 req-d5f17664-0976-4d72-9dbe-07ce1a96d0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:02 compute-0 nova_compute[186544]: 2025-11-22 07:58:02.987 186548 DEBUG oslo_concurrency.lockutils [req-be3daf5d-e40e-4e3c-946e-24bd57eb0b46 req-d5f17664-0976-4d72-9dbe-07ce1a96d0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:02 compute-0 nova_compute[186544]: 2025-11-22 07:58:02.987 186548 DEBUG nova.network.neutron [req-be3daf5d-e40e-4e3c-946e-24bd57eb0b46 req-d5f17664-0976-4d72-9dbe-07ce1a96d0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Refreshing network info cache for port c039cb7c-d1db-493e-acc5-8fc510f48206 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:58:03 compute-0 podman[226109]: 2025-11-22 07:58:03.406612908 +0000 UTC m=+0.053615411 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.137 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.137 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.137 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.138 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.139 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.147 186548 INFO nova.compute.manager [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Terminating instance
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.153 186548 DEBUG nova.compute.manager [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:05 compute-0 kernel: tapc039cb7c-d1 (unregistering): left promiscuous mode
Nov 22 07:58:05 compute-0 NetworkManager[55036]: <info>  [1763798285.1781] device (tapc039cb7c-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.187 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 ovn_controller[94843]: 2025-11-22T07:58:05Z|00317|binding|INFO|Releasing lport c039cb7c-d1db-493e-acc5-8fc510f48206 from this chassis (sb_readonly=0)
Nov 22 07:58:05 compute-0 ovn_controller[94843]: 2025-11-22T07:58:05Z|00318|binding|INFO|Setting lport c039cb7c-d1db-493e-acc5-8fc510f48206 down in Southbound
Nov 22 07:58:05 compute-0 ovn_controller[94843]: 2025-11-22T07:58:05Z|00319|binding|INFO|Removing iface tapc039cb7c-d1 ovn-installed in OVS
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.189 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.201 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:7a:a8 10.100.0.12'], port_security=['fa:16:3e:34:7a:a8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a1a9cc11-55d5-4fcd-92a9-12d52d5713d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af3a536766704caaad94e5da2e3b88e2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cd2a902-e9cb-4e2e-893e-0a2e3b043ce7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=c039cb7c-d1db-493e-acc5-8fc510f48206) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.203 103805 INFO neutron.agent.ovn.metadata.agent [-] Port c039cb7c-d1db-493e-acc5-8fc510f48206 in datapath a2b438ab-8fa8-4627-8c04-99bed701c19e unbound from our chassis
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.204 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2b438ab-8fa8-4627-8c04-99bed701c19e
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.203 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.220 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8be71da2-b6eb-4842-bfdd-fdf125c2fc12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.251 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[bc86a06d-c82a-4ada-88b3-e1749faff39f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:05 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Nov 22 07:58:05 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004f.scope: Consumed 7.090s CPU time.
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.255 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f9b48771-e974-4a00-8be7-7c9e6d9e23b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:05 compute-0 systemd-machined[152872]: Machine qemu-38-instance-0000004f terminated.
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.282 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ac24fb77-8f24-4a3c-bb87-1fb6707f2c81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.300 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[68e268f4-9767-47d0-8bb4-9081f978ab72]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2b438ab-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:f4:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489519, 'reachable_time': 35697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226142, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.305 186548 DEBUG nova.network.neutron [req-be3daf5d-e40e-4e3c-946e-24bd57eb0b46 req-d5f17664-0976-4d72-9dbe-07ce1a96d0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Updated VIF entry in instance network info cache for port c039cb7c-d1db-493e-acc5-8fc510f48206. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.306 186548 DEBUG nova.network.neutron [req-be3daf5d-e40e-4e3c-946e-24bd57eb0b46 req-d5f17664-0976-4d72-9dbe-07ce1a96d0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Updating instance_info_cache with network_info: [{"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.315 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[47d0858f-1ab4-4539-9dbb-23e7a7a6a88e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa2b438ab-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489530, 'tstamp': 489530}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226143, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa2b438ab-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489534, 'tstamp': 489534}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226143, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.317 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b438ab-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.318 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.322 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.322 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2b438ab-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.323 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.323 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2b438ab-80, col_values=(('external_ids', {'iface-id': '1f7bc015-fb2f-41a5-82bb-16526b7a95f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:05.323 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.329 186548 DEBUG oslo_concurrency.lockutils [req-be3daf5d-e40e-4e3c-946e-24bd57eb0b46 req-d5f17664-0976-4d72-9dbe-07ce1a96d0d5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.414 186548 INFO nova.virt.libvirt.driver [-] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Instance destroyed successfully.
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.414 186548 DEBUG nova.objects.instance [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'resources' on Instance uuid a1a9cc11-55d5-4fcd-92a9-12d52d5713d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.424 186548 DEBUG nova.virt.libvirt.vif [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:57:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-985044045',display_name='tempest-ServerActionsTestOtherA-server-985044045',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-985044045',id=79,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:57:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-fl4d5pn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerActionsTestOtherA-1599563713-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:57:58Z,user_data=None,user_id='5a5a623606e647c183360572aab20b70',uuid=a1a9cc11-55d5-4fcd-92a9-12d52d5713d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.425 186548 DEBUG nova.network.os_vif_util [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "c039cb7c-d1db-493e-acc5-8fc510f48206", "address": "fa:16:3e:34:7a:a8", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc039cb7c-d1", "ovs_interfaceid": "c039cb7c-d1db-493e-acc5-8fc510f48206", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.425 186548 DEBUG nova.network.os_vif_util [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:7a:a8,bridge_name='br-int',has_traffic_filtering=True,id=c039cb7c-d1db-493e-acc5-8fc510f48206,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc039cb7c-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.426 186548 DEBUG os_vif [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:7a:a8,bridge_name='br-int',has_traffic_filtering=True,id=c039cb7c-d1db-493e-acc5-8fc510f48206,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc039cb7c-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.427 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.428 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc039cb7c-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.429 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.433 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.436 186548 DEBUG nova.compute.manager [req-4641b1d1-26b6-4950-b16d-4fc1e8548ab7 req-088a4c9e-4bc8-4839-9e79-8ec0ef99a812 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received event network-vif-unplugged-c039cb7c-d1db-493e-acc5-8fc510f48206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.437 186548 DEBUG oslo_concurrency.lockutils [req-4641b1d1-26b6-4950-b16d-4fc1e8548ab7 req-088a4c9e-4bc8-4839-9e79-8ec0ef99a812 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.437 186548 DEBUG oslo_concurrency.lockutils [req-4641b1d1-26b6-4950-b16d-4fc1e8548ab7 req-088a4c9e-4bc8-4839-9e79-8ec0ef99a812 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.437 186548 DEBUG oslo_concurrency.lockutils [req-4641b1d1-26b6-4950-b16d-4fc1e8548ab7 req-088a4c9e-4bc8-4839-9e79-8ec0ef99a812 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.437 186548 DEBUG nova.compute.manager [req-4641b1d1-26b6-4950-b16d-4fc1e8548ab7 req-088a4c9e-4bc8-4839-9e79-8ec0ef99a812 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] No waiting events found dispatching network-vif-unplugged-c039cb7c-d1db-493e-acc5-8fc510f48206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.437 186548 DEBUG nova.compute.manager [req-4641b1d1-26b6-4950-b16d-4fc1e8548ab7 req-088a4c9e-4bc8-4839-9e79-8ec0ef99a812 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received event network-vif-unplugged-c039cb7c-d1db-493e-acc5-8fc510f48206 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.440 186548 INFO os_vif [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:7a:a8,bridge_name='br-int',has_traffic_filtering=True,id=c039cb7c-d1db-493e-acc5-8fc510f48206,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc039cb7c-d1')
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.441 186548 INFO nova.virt.libvirt.driver [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Deleting instance files /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6_del
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.441 186548 INFO nova.virt.libvirt.driver [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Deletion of /var/lib/nova/instances/a1a9cc11-55d5-4fcd-92a9-12d52d5713d6_del complete
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.468 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.555 186548 INFO nova.compute.manager [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.556 186548 DEBUG oslo.service.loopingcall [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.556 186548 DEBUG nova.compute.manager [-] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:58:05 compute-0 nova_compute[186544]: 2025-11-22 07:58:05.556 186548 DEBUG nova.network.neutron [-] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:58:06 compute-0 nova_compute[186544]: 2025-11-22 07:58:06.852 186548 DEBUG nova.network.neutron [-] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:06 compute-0 nova_compute[186544]: 2025-11-22 07:58:06.910 186548 DEBUG nova.compute.manager [req-06b4dea9-130f-495e-9937-5a939f355476 req-7f7fabd3-62cf-4a15-a380-a7a713cacdd8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received event network-vif-deleted-c039cb7c-d1db-493e-acc5-8fc510f48206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:06 compute-0 nova_compute[186544]: 2025-11-22 07:58:06.911 186548 INFO nova.compute.manager [req-06b4dea9-130f-495e-9937-5a939f355476 req-7f7fabd3-62cf-4a15-a380-a7a713cacdd8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Neutron deleted interface c039cb7c-d1db-493e-acc5-8fc510f48206; detaching it from the instance and deleting it from the info cache
Nov 22 07:58:06 compute-0 nova_compute[186544]: 2025-11-22 07:58:06.911 186548 DEBUG nova.network.neutron [req-06b4dea9-130f-495e-9937-5a939f355476 req-7f7fabd3-62cf-4a15-a380-a7a713cacdd8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:06 compute-0 nova_compute[186544]: 2025-11-22 07:58:06.938 186548 INFO nova.compute.manager [-] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Took 1.38 seconds to deallocate network for instance.
Nov 22 07:58:06 compute-0 nova_compute[186544]: 2025-11-22 07:58:06.944 186548 DEBUG nova.compute.manager [req-06b4dea9-130f-495e-9937-5a939f355476 req-7f7fabd3-62cf-4a15-a380-a7a713cacdd8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Detach interface failed, port_id=c039cb7c-d1db-493e-acc5-8fc510f48206, reason: Instance a1a9cc11-55d5-4fcd-92a9-12d52d5713d6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.032 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.032 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.122 186548 DEBUG nova.compute.provider_tree [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.140 186548 DEBUG nova.scheduler.client.report [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.161 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.185 186548 INFO nova.scheduler.client.report [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Deleted allocations for instance a1a9cc11-55d5-4fcd-92a9-12d52d5713d6
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.243 186548 DEBUG oslo_concurrency.lockutils [None req-a14c836e-83e0-4ce5-bc4b-3f4cd22aa4dc 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:07 compute-0 ovn_controller[94843]: 2025-11-22T07:58:07Z|00320|binding|INFO|Releasing lport 1f7bc015-fb2f-41a5-82bb-16526b7a95f0 from this chassis (sb_readonly=0)
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.305 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.626 186548 DEBUG nova.compute.manager [req-41d205cd-08b5-44fd-889c-2b518a9b8352 req-674d29d6-e9f2-4c32-9cca-e3a0c800ff1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received event network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.626 186548 DEBUG oslo_concurrency.lockutils [req-41d205cd-08b5-44fd-889c-2b518a9b8352 req-674d29d6-e9f2-4c32-9cca-e3a0c800ff1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.626 186548 DEBUG oslo_concurrency.lockutils [req-41d205cd-08b5-44fd-889c-2b518a9b8352 req-674d29d6-e9f2-4c32-9cca-e3a0c800ff1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.627 186548 DEBUG oslo_concurrency.lockutils [req-41d205cd-08b5-44fd-889c-2b518a9b8352 req-674d29d6-e9f2-4c32-9cca-e3a0c800ff1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a1a9cc11-55d5-4fcd-92a9-12d52d5713d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.627 186548 DEBUG nova.compute.manager [req-41d205cd-08b5-44fd-889c-2b518a9b8352 req-674d29d6-e9f2-4c32-9cca-e3a0c800ff1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] No waiting events found dispatching network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:07 compute-0 nova_compute[186544]: 2025-11-22 07:58:07.627 186548 WARNING nova.compute.manager [req-41d205cd-08b5-44fd-889c-2b518a9b8352 req-674d29d6-e9f2-4c32-9cca-e3a0c800ff1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Received unexpected event network-vif-plugged-c039cb7c-d1db-493e-acc5-8fc510f48206 for instance with vm_state deleted and task_state None.
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.219 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "1da3a852-b732-48ac-886f-c57ced76a3d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.220 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.220 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.220 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.220 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.228 186548 INFO nova.compute.manager [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Terminating instance
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.233 186548 DEBUG nova.compute.manager [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:58:08 compute-0 kernel: tapab338966-6e (unregistering): left promiscuous mode
Nov 22 07:58:08 compute-0 NetworkManager[55036]: <info>  [1763798288.3213] device (tapab338966-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:58:08 compute-0 ovn_controller[94843]: 2025-11-22T07:58:08Z|00321|binding|INFO|Releasing lport ab338966-6ece-477c-af70-298f992f0c8f from this chassis (sb_readonly=0)
Nov 22 07:58:08 compute-0 ovn_controller[94843]: 2025-11-22T07:58:08Z|00322|binding|INFO|Setting lport ab338966-6ece-477c-af70-298f992f0c8f down in Southbound
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.327 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:08 compute-0 ovn_controller[94843]: 2025-11-22T07:58:08Z|00323|binding|INFO|Removing iface tapab338966-6e ovn-installed in OVS
Nov 22 07:58:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:08.335 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:15:d6 10.100.0.3'], port_security=['fa:16:3e:63:15:d6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1da3a852-b732-48ac-886f-c57ced76a3d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'af3a536766704caaad94e5da2e3b88e2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '55e3254c-8a80-4bbd-950d-1b4f6cb50116', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.248'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cd2a902-e9cb-4e2e-893e-0a2e3b043ce7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=ab338966-6ece-477c-af70-298f992f0c8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:08.337 103805 INFO neutron.agent.ovn.metadata.agent [-] Port ab338966-6ece-477c-af70-298f992f0c8f in datapath a2b438ab-8fa8-4627-8c04-99bed701c19e unbound from our chassis
Nov 22 07:58:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:08.339 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2b438ab-8fa8-4627-8c04-99bed701c19e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:58:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:08.340 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec7fb90-bb88-4ef2-8b47-035bf54bf371]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:08.341 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e namespace which is not needed anymore
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.344 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:08 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000048.scope: Deactivated successfully.
Nov 22 07:58:08 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000048.scope: Consumed 19.132s CPU time.
Nov 22 07:58:08 compute-0 systemd-machined[152872]: Machine qemu-35-instance-00000048 terminated.
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.454 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.458 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.494 186548 INFO nova.virt.libvirt.driver [-] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Instance destroyed successfully.
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.495 186548 DEBUG nova.objects.instance [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lazy-loading 'resources' on Instance uuid 1da3a852-b732-48ac-886f-c57ced76a3d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.518 186548 DEBUG nova.virt.libvirt.vif [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:55:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-824804646',display_name='tempest-ServerActionsTestOtherA-server-824804646',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-824804646',id=72,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAZKpzQAL6ILshnC6rB6cFyDuv4pMips+ZeoETDONp6x88d0LohWezNpcKrkoSs9Vgu2Y9x9mhLipzvPvYb8Sku39yyvCNCWO3Bk20MRd537J3g4W+WKdsA5nca0tYn05g==',key_name='tempest-keypair-135608332',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:55:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='af3a536766704caaad94e5da2e3b88e2',ramdisk_id='',reservation_id='r-p4lnnnqe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1599563713',owner_user_name='tempest-ServerActionsTestOtherA-1599563713-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:55:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5a5a623606e647c183360572aab20b70',uuid=1da3a852-b732-48ac-886f-c57ced76a3d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.519 186548 DEBUG nova.network.os_vif_util [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converting VIF {"id": "ab338966-6ece-477c-af70-298f992f0c8f", "address": "fa:16:3e:63:15:d6", "network": {"id": "a2b438ab-8fa8-4627-8c04-99bed701c19e", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1803455984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "af3a536766704caaad94e5da2e3b88e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab338966-6e", "ovs_interfaceid": "ab338966-6ece-477c-af70-298f992f0c8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.519 186548 DEBUG nova.network.os_vif_util [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:63:15:d6,bridge_name='br-int',has_traffic_filtering=True,id=ab338966-6ece-477c-af70-298f992f0c8f,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab338966-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.520 186548 DEBUG os_vif [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:15:d6,bridge_name='br-int',has_traffic_filtering=True,id=ab338966-6ece-477c-af70-298f992f0c8f,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab338966-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.522 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.522 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab338966-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.523 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.524 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.526 186548 INFO os_vif [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:15:d6,bridge_name='br-int',has_traffic_filtering=True,id=ab338966-6ece-477c-af70-298f992f0c8f,network=Network(a2b438ab-8fa8-4627-8c04-99bed701c19e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab338966-6e')
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.527 186548 INFO nova.virt.libvirt.driver [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Deleting instance files /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5_del
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.527 186548 INFO nova.virt.libvirt.driver [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Deletion of /var/lib/nova/instances/1da3a852-b732-48ac-886f-c57ced76a3d5_del complete
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.621 186548 INFO nova.compute.manager [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.622 186548 DEBUG oslo.service.loopingcall [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.622 186548 DEBUG nova.compute.manager [-] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.622 186548 DEBUG nova.network.neutron [-] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.673 186548 DEBUG nova.compute.manager [req-a3a30362-9fc4-41c2-b791-b54cfc4aea81 req-b196521c-2159-4e20-8c9c-d8157cd3abd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received event network-vif-unplugged-ab338966-6ece-477c-af70-298f992f0c8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.673 186548 DEBUG oslo_concurrency.lockutils [req-a3a30362-9fc4-41c2-b791-b54cfc4aea81 req-b196521c-2159-4e20-8c9c-d8157cd3abd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.673 186548 DEBUG oslo_concurrency.lockutils [req-a3a30362-9fc4-41c2-b791-b54cfc4aea81 req-b196521c-2159-4e20-8c9c-d8157cd3abd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.674 186548 DEBUG oslo_concurrency.lockutils [req-a3a30362-9fc4-41c2-b791-b54cfc4aea81 req-b196521c-2159-4e20-8c9c-d8157cd3abd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.674 186548 DEBUG nova.compute.manager [req-a3a30362-9fc4-41c2-b791-b54cfc4aea81 req-b196521c-2159-4e20-8c9c-d8157cd3abd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] No waiting events found dispatching network-vif-unplugged-ab338966-6ece-477c-af70-298f992f0c8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:08 compute-0 nova_compute[186544]: 2025-11-22 07:58:08.674 186548 DEBUG nova.compute.manager [req-a3a30362-9fc4-41c2-b791-b54cfc4aea81 req-b196521c-2159-4e20-8c9c-d8157cd3abd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received event network-vif-unplugged-ab338966-6ece-477c-af70-298f992f0c8f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:58:08 compute-0 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224781]: [NOTICE]   (224785) : haproxy version is 2.8.14-c23fe91
Nov 22 07:58:08 compute-0 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224781]: [NOTICE]   (224785) : path to executable is /usr/sbin/haproxy
Nov 22 07:58:08 compute-0 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224781]: [WARNING]  (224785) : Exiting Master process...
Nov 22 07:58:08 compute-0 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224781]: [WARNING]  (224785) : Exiting Master process...
Nov 22 07:58:08 compute-0 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224781]: [ALERT]    (224785) : Current worker (224788) exited with code 143 (Terminated)
Nov 22 07:58:08 compute-0 neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e[224781]: [WARNING]  (224785) : All workers exited. Exiting... (0)
Nov 22 07:58:08 compute-0 systemd[1]: libpod-9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663.scope: Deactivated successfully.
Nov 22 07:58:08 compute-0 podman[226188]: 2025-11-22 07:58:08.864079025 +0000 UTC m=+0.423924435 container died 9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.491 186548 DEBUG nova.network.neutron [-] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.512 186548 INFO nova.compute.manager [-] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Took 0.89 seconds to deallocate network for instance.
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.606 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.606 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.646 186548 DEBUG nova.compute.manager [req-6899c772-d3ca-49f1-bcd3-bbb8c775d9ae req-312e00b8-7a70-485b-af63-a2c72723d134 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received event network-vif-deleted-ab338966-6ece-477c-af70-298f992f0c8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.660 186548 DEBUG nova.compute.provider_tree [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.683 186548 DEBUG nova.scheduler.client.report [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:58:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-d3b6474a9e1ac5da797529878cfaedf082acb57f3328a0fe3a0a8a2f6d596f7d-merged.mount: Deactivated successfully.
Nov 22 07:58:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663-userdata-shm.mount: Deactivated successfully.
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.707 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.734 186548 INFO nova.scheduler.client.report [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Deleted allocations for instance 1da3a852-b732-48ac-886f-c57ced76a3d5
Nov 22 07:58:09 compute-0 nova_compute[186544]: 2025-11-22 07:58:09.812 186548 DEBUG oslo_concurrency.lockutils [None req-1b45e6d5-6407-4a92-9067-b6e05ab36d46 5a5a623606e647c183360572aab20b70 af3a536766704caaad94e5da2e3b88e2 - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:10 compute-0 podman[226188]: 2025-11-22 07:58:10.250039669 +0000 UTC m=+1.809885079 container cleanup 9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 07:58:10 compute-0 systemd[1]: libpod-conmon-9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663.scope: Deactivated successfully.
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.470 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:10 compute-0 podman[226235]: 2025-11-22 07:58:10.552342687 +0000 UTC m=+0.282577483 container remove 9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.557 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f7863ebc-cec9-4217-8b7a-20955ee13ced]: (4, ('Sat Nov 22 07:58:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e (9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663)\n9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663\nSat Nov 22 07:58:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e (9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663)\n9f2c1e3493ad53d9d6d5d861f9de7e2058ddd61e4c7ae7e5299bf5fb168e6663\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.559 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a402624d-a0cd-4eeb-bdad-de4e6ff306a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.560 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2b438ab-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:10 compute-0 kernel: tapa2b438ab-80: left promiscuous mode
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.561 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.567 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[32ebbfe5-8eae-4d9f-a075-7360d6000186]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.576 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.592 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[479d2f3e-1473-48bb-8ca1-7a192d8704f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.594 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[135b929a-2cc2-4ff6-932a-4d90d66fc94e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.609 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ad00dc19-b9c4-412c-b6b4-03711f44fd24]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489511, 'reachable_time': 23319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226251, 'error': None, 'target': 'ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.611 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2b438ab-8fa8-4627-8c04-99bed701c19e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:58:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:10.611 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[1ccee56d-ea71-4301-a0fb-fbc22c96edd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:10 compute-0 systemd[1]: run-netns-ovnmeta\x2da2b438ab\x2d8fa8\x2d4627\x2d8c04\x2d99bed701c19e.mount: Deactivated successfully.
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.791 186548 DEBUG nova.compute.manager [req-6e641d9c-e53e-49bc-b9e8-b83e30794ae8 req-4fa36678-3b33-48eb-8065-2df4576791ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received event network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.791 186548 DEBUG oslo_concurrency.lockutils [req-6e641d9c-e53e-49bc-b9e8-b83e30794ae8 req-4fa36678-3b33-48eb-8065-2df4576791ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.791 186548 DEBUG oslo_concurrency.lockutils [req-6e641d9c-e53e-49bc-b9e8-b83e30794ae8 req-4fa36678-3b33-48eb-8065-2df4576791ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.791 186548 DEBUG oslo_concurrency.lockutils [req-6e641d9c-e53e-49bc-b9e8-b83e30794ae8 req-4fa36678-3b33-48eb-8065-2df4576791ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1da3a852-b732-48ac-886f-c57ced76a3d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.792 186548 DEBUG nova.compute.manager [req-6e641d9c-e53e-49bc-b9e8-b83e30794ae8 req-4fa36678-3b33-48eb-8065-2df4576791ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] No waiting events found dispatching network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:10 compute-0 nova_compute[186544]: 2025-11-22 07:58:10.792 186548 WARNING nova.compute.manager [req-6e641d9c-e53e-49bc-b9e8-b83e30794ae8 req-4fa36678-3b33-48eb-8065-2df4576791ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Received unexpected event network-vif-plugged-ab338966-6ece-477c-af70-298f992f0c8f for instance with vm_state deleted and task_state None.
Nov 22 07:58:13 compute-0 nova_compute[186544]: 2025-11-22 07:58:13.122 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:13 compute-0 nova_compute[186544]: 2025-11-22 07:58:13.525 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:15 compute-0 podman[226252]: 2025-11-22 07:58:15.40643994 +0000 UTC m=+0.054844723 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:58:15 compute-0 nova_compute[186544]: 2025-11-22 07:58:15.426 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:15 compute-0 nova_compute[186544]: 2025-11-22 07:58:15.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:15 compute-0 podman[226253]: 2025-11-22 07:58:15.57534589 +0000 UTC m=+0.222315828 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 22 07:58:15 compute-0 nova_compute[186544]: 2025-11-22 07:58:15.578 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:18 compute-0 nova_compute[186544]: 2025-11-22 07:58:18.527 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:20 compute-0 nova_compute[186544]: 2025-11-22 07:58:20.412 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798285.4113965, a1a9cc11-55d5-4fcd-92a9-12d52d5713d6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:20 compute-0 nova_compute[186544]: 2025-11-22 07:58:20.413 186548 INFO nova.compute.manager [-] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] VM Stopped (Lifecycle Event)
Nov 22 07:58:20 compute-0 podman[226301]: 2025-11-22 07:58:20.425155728 +0000 UTC m=+0.074830454 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 07:58:20 compute-0 nova_compute[186544]: 2025-11-22 07:58:20.443 186548 DEBUG nova.compute.manager [None req-3c3c68d4-5ae0-4576-b0a9-2e77e1fe2704 - - - - - -] [instance: a1a9cc11-55d5-4fcd-92a9-12d52d5713d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:20 compute-0 nova_compute[186544]: 2025-11-22 07:58:20.570 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:22 compute-0 podman[226326]: 2025-11-22 07:58:22.410050017 +0000 UTC m=+0.056258787 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 07:58:23 compute-0 nova_compute[186544]: 2025-11-22 07:58:23.492 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798288.4916253, 1da3a852-b732-48ac-886f-c57ced76a3d5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:23 compute-0 nova_compute[186544]: 2025-11-22 07:58:23.493 186548 INFO nova.compute.manager [-] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] VM Stopped (Lifecycle Event)
Nov 22 07:58:23 compute-0 nova_compute[186544]: 2025-11-22 07:58:23.510 186548 DEBUG nova.compute.manager [None req-2696ec71-f090-4075-8fcd-babcc4c5f30a - - - - - -] [instance: 1da3a852-b732-48ac-886f-c57ced76a3d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:23 compute-0 nova_compute[186544]: 2025-11-22 07:58:23.531 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:25 compute-0 nova_compute[186544]: 2025-11-22 07:58:25.573 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:28 compute-0 podman[226348]: 2025-11-22 07:58:28.402161255 +0000 UTC m=+0.054742009 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 07:58:28 compute-0 nova_compute[186544]: 2025-11-22 07:58:28.534 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.199 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.199 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.212 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.327 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.328 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.334 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.334 186548 INFO nova.compute.claims [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.502 186548 DEBUG nova.compute.provider_tree [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.514 186548 DEBUG nova.scheduler.client.report [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.534 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.535 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.574 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.594 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.594 186548 DEBUG nova.network.neutron [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.615 186548 INFO nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.638 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:58:30 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:30.721 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.722 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:30 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:30.723 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.772 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.774 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.774 186548 INFO nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Creating image(s)
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.775 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.775 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.776 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.789 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.839 186548 DEBUG nova.policy [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.848 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.849 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.850 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.862 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.919 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.920 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.953 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.954 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:30 compute-0 nova_compute[186544]: 2025-11-22 07:58:30.955 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.014 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.015 186548 DEBUG nova.virt.disk.api [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Checking if we can resize image /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.016 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.068 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.070 186548 DEBUG nova.virt.disk.api [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Cannot resize image /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.070 186548 DEBUG nova.objects.instance [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lazy-loading 'migration_context' on Instance uuid e0454e3f-2082-4375-8eea-951bbcda9d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.091 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.091 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Ensure instance console log exists: /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.092 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.092 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:31 compute-0 nova_compute[186544]: 2025-11-22 07:58:31.092 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:32 compute-0 podman[226383]: 2025-11-22 07:58:32.404067756 +0000 UTC m=+0.053646674 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:58:32 compute-0 nova_compute[186544]: 2025-11-22 07:58:32.727 186548 DEBUG nova.network.neutron [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Successfully created port: 99430038-cd5a-4b80-870d-a311ce070406 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.539 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.669 186548 DEBUG nova.network.neutron [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Successfully updated port: 99430038-cd5a-4b80-870d-a311ce070406 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.729 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.729 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquired lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.730 186548 DEBUG nova.network.neutron [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.777 186548 DEBUG nova.compute.manager [req-15a1bfd0-a451-4b10-aedf-54327399de1c req-176a801f-3bf2-497f-a9b9-bca82cac759a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-changed-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.778 186548 DEBUG nova.compute.manager [req-15a1bfd0-a451-4b10-aedf-54327399de1c req-176a801f-3bf2-497f-a9b9-bca82cac759a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Refreshing instance network info cache due to event network-changed-99430038-cd5a-4b80-870d-a311ce070406. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.778 186548 DEBUG oslo_concurrency.lockutils [req-15a1bfd0-a451-4b10-aedf-54327399de1c req-176a801f-3bf2-497f-a9b9-bca82cac759a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:33 compute-0 nova_compute[186544]: 2025-11-22 07:58:33.899 186548 DEBUG nova.network.neutron [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:58:34 compute-0 podman[226407]: 2025-11-22 07:58:34.401195936 +0000 UTC m=+0.051355087 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7)
Nov 22 07:58:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:34.725 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.786 186548 DEBUG nova.network.neutron [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Updating instance_info_cache with network_info: [{"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.819 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Releasing lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.819 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Instance network_info: |[{"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.819 186548 DEBUG oslo_concurrency.lockutils [req-15a1bfd0-a451-4b10-aedf-54327399de1c req-176a801f-3bf2-497f-a9b9-bca82cac759a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.820 186548 DEBUG nova.network.neutron [req-15a1bfd0-a451-4b10-aedf-54327399de1c req-176a801f-3bf2-497f-a9b9-bca82cac759a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Refreshing network info cache for port 99430038-cd5a-4b80-870d-a311ce070406 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.822 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Start _get_guest_xml network_info=[{"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.827 186548 WARNING nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.831 186548 DEBUG nova.virt.libvirt.host [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.832 186548 DEBUG nova.virt.libvirt.host [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.835 186548 DEBUG nova.virt.libvirt.host [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.835 186548 DEBUG nova.virt.libvirt.host [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.836 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.836 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.837 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.837 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.837 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.838 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.838 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.838 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.838 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.839 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.839 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.839 186548 DEBUG nova.virt.hardware [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.843 186548 DEBUG nova.virt.libvirt.vif [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1380026557',display_name='tempest-InstanceActionsTestJSON-server-1380026557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1380026557',id=82,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e831305a33c4281b08d9d9a531f0f05',ramdisk_id='',reservation_id='r-8wd5vte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-1024162909',owner_user_name='tempest-InstanceAction
sTestJSON-1024162909-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:30Z,user_data=None,user_id='567c8b1d5a18457a8101063d7929927c',uuid=e0454e3f-2082-4375-8eea-951bbcda9d18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.844 186548 DEBUG nova.network.os_vif_util [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converting VIF {"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.844 186548 DEBUG nova.network.os_vif_util [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.845 186548 DEBUG nova.objects.instance [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lazy-loading 'pci_devices' on Instance uuid e0454e3f-2082-4375-8eea-951bbcda9d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.856 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <uuid>e0454e3f-2082-4375-8eea-951bbcda9d18</uuid>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <name>instance-00000052</name>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <nova:name>tempest-InstanceActionsTestJSON-server-1380026557</nova:name>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:58:34</nova:creationTime>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:58:34 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:58:34 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:58:34 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:58:34 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:58:34 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:58:34 compute-0 nova_compute[186544]:         <nova:user uuid="567c8b1d5a18457a8101063d7929927c">tempest-InstanceActionsTestJSON-1024162909-project-member</nova:user>
Nov 22 07:58:34 compute-0 nova_compute[186544]:         <nova:project uuid="2e831305a33c4281b08d9d9a531f0f05">tempest-InstanceActionsTestJSON-1024162909</nova:project>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:58:34 compute-0 nova_compute[186544]:         <nova:port uuid="99430038-cd5a-4b80-870d-a311ce070406">
Nov 22 07:58:34 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <system>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <entry name="serial">e0454e3f-2082-4375-8eea-951bbcda9d18</entry>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <entry name="uuid">e0454e3f-2082-4375-8eea-951bbcda9d18</entry>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </system>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <os>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   </os>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <features>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   </features>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.config"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:dd:6b:ef"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <target dev="tap99430038-cd"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/console.log" append="off"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <video>
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </video>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:58:34 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:58:34 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:58:34 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:58:34 compute-0 nova_compute[186544]: </domain>
Nov 22 07:58:34 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.857 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Preparing to wait for external event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.857 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.858 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.858 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.859 186548 DEBUG nova.virt.libvirt.vif [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1380026557',display_name='tempest-InstanceActionsTestJSON-server-1380026557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1380026557',id=82,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e831305a33c4281b08d9d9a531f0f05',ramdisk_id='',reservation_id='r-8wd5vte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsTestJSON-1024162909',owner_user_name='tempest-InstanceActionsTestJSON-1024162909-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:30Z,user_data=None,user_id='567c8b1d5a18457a8101063d7929927c',uuid=e0454e3f-2082-4375-8eea-951bbcda9d18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.859 186548 DEBUG nova.network.os_vif_util [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converting VIF {"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.859 186548 DEBUG nova.network.os_vif_util [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.860 186548 DEBUG os_vif [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.860 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.861 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.861 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.863 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.863 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99430038-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.864 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap99430038-cd, col_values=(('external_ids', {'iface-id': '99430038-cd5a-4b80-870d-a311ce070406', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:6b:ef', 'vm-uuid': 'e0454e3f-2082-4375-8eea-951bbcda9d18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.865 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:34 compute-0 NetworkManager[55036]: <info>  [1763798314.8671] manager: (tap99430038-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.867 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.871 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:34 compute-0 nova_compute[186544]: 2025-11-22 07:58:34.872 186548 INFO os_vif [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd')
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.354 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.355 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.355 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] No VIF found with MAC fa:16:3e:dd:6b:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.355 186548 INFO nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Using config drive
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.575 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.769 186548 INFO nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Creating config drive at /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.config
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.774 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj0wlcj0g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.896 186548 DEBUG oslo_concurrency.processutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj0wlcj0g" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:35 compute-0 kernel: tap99430038-cd: entered promiscuous mode
Nov 22 07:58:35 compute-0 NetworkManager[55036]: <info>  [1763798315.9774] manager: (tap99430038-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Nov 22 07:58:35 compute-0 ovn_controller[94843]: 2025-11-22T07:58:35Z|00324|binding|INFO|Claiming lport 99430038-cd5a-4b80-870d-a311ce070406 for this chassis.
Nov 22 07:58:35 compute-0 ovn_controller[94843]: 2025-11-22T07:58:35Z|00325|binding|INFO|99430038-cd5a-4b80-870d-a311ce070406: Claiming fa:16:3e:dd:6b:ef 10.100.0.8
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.978 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:35 compute-0 nova_compute[186544]: 2025-11-22 07:58:35.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:35.999 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:6b:ef 10.100.0.8'], port_security=['fa:16:3e:dd:6b:ef 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e831305a33c4281b08d9d9a531f0f05', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8bae29ab-eff0-4123-9083-446cbed1b776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acb7cb2c-1a3c-469d-9846-a04f374dd378, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=99430038-cd5a-4b80-870d-a311ce070406) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.000 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 99430038-cd5a-4b80-870d-a311ce070406 in datapath 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 bound to our chassis
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.002 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.011 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e962db-3662-487b-8856-4130a689c8b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.012 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2eb8d1c0-a1 in ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.015 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2eb8d1c0-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.015 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[85b48587-305e-4796-b274-06d61e8cdcc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.015 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d74bf5-23ac-4ea6-bcdd-6d26e387d789]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 systemd-udevd[226451]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:58:36 compute-0 systemd-machined[152872]: New machine qemu-39-instance-00000052.
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.026 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[c6de5d20-de79-4d17-8ed6-627a19135f43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 NetworkManager[55036]: <info>  [1763798316.0317] device (tap99430038-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:58:36 compute-0 NetworkManager[55036]: <info>  [1763798316.0328] device (tap99430038-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.048 186548 DEBUG nova.network.neutron [req-15a1bfd0-a451-4b10-aedf-54327399de1c req-176a801f-3bf2-497f-a9b9-bca82cac759a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Updated VIF entry in instance network info cache for port 99430038-cd5a-4b80-870d-a311ce070406. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.049 186548 DEBUG nova.network.neutron [req-15a1bfd0-a451-4b10-aedf-54327399de1c req-176a801f-3bf2-497f-a9b9-bca82cac759a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Updating instance_info_cache with network_info: [{"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.059 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:36 compute-0 ovn_controller[94843]: 2025-11-22T07:58:36Z|00326|binding|INFO|Setting lport 99430038-cd5a-4b80-870d-a311ce070406 ovn-installed in OVS
Nov 22 07:58:36 compute-0 ovn_controller[94843]: 2025-11-22T07:58:36Z|00327|binding|INFO|Setting lport 99430038-cd5a-4b80-870d-a311ce070406 up in Southbound
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.063 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.062 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0f19ce48-ab86-4688-a2f2-3d50d65eb2ff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 systemd[1]: Started Virtual Machine qemu-39-instance-00000052.
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.073 186548 DEBUG oslo_concurrency.lockutils [req-15a1bfd0-a451-4b10-aedf-54327399de1c req-176a801f-3bf2-497f-a9b9-bca82cac759a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.090 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[11fd374c-d0e8-4d27-8efa-da49d62fd416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.095 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[40678815-8801-4036-8168-861f50ddf98d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 NetworkManager[55036]: <info>  [1763798316.0969] manager: (tap2eb8d1c0-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/151)
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.134 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1d547053-a142-46b1-aac0-186b2d978c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.139 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f12cec21-250d-407a-a23c-26c63fb6a06e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 NetworkManager[55036]: <info>  [1763798316.1672] device (tap2eb8d1c0-a0): carrier: link connected
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.176 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1bbb35-14c3-4b8e-a545-bfa50ca5e026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.195 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f982d0-0047-4c8b-aa67-b9cc2027ca00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2eb8d1c0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:71:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505680, 'reachable_time': 15084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226483, 'error': None, 'target': 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.210 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[188b9ed2-5ff7-475d-899f-5c03b1c54d50]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:7167'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505680, 'tstamp': 505680}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226484, 'error': None, 'target': 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.229 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2b3fcbbc-cf54-4ff2-8e87-e301a5f5d855]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2eb8d1c0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:71:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505680, 'reachable_time': 15084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226485, 'error': None, 'target': 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.260 186548 DEBUG nova.compute.manager [req-72491a45-07d4-4b2a-a24f-9e90874b2dd0 req-77e0cdbd-860c-4fb6-8c9a-19347e07b481 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.260 186548 DEBUG oslo_concurrency.lockutils [req-72491a45-07d4-4b2a-a24f-9e90874b2dd0 req-77e0cdbd-860c-4fb6-8c9a-19347e07b481 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.260 186548 DEBUG oslo_concurrency.lockutils [req-72491a45-07d4-4b2a-a24f-9e90874b2dd0 req-77e0cdbd-860c-4fb6-8c9a-19347e07b481 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.260 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0f04f607-b823-4fd0-a076-e7b127d5a70d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.261 186548 DEBUG oslo_concurrency.lockutils [req-72491a45-07d4-4b2a-a24f-9e90874b2dd0 req-77e0cdbd-860c-4fb6-8c9a-19347e07b481 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.261 186548 DEBUG nova.compute.manager [req-72491a45-07d4-4b2a-a24f-9e90874b2dd0 req-77e0cdbd-860c-4fb6-8c9a-19347e07b481 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Processing event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.317 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7262be67-6602-4e51-bde5-941916903bc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.318 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eb8d1c0-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.319 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.320 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2eb8d1c0-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:36 compute-0 kernel: tap2eb8d1c0-a0: entered promiscuous mode
Nov 22 07:58:36 compute-0 NetworkManager[55036]: <info>  [1763798316.3236] manager: (tap2eb8d1c0-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.324 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2eb8d1c0-a0, col_values=(('external_ids', {'iface-id': '4bd45432-2a1f-427f-9701-acf0afb89540'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:36 compute-0 ovn_controller[94843]: 2025-11-22T07:58:36Z|00328|binding|INFO|Releasing lport 4bd45432-2a1f-427f-9701-acf0afb89540 from this chassis (sb_readonly=0)
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.339 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2eb8d1c0-aa5f-4b93-a282-8895c4dfc512.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2eb8d1c0-aa5f-4b93-a282-8895c4dfc512.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.340 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ef72dffc-3a48-4f03-8392-4d307e4bfc9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.341 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/2eb8d1c0-aa5f-4b93-a282-8895c4dfc512.pid.haproxy
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:58:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:36.342 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'env', 'PROCESS_TAG=haproxy-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2eb8d1c0-aa5f-4b93-a282-8895c4dfc512.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.400 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.402 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798316.3905725, e0454e3f-2082-4375-8eea-951bbcda9d18 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.402 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] VM Started (Lifecycle Event)
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.403 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.409 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.412 186548 INFO nova.virt.libvirt.driver [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Instance spawned successfully.
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.412 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.432 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.437 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.439 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.439 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.440 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.440 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.441 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.441 186548 DEBUG nova.virt.libvirt.driver [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.463 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.463 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798316.399925, e0454e3f-2082-4375-8eea-951bbcda9d18 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.464 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] VM Paused (Lifecycle Event)
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.484 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.488 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798316.408311, e0454e3f-2082-4375-8eea-951bbcda9d18 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.488 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] VM Resumed (Lifecycle Event)
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.513 186548 INFO nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Took 5.74 seconds to spawn the instance on the hypervisor.
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.513 186548 DEBUG nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.525 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.528 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.566 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.595 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000052', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2e831305a33c4281b08d9d9a531f0f05', 'user_id': '567c8b1d5a18457a8101063d7929927c', 'hostId': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.598 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e0454e3f-2082-4375-8eea-951bbcda9d18 / tap99430038-cd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.598 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12cc4b66-f166-425c-9a01-f5c17e3eb996', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.596659', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cab0740-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': 'd091bbb3f3f28678784fed06e37aefd2ef9a80c6f188c45e718fc891cb375e05'}]}, 'timestamp': '2025-11-22 07:58:36.599291', '_unique_id': '0c3f0183f8134332bf84f551a3321067'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.600 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.603 186548 INFO nova.compute.manager [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Took 6.32 seconds to build instance.
Nov 22 07:58:36 compute-0 nova_compute[186544]: 2025-11-22 07:58:36.621 186548 DEBUG oslo_concurrency.lockutils [None req-0712d8c2-dd03-4150-861d-7a9052d41386 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.634 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.634 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8346213b-3ff1-4900-8737-0d4e6268c71a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.601933', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cb0745a-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': '9628f1ae41a764f75abf323964756b52dd73d3bf76f61fb5e71b8d0e1a7cd2a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.601933', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cb07f54-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': '53b57d3af214758a8b4e94d3b44b94f69827e6ecb42e71ca2cd8cf04d3303c77'}]}, 'timestamp': '2025-11-22 07:58:36.635023', '_unique_id': '9c67a71499d2413e9d3e82c2e2127897'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.635 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.636 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.636 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbfc1847-727e-4a32-bef4-33a1d1e00ec4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.636767', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cb0cce8-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': '314c34f27a9a71ade92496e4e90121601cdbcfee5a3786d0ad22d202a6ad0f99'}]}, 'timestamp': '2025-11-22 07:58:36.637015', '_unique_id': '822a879ac4944dfb8e2a9f81a08082ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.637 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '809de225-2a09-469e-9026-9775e0c58fa1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.638089', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cb1005a-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': '6b849802a13a8c58bf5ea18750bdbe8e9bd5e3f306a6f3db6e5f6acca35d72ec'}]}, 'timestamp': '2025-11-22 07:58:36.638346', '_unique_id': 'ff89df3b1dce49a686cd1755928dee37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.638 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.639 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.653 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.653 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44d26fd5-9972-4b41-8283-1a89cbbf6e01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.639404', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cb35aa8-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.331195309, 'message_signature': 'a2d043a70cbfe765cfbf0ea75296c9de2516425a96fdf67518ec3e13520c7b16'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.639404', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cb36408-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.331195309, 'message_signature': 'e1812c2bdfc06762cc93180e9b7d005c1fc3a5df8aa378ca1b55938f2fbd3a59'}]}, 'timestamp': '2025-11-22 07:58:36.653980', '_unique_id': '3abdd78e045249958cacba0b4b0165c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.654 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.655 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.655 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.655 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3a116d9-2824-4d73-b5ec-b408e64412fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.655480', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cb3a792-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': 'd07c46a9b9e7f08640d5f3f01b548d4dcb4c3877e33d1d9cb4d4ed76a53369b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.655480', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cb3b174-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': '813cb505c33fc387daa641647f1b8e0b40bb45a568a1e9064861e3eb084a3fa5'}]}, 'timestamp': '2025-11-22 07:58:36.655958', '_unique_id': 'dcddfbb843074919a5bfb3f2eaea78dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '785b5f82-e456-4cd9-ae9c-85a3c64a8f4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.657098', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cb3e6a8-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': 'ab97c37181dae5b49d0cccda84aeaf9e37a57f6299f65bb797a4bd137e2d72d4'}]}, 'timestamp': '2025-11-22 07:58:36.657351', '_unique_id': '51b5d53cee994696ad490e8ce792188e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.657 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.658 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.658 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82b2e180-0999-4688-a60f-9f90ef407adc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.658406', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cb419b6-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': 'cf68ac35971de07e3505189c9fdfbdad3d34bb89e44846d5b396345795fb237f'}]}, 'timestamp': '2025-11-22 07:58:36.658650', '_unique_id': '19ae6074e73649a6a31f877ae16d2142'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.659 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.682 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/cpu volume: 220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e936a44f-947a-49b8-8321-406c84538a97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 220000000, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'timestamp': '2025-11-22T07:58:36.659788', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0cb7d9ca-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.374386442, 'message_signature': '4692eef1420ded11578bf14d27221d776e55600134990c3c91bd1ae8937e3f36'}]}, 'timestamp': '2025-11-22 07:58:36.683345', '_unique_id': '585db997e9a94315a05e1315d242038e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.685 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.685 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37d82029-e699-49c9-babc-eb7463b53602', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.685351', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cb83712-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': 'a5e2afdcc7ec40d5acbba6456ec9eab4b8543e0b1a9e5877fc23423f1c5738e6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.685351', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cb84018-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': 'c20c2e439463934aa5a96577236ce1785e086ceaf6b0456a0071e2cdb3bbca3e'}]}, 'timestamp': '2025-11-22 07:58:36.685825', '_unique_id': '83c852064a7b44679d4d924a5c2b7127'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.686 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efafde63-099a-416e-88fc-2b87ede7ef55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.687055', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cb8790c-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': '5fb99eafef5374d8e0be51313d6ffdb4f2e782665a69d38f248f6941bf5e7a3d'}]}, 'timestamp': '2025-11-22 07:58:36.687332', '_unique_id': '6aa851d564f046a682c4a1dd4bb8edca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.688 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.688 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance e0454e3f-2082-4375-8eea-951bbcda9d18: ceilometer.compute.pollsters.NoVolumeException
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.688 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c01e06d8-64a1-4245-b3b6-765c36fd4784', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.688849', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cb8beda-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': '5bd9a1ff644273eda0e50645f3b6e968a8eec6a8f179158743357aafd1e7a77d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.688849', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cb8c790-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': 'd2a2755edd4ed38e4eefc4fecaa2e6bd2f7b0ac2bf3b5fa26f11126d82ef1c53'}]}, 'timestamp': '2025-11-22 07:58:36.689314', '_unique_id': '4d34d2652777405dab412f043948f4be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.690 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.690 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-InstanceActionsTestJSON-server-1380026557>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-InstanceActionsTestJSON-server-1380026557>]
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.690 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.690 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-InstanceActionsTestJSON-server-1380026557>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-InstanceActionsTestJSON-server-1380026557>]
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cc94056-2379-47dc-b92e-9c48cf6497a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.691035', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cb9148e-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': '4151c295b8bf37e640502b6c5bdaeb92022e6407ea5563922a9685e517c00f6d'}]}, 'timestamp': '2025-11-22 07:58:36.691302', '_unique_id': 'cc205e712aba4706bdea4d6cb2f7c477'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.692 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.692 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ad29d30-5d7b-44e0-96f5-014300ffe8e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.692546', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cb95070-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': 'c6d847169e7b52fd8098be88da556067cddabbb08aeb9f0be967585f5e6095f2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.692546', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cb95908-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': 'f361dba6ae7da33ab7a77a1aa719122d41ba0942b18887e5d3abd986b7737361'}]}, 'timestamp': '2025-11-22 07:58:36.693012', '_unique_id': 'b3d707ba89484c199423fff46fde89e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.694 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.694 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-InstanceActionsTestJSON-server-1380026557>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-InstanceActionsTestJSON-server-1380026557>]
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.694 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.694 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3a9ae71-7284-4ac0-9bfd-10ee2ef093ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.694572', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cb99eea-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.331195309, 'message_signature': '2dfc8ca37663d1cb8b0b1cc6ac701e2c9e1f2212a4b2c25fc9ea0ae4f6da0389'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.694572', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cb9a6ec-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.331195309, 'message_signature': 'cb192d9c9daa696a7f3d16979b2e12613e6b91e11f62a20746746d64f5a690e4'}]}, 'timestamp': '2025-11-22 07:58:36.695005', '_unique_id': 'e66091b41b3a412094eabf338df2b31c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.696 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.696 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-InstanceActionsTestJSON-server-1380026557>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-InstanceActionsTestJSON-server-1380026557>]
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.696 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0918b92b-0e7f-4682-b550-0bba2ffc588e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.696390', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cb9e5e4-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': '76b6dd55a9e29fdc9bf76c0b3eb49fc0347f01ca25817bd01462f7705e4bde0e'}]}, 'timestamp': '2025-11-22 07:58:36.696637', '_unique_id': 'fe807a0c91364a4badeb3a46507a151c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.697 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40f423a5-9e16-4aa0-8d8b-faa19395589c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.697945', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cba2338-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': 'b60639f855e644fdc169bc8640aa8b3bf11971c3b89aaccf12c39f23ff2ce7da'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.697945', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cba2d60-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.293734976, 'message_signature': '5a1d74c1e3e64280952d6b5078cf988b95096fa992f84d17c9ffb16d104d7f11'}]}, 'timestamp': '2025-11-22 07:58:36.698449', '_unique_id': '0d00aa59896d4025be93c5793b754358'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.699 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.699 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '787c443a-c40d-45f2-9887-12293699c861', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.699614', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cba64f6-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': '213b272baa5fe4719dd6c612d3a2a75a165a6a62e22ff476ad58020424cd55e8'}]}, 'timestamp': '2025-11-22 07:58:36.699899', '_unique_id': 'afb2aac2613f46d09d3574b5d02cdc8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.700 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.701 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.701 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4fc2e26-2ac7-45cb-91d4-9999eb3b3cdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-vda', 'timestamp': '2025-11-22T07:58:36.700999', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0cba998a-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.331195309, 'message_signature': '840a0d5e5be68efd474e69042933bdebb5d019e8df5211464fed9a977c6c2fd3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18-sda', 'timestamp': '2025-11-22T07:58:36.700999', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'instance-00000052', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0cbaa466-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.331195309, 'message_signature': 'b5eac2ac0b2f05648fd7f5982bbe469f950d3613abb611f674a28a2163991446'}]}, 'timestamp': '2025-11-22 07:58:36.701539', '_unique_id': '54fffc0096cf48d1b350c2903bec4565'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.702 12 DEBUG ceilometer.compute.pollsters [-] e0454e3f-2082-4375-8eea-951bbcda9d18/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '998700b4-61ee-4f99-b417-0f542074c5a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '567c8b1d5a18457a8101063d7929927c', 'user_name': None, 'project_id': '2e831305a33c4281b08d9d9a531f0f05', 'project_name': None, 'resource_id': 'instance-00000052-e0454e3f-2082-4375-8eea-951bbcda9d18-tap99430038-cd', 'timestamp': '2025-11-22T07:58:36.702791', 'resource_metadata': {'display_name': 'tempest-InstanceActionsTestJSON-server-1380026557', 'name': 'tap99430038-cd', 'instance_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'instance_type': 'm1.nano', 'host': '3dbcf7d89462aa3463ffa286b1625839aa6505bd42df007972d42999', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:dd:6b:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap99430038-cd'}, 'message_id': '0cbadf76-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5057.288449255, 'message_signature': 'faf786cd2f56afcc85c544067f1f261d7722e4da6a64f1a44ff85b5454f5305b'}]}, 'timestamp': '2025-11-22 07:58:36.703027', '_unique_id': '3d40787b661b489a98d5980d28d466cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 07:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 07:58:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 07:58:36 compute-0 podman[226523]: 2025-11-22 07:58:36.742867684 +0000 UTC m=+0.098165170 container create de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:58:36 compute-0 podman[226523]: 2025-11-22 07:58:36.667652581 +0000 UTC m=+0.022950087 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:58:36 compute-0 systemd[1]: Started libpod-conmon-de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c.scope.
Nov 22 07:58:36 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:58:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/675672185d4001aa1bf65e1f79827cc0173de9c3d2ed8eced461cce5deaabc10/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:58:36 compute-0 podman[226523]: 2025-11-22 07:58:36.834651565 +0000 UTC m=+0.189949081 container init de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 07:58:36 compute-0 podman[226523]: 2025-11-22 07:58:36.841012541 +0000 UTC m=+0.196310037 container start de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:58:36 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226538]: [NOTICE]   (226542) : New worker (226544) forked
Nov 22 07:58:36 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226538]: [NOTICE]   (226542) : Loading success.
Nov 22 07:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:37.326 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:37.327 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:37.328 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:37 compute-0 nova_compute[186544]: 2025-11-22 07:58:37.932 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:37 compute-0 nova_compute[186544]: 2025-11-22 07:58:37.933 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:37 compute-0 nova_compute[186544]: 2025-11-22 07:58:37.933 186548 INFO nova.compute.manager [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Rebooting instance
Nov 22 07:58:37 compute-0 nova_compute[186544]: 2025-11-22 07:58:37.951 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:37 compute-0 nova_compute[186544]: 2025-11-22 07:58:37.951 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquired lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:37 compute-0 nova_compute[186544]: 2025-11-22 07:58:37.952 186548 DEBUG nova.network.neutron [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:58:38 compute-0 nova_compute[186544]: 2025-11-22 07:58:38.400 186548 DEBUG nova.compute.manager [req-066ebf81-2ca0-4dca-9bdd-fc8a4c3bd651 req-53820f46-aca8-43e9-bbc2-8afdd9568801 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:38 compute-0 nova_compute[186544]: 2025-11-22 07:58:38.401 186548 DEBUG oslo_concurrency.lockutils [req-066ebf81-2ca0-4dca-9bdd-fc8a4c3bd651 req-53820f46-aca8-43e9-bbc2-8afdd9568801 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:38 compute-0 nova_compute[186544]: 2025-11-22 07:58:38.401 186548 DEBUG oslo_concurrency.lockutils [req-066ebf81-2ca0-4dca-9bdd-fc8a4c3bd651 req-53820f46-aca8-43e9-bbc2-8afdd9568801 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:38 compute-0 nova_compute[186544]: 2025-11-22 07:58:38.401 186548 DEBUG oslo_concurrency.lockutils [req-066ebf81-2ca0-4dca-9bdd-fc8a4c3bd651 req-53820f46-aca8-43e9-bbc2-8afdd9568801 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:38 compute-0 nova_compute[186544]: 2025-11-22 07:58:38.401 186548 DEBUG nova.compute.manager [req-066ebf81-2ca0-4dca-9bdd-fc8a4c3bd651 req-53820f46-aca8-43e9-bbc2-8afdd9568801 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] No waiting events found dispatching network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:38 compute-0 nova_compute[186544]: 2025-11-22 07:58:38.402 186548 WARNING nova.compute.manager [req-066ebf81-2ca0-4dca-9bdd-fc8a4c3bd651 req-53820f46-aca8-43e9-bbc2-8afdd9568801 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received unexpected event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 for instance with vm_state active and task_state rebooting_hard.
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.069 186548 DEBUG nova.network.neutron [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Updating instance_info_cache with network_info: [{"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.084 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Releasing lock "refresh_cache-e0454e3f-2082-4375-8eea-951bbcda9d18" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.097 186548 DEBUG nova.compute.manager [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:39 compute-0 kernel: tap99430038-cd (unregistering): left promiscuous mode
Nov 22 07:58:39 compute-0 NetworkManager[55036]: <info>  [1763798319.2957] device (tap99430038-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 ovn_controller[94843]: 2025-11-22T07:58:39Z|00329|binding|INFO|Releasing lport 99430038-cd5a-4b80-870d-a311ce070406 from this chassis (sb_readonly=0)
Nov 22 07:58:39 compute-0 ovn_controller[94843]: 2025-11-22T07:58:39Z|00330|binding|INFO|Setting lport 99430038-cd5a-4b80-870d-a311ce070406 down in Southbound
Nov 22 07:58:39 compute-0 ovn_controller[94843]: 2025-11-22T07:58:39Z|00331|binding|INFO|Removing iface tap99430038-cd ovn-installed in OVS
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.304 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:39.312 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:6b:ef 10.100.0.8'], port_security=['fa:16:3e:dd:6b:ef 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e831305a33c4281b08d9d9a531f0f05', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8bae29ab-eff0-4123-9083-446cbed1b776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acb7cb2c-1a3c-469d-9846-a04f374dd378, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=99430038-cd5a-4b80-870d-a311ce070406) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:39.313 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 99430038-cd5a-4b80-870d-a311ce070406 in datapath 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 unbound from our chassis
Nov 22 07:58:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:39.315 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:58:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:39.316 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4aa9cb-3734-41ea-898f-00317baa6d71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:39.316 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 namespace which is not needed anymore
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.318 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000052.scope: Deactivated successfully.
Nov 22 07:58:39 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000052.scope: Consumed 3.142s CPU time.
Nov 22 07:58:39 compute-0 systemd-machined[152872]: Machine qemu-39-instance-00000052 terminated.
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.472 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.476 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.514 186548 INFO nova.virt.libvirt.driver [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Instance destroyed successfully.
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.515 186548 DEBUG nova.objects.instance [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lazy-loading 'resources' on Instance uuid e0454e3f-2082-4375-8eea-951bbcda9d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.527 186548 DEBUG nova.virt.libvirt.vif [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1380026557',display_name='tempest-InstanceActionsTestJSON-server-1380026557',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1380026557',id=82,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e831305a33c4281b08d9d9a531f0f05',ramdisk_id='',reservation_id='r-8wd5vte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-1024162909',owner_user_name='tempest-InstanceActionsTestJSON-1024162909-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:58:39Z,user_data=None,user_id='567c8b1d5a18457a8101063d7929927c',uuid=e0454e3f-2082-4375-8eea-951bbcda9d18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.528 186548 DEBUG nova.network.os_vif_util [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converting VIF {"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.528 186548 DEBUG nova.network.os_vif_util [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.529 186548 DEBUG os_vif [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.530 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.530 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99430038-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.532 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.533 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.536 186548 INFO os_vif [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd')
Nov 22 07:58:39 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226538]: [NOTICE]   (226542) : haproxy version is 2.8.14-c23fe91
Nov 22 07:58:39 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226538]: [NOTICE]   (226542) : path to executable is /usr/sbin/haproxy
Nov 22 07:58:39 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226538]: [WARNING]  (226542) : Exiting Master process...
Nov 22 07:58:39 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226538]: [WARNING]  (226542) : Exiting Master process...
Nov 22 07:58:39 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226538]: [ALERT]    (226542) : Current worker (226544) exited with code 143 (Terminated)
Nov 22 07:58:39 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226538]: [WARNING]  (226542) : All workers exited. Exiting... (0)
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.543 186548 DEBUG nova.virt.libvirt.driver [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Start _get_guest_xml network_info=[{"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:58:39 compute-0 systemd[1]: libpod-de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c.scope: Deactivated successfully.
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.546 186548 WARNING nova.virt.libvirt.driver [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:58:39 compute-0 podman[226576]: 2025-11-22 07:58:39.552034139 +0000 UTC m=+0.147515746 container died de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.553 186548 DEBUG nova.virt.libvirt.host [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.554 186548 DEBUG nova.virt.libvirt.host [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.557 186548 DEBUG nova.virt.libvirt.host [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.558 186548 DEBUG nova.virt.libvirt.host [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.559 186548 DEBUG nova.virt.libvirt.driver [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.560 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.560 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.560 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.561 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.561 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.562 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.562 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.562 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.563 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.563 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.563 186548 DEBUG nova.virt.hardware [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.563 186548 DEBUG nova.objects.instance [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lazy-loading 'vcpu_model' on Instance uuid e0454e3f-2082-4375-8eea-951bbcda9d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.578 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.637 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.638 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.638 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.639 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.640 186548 DEBUG nova.virt.libvirt.vif [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1380026557',display_name='tempest-InstanceActionsTestJSON-server-1380026557',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1380026557',id=82,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e831305a33c4281b08d9d9a531f0f05',ramdisk_id='',reservation_id='r-8wd5vte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-1024162909',owner_user_name='tempest-InstanceActionsTestJSON-1024162909-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:58:39Z,user_data=None,user_id='567c8b1d5a18457a8101063d7929927c',uuid=e0454e3f-2082-4375-8eea-951bbcda9d18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.641 186548 DEBUG nova.network.os_vif_util [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converting VIF {"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.642 186548 DEBUG nova.network.os_vif_util [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.642 186548 DEBUG nova.objects.instance [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lazy-loading 'pci_devices' on Instance uuid e0454e3f-2082-4375-8eea-951bbcda9d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.665 186548 DEBUG nova.virt.libvirt.driver [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <uuid>e0454e3f-2082-4375-8eea-951bbcda9d18</uuid>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <name>instance-00000052</name>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <nova:name>tempest-InstanceActionsTestJSON-server-1380026557</nova:name>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:58:39</nova:creationTime>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:58:39 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:58:39 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:58:39 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:58:39 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:58:39 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:58:39 compute-0 nova_compute[186544]:         <nova:user uuid="567c8b1d5a18457a8101063d7929927c">tempest-InstanceActionsTestJSON-1024162909-project-member</nova:user>
Nov 22 07:58:39 compute-0 nova_compute[186544]:         <nova:project uuid="2e831305a33c4281b08d9d9a531f0f05">tempest-InstanceActionsTestJSON-1024162909</nova:project>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:58:39 compute-0 nova_compute[186544]:         <nova:port uuid="99430038-cd5a-4b80-870d-a311ce070406">
Nov 22 07:58:39 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <system>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <entry name="serial">e0454e3f-2082-4375-8eea-951bbcda9d18</entry>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <entry name="uuid">e0454e3f-2082-4375-8eea-951bbcda9d18</entry>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </system>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <os>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   </os>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <features>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   </features>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk.config"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:dd:6b:ef"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <target dev="tap99430038-cd"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/console.log" append="off"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <video>
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </video>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:58:39 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:58:39 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:58:39 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:58:39 compute-0 nova_compute[186544]: </domain>
Nov 22 07:58:39 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.667 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c-userdata-shm.mount: Deactivated successfully.
Nov 22 07:58:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-675672185d4001aa1bf65e1f79827cc0173de9c3d2ed8eced461cce5deaabc10-merged.mount: Deactivated successfully.
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.727 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.728 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.783 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.785 186548 DEBUG nova.objects.instance [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e0454e3f-2082-4375-8eea-951bbcda9d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.797 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:39 compute-0 podman[226576]: 2025-11-22 07:58:39.81308917 +0000 UTC m=+0.408570747 container cleanup de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:58:39 compute-0 systemd[1]: libpod-conmon-de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c.scope: Deactivated successfully.
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.858 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.860 186548 DEBUG nova.virt.disk.api [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Checking if we can resize image /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.860 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.923 186548 DEBUG oslo_concurrency.processutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.924 186548 DEBUG nova.virt.disk.api [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Cannot resize image /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.926 186548 DEBUG nova.objects.instance [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lazy-loading 'migration_context' on Instance uuid e0454e3f-2082-4375-8eea-951bbcda9d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.943 186548 DEBUG nova.virt.libvirt.vif [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1380026557',display_name='tempest-InstanceActionsTestJSON-server-1380026557',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1380026557',id=82,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='2e831305a33c4281b08d9d9a531f0f05',ramdisk_id='',reservation_id='r-8wd5vte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-1024162909',owner_user_name='tempest-InstanceActionsTestJSON-1024162909-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:39Z,user_data=None,user_id='567c8b1d5a18457a8101063d7929927c',uuid=e0454e3f-2082-4375-8eea-951bbcda9d18,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.943 186548 DEBUG nova.network.os_vif_util [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converting VIF {"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.945 186548 DEBUG nova.network.os_vif_util [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.945 186548 DEBUG os_vif [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.946 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.946 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.947 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.949 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.950 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99430038-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.950 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap99430038-cd, col_values=(('external_ids', {'iface-id': '99430038-cd5a-4b80-870d-a311ce070406', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:6b:ef', 'vm-uuid': 'e0454e3f-2082-4375-8eea-951bbcda9d18'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.951 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 NetworkManager[55036]: <info>  [1763798319.9532] manager: (tap99430038-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.954 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.957 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:39 compute-0 nova_compute[186544]: 2025-11-22 07:58:39.958 186548 INFO os_vif [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd')
Nov 22 07:58:40 compute-0 kernel: tap99430038-cd: entered promiscuous mode
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:40 compute-0 NetworkManager[55036]: <info>  [1763798320.0929] manager: (tap99430038-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Nov 22 07:58:40 compute-0 systemd-udevd[226558]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:58:40 compute-0 ovn_controller[94843]: 2025-11-22T07:58:40Z|00332|binding|INFO|Claiming lport 99430038-cd5a-4b80-870d-a311ce070406 for this chassis.
Nov 22 07:58:40 compute-0 ovn_controller[94843]: 2025-11-22T07:58:40Z|00333|binding|INFO|99430038-cd5a-4b80-870d-a311ce070406: Claiming fa:16:3e:dd:6b:ef 10.100.0.8
Nov 22 07:58:40 compute-0 NetworkManager[55036]: <info>  [1763798320.1065] device (tap99430038-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.106 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:6b:ef 10.100.0.8'], port_security=['fa:16:3e:dd:6b:ef 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e831305a33c4281b08d9d9a531f0f05', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8bae29ab-eff0-4123-9083-446cbed1b776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acb7cb2c-1a3c-469d-9846-a04f374dd378, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=99430038-cd5a-4b80-870d-a311ce070406) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:40 compute-0 NetworkManager[55036]: <info>  [1763798320.1075] device (tap99430038-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:58:40 compute-0 ovn_controller[94843]: 2025-11-22T07:58:40Z|00334|binding|INFO|Setting lport 99430038-cd5a-4b80-870d-a311ce070406 ovn-installed in OVS
Nov 22 07:58:40 compute-0 ovn_controller[94843]: 2025-11-22T07:58:40Z|00335|binding|INFO|Setting lport 99430038-cd5a-4b80-870d-a311ce070406 up in Southbound
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.109 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.117 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:40 compute-0 systemd-machined[152872]: New machine qemu-40-instance-00000052.
Nov 22 07:58:40 compute-0 systemd[1]: Started Virtual Machine qemu-40-instance-00000052.
Nov 22 07:58:40 compute-0 podman[226632]: 2025-11-22 07:58:40.209191828 +0000 UTC m=+0.373536263 container remove de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.220 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c4d2ea-8a05-46f9-9d74-51f12a8b710d]: (4, ('Sat Nov 22 07:58:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 (de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c)\nde9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c\nSat Nov 22 07:58:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 (de9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c)\nde9d7966fd2b412506a0294d469d9d304b5b56c7841a5151340039362716ec9c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.224 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ded5a6-dcb4-478f-89d3-cfea035dbf9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.227 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eb8d1c0-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.230 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:40 compute-0 kernel: tap2eb8d1c0-a0: left promiscuous mode
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.245 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.248 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c776bb39-8bf0-4937-a2da-a9b713a4a38b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.277 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ce768357-3f54-4cd5-8845-af339ff522b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.279 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1b62bcee-7fb4-418c-9888-b269d4eab993]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.299 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[852621f2-012b-4fe2-ae6a-9b3bb5b68e48]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505672, 'reachable_time': 29400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226675, 'error': None, 'target': 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.302 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.302 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[e04f8766-5183-4097-8fea-b60ef5dfcc97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.303 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 99430038-cd5a-4b80-870d-a311ce070406 in datapath 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 unbound from our chassis
Nov 22 07:58:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d2eb8d1c0\x2daa5f\x2d4b93\x2da282\x2d8895c4dfc512.mount: Deactivated successfully.
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.304 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.316 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbe4d5e-59ca-4e49-85bd-04e5d8bce540]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.317 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2eb8d1c0-a1 in ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.319 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2eb8d1c0-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.319 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4b68a7ff-361c-46d9-8572-db5652070dc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.321 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b647759c-21d6-4cbe-966c-8d230fe8fbe0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.333 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ee8acfc4-d609-4a9a-842d-c8cbd0675517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.356 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1696fa81-23fd-4e91-8ee8-3d18725f43ff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.399 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[453392e8-1d35-4360-abae-bac74da0a76c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.405 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc83785-a794-4103-9eef-93174fb96ef1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 NetworkManager[55036]: <info>  [1763798320.4066] manager: (tap2eb8d1c0-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.436 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e9dc77e8-de5c-4557-b9f0-bd499874c597]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.439 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b2aafca4-cd72-42c5-8902-6469622639bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 NetworkManager[55036]: <info>  [1763798320.4633] device (tap2eb8d1c0-a0): carrier: link connected
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.470 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[97adad79-8750-42c6-919b-6a6927394084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.487 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a8ed5d-c654-4c01-9358-2f918c5ea678]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2eb8d1c0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:71:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506110, 'reachable_time': 29413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226707, 'error': None, 'target': 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.504 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[01e21501-5961-4d95-a008-62b94e7348e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:7167'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506110, 'tstamp': 506110}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226708, 'error': None, 'target': 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.514 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for e0454e3f-2082-4375-8eea-951bbcda9d18 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.516 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798320.5138783, e0454e3f-2082-4375-8eea-951bbcda9d18 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.517 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] VM Resumed (Lifecycle Event)
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.523 186548 DEBUG nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-unplugged-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.524 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.524 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.525 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.525 186548 DEBUG nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] No waiting events found dispatching network-vif-unplugged-99430038-cd5a-4b80-870d-a311ce070406 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.525 186548 WARNING nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received unexpected event network-vif-unplugged-99430038-cd5a-4b80-870d-a311ce070406 for instance with vm_state active and task_state reboot_started_hard.
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.526 186548 DEBUG nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.526 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.526 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.527 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.527 186548 DEBUG nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] No waiting events found dispatching network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.527 186548 WARNING nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received unexpected event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 for instance with vm_state active and task_state reboot_started_hard.
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.528 186548 DEBUG nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.528 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.527 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2701dc67-1554-481d-8c38-f2d7fa86d4ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2eb8d1c0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:71:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506110, 'reachable_time': 29413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226709, 'error': None, 'target': 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.528 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.529 186548 DEBUG oslo_concurrency.lockutils [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.529 186548 DEBUG nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] No waiting events found dispatching network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.529 186548 WARNING nova.compute.manager [req-b6c36edd-b7ca-4776-84de-163bf0eeb044 req-a9ea67c7-72de-4c23-85e2-76750e30699e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received unexpected event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 for instance with vm_state active and task_state reboot_started_hard.
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.530 186548 DEBUG nova.compute.manager [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.537 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.539 186548 INFO nova.virt.libvirt.driver [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Instance rebooted successfully.
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.540 186548 DEBUG nova.compute.manager [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.542 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.559 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2cc985e7-95c8-4ccf-ad9b-017ab90abe06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.571 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.571 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798320.5152116, e0454e3f-2082-4375-8eea-951bbcda9d18 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.572 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] VM Started (Lifecycle Event)
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.578 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.598 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.602 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.617 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[508a14b3-3438-441e-bf99-f9ce44dca535]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.618 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eb8d1c0-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.618 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.619 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2eb8d1c0-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:40 compute-0 NetworkManager[55036]: <info>  [1763798320.6215] manager: (tap2eb8d1c0-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Nov 22 07:58:40 compute-0 kernel: tap2eb8d1c0-a0: entered promiscuous mode
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.623 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2eb8d1c0-a0, col_values=(('external_ids', {'iface-id': '4bd45432-2a1f-427f-9701-acf0afb89540'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:40 compute-0 ovn_controller[94843]: 2025-11-22T07:58:40Z|00336|binding|INFO|Releasing lport 4bd45432-2a1f-427f-9701-acf0afb89540 from this chassis (sb_readonly=0)
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.636 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.637 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2eb8d1c0-aa5f-4b93-a282-8895c4dfc512.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2eb8d1c0-aa5f-4b93-a282-8895c4dfc512.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.638 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1c16ec59-ecb0-4e5d-96f9-b3295b9ee79e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.639 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/2eb8d1c0-aa5f-4b93-a282-8895c4dfc512.pid.haproxy
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:58:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:40.640 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'env', 'PROCESS_TAG=haproxy-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2eb8d1c0-aa5f-4b93-a282-8895c4dfc512.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:58:40 compute-0 nova_compute[186544]: 2025-11-22 07:58:40.649 186548 DEBUG oslo_concurrency.lockutils [None req-d4fa63dc-d0f0-47f3-af11-9cc95dabd4af 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 2.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:41 compute-0 podman[226741]: 2025-11-22 07:58:40.989663936 +0000 UTC m=+0.025407888 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.625 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.626 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.626 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.626 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.626 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.647 186548 INFO nova.compute.manager [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Terminating instance
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.654 186548 DEBUG nova.compute.manager [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:58:41 compute-0 kernel: tap99430038-cd (unregistering): left promiscuous mode
Nov 22 07:58:41 compute-0 NetworkManager[55036]: <info>  [1763798321.6769] device (tap99430038-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:58:41 compute-0 ovn_controller[94843]: 2025-11-22T07:58:41Z|00337|binding|INFO|Releasing lport 99430038-cd5a-4b80-870d-a311ce070406 from this chassis (sb_readonly=0)
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.687 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:41 compute-0 ovn_controller[94843]: 2025-11-22T07:58:41Z|00338|binding|INFO|Setting lport 99430038-cd5a-4b80-870d-a311ce070406 down in Southbound
Nov 22 07:58:41 compute-0 ovn_controller[94843]: 2025-11-22T07:58:41Z|00339|binding|INFO|Removing iface tap99430038-cd ovn-installed in OVS
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.689 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:41.695 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:6b:ef 10.100.0.8'], port_security=['fa:16:3e:dd:6b:ef 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e0454e3f-2082-4375-8eea-951bbcda9d18', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e831305a33c4281b08d9d9a531f0f05', 'neutron:revision_number': '6', 'neutron:security_group_ids': '8bae29ab-eff0-4123-9083-446cbed1b776', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acb7cb2c-1a3c-469d-9846-a04f374dd378, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=99430038-cd5a-4b80-870d-a311ce070406) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.703 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:41 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000052.scope: Deactivated successfully.
Nov 22 07:58:41 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000052.scope: Consumed 1.483s CPU time.
Nov 22 07:58:41 compute-0 systemd-machined[152872]: Machine qemu-40-instance-00000052 terminated.
Nov 22 07:58:41 compute-0 podman[226741]: 2025-11-22 07:58:41.761112059 +0000 UTC m=+0.796855981 container create 3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 07:58:41 compute-0 NetworkManager[55036]: <info>  [1763798321.8732] manager: (tap99430038-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/157)
Nov 22 07:58:41 compute-0 systemd[1]: Started libpod-conmon-3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497.scope.
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.914 186548 INFO nova.virt.libvirt.driver [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Instance destroyed successfully.
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.915 186548 DEBUG nova.objects.instance [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lazy-loading 'resources' on Instance uuid e0454e3f-2082-4375-8eea-951bbcda9d18 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.926 186548 DEBUG nova.virt.libvirt.vif [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsTestJSON-server-1380026557',display_name='tempest-InstanceActionsTestJSON-server-1380026557',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionstestjson-server-1380026557',id=82,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e831305a33c4281b08d9d9a531f0f05',ramdisk_id='',reservation_id='r-8wd5vte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsTestJSON-1024162909',owner_user_name='tempest-InstanceActionsTestJSON-1024162909-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:58:40Z,user_data=None,user_id='567c8b1d5a18457a8101063d7929927c',uuid=e0454e3f-2082-4375-8eea-951bbcda9d18,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.926 186548 DEBUG nova.network.os_vif_util [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converting VIF {"id": "99430038-cd5a-4b80-870d-a311ce070406", "address": "fa:16:3e:dd:6b:ef", "network": {"id": "2eb8d1c0-aa5f-4b93-a282-8895c4dfc512", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-950997851-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e831305a33c4281b08d9d9a531f0f05", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99430038-cd", "ovs_interfaceid": "99430038-cd5a-4b80-870d-a311ce070406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.927 186548 DEBUG nova.network.os_vif_util [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.927 186548 DEBUG os_vif [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.929 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.929 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99430038-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.930 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.932 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.934 186548 INFO os_vif [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:6b:ef,bridge_name='br-int',has_traffic_filtering=True,id=99430038-cd5a-4b80-870d-a311ce070406,network=Network(2eb8d1c0-aa5f-4b93-a282-8895c4dfc512),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99430038-cd')
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.935 186548 INFO nova.virt.libvirt.driver [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Deleting instance files /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18_del
Nov 22 07:58:41 compute-0 nova_compute[186544]: 2025-11-22 07:58:41.936 186548 INFO nova.virt.libvirt.driver [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Deletion of /var/lib/nova/instances/e0454e3f-2082-4375-8eea-951bbcda9d18_del complete
Nov 22 07:58:41 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:58:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19a865c40efdbfbfb001e40b4f92479c95623124744b5b7d802e34bd695f4b67/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.003 186548 INFO nova.compute.manager [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.003 186548 DEBUG oslo.service.loopingcall [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.004 186548 DEBUG nova.compute.manager [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.004 186548 DEBUG nova.network.neutron [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:58:42 compute-0 podman[226741]: 2025-11-22 07:58:42.024152679 +0000 UTC m=+1.059896631 container init 3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 07:58:42 compute-0 podman[226741]: 2025-11-22 07:58:42.03025619 +0000 UTC m=+1.066000112 container start 3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 07:58:42 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226779]: [NOTICE]   (226783) : New worker (226785) forked
Nov 22 07:58:42 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226779]: [NOTICE]   (226783) : Loading success.
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.101 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 99430038-cd5a-4b80-870d-a311ce070406 in datapath 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 unbound from our chassis
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.103 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.104 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fd7296c2-e33e-47ca-985d-b802b3f64d03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.105 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 namespace which is not needed anymore
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.178 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:42 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226779]: [NOTICE]   (226783) : haproxy version is 2.8.14-c23fe91
Nov 22 07:58:42 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226779]: [NOTICE]   (226783) : path to executable is /usr/sbin/haproxy
Nov 22 07:58:42 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226779]: [WARNING]  (226783) : Exiting Master process...
Nov 22 07:58:42 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226779]: [WARNING]  (226783) : Exiting Master process...
Nov 22 07:58:42 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226779]: [ALERT]    (226783) : Current worker (226785) exited with code 143 (Terminated)
Nov 22 07:58:42 compute-0 neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512[226779]: [WARNING]  (226783) : All workers exited. Exiting... (0)
Nov 22 07:58:42 compute-0 systemd[1]: libpod-3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497.scope: Deactivated successfully.
Nov 22 07:58:42 compute-0 podman[226811]: 2025-11-22 07:58:42.244598121 +0000 UTC m=+0.061001245 container died 3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:58:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497-userdata-shm.mount: Deactivated successfully.
Nov 22 07:58:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-19a865c40efdbfbfb001e40b4f92479c95623124744b5b7d802e34bd695f4b67-merged.mount: Deactivated successfully.
Nov 22 07:58:42 compute-0 podman[226811]: 2025-11-22 07:58:42.409859132 +0000 UTC m=+0.226262256 container cleanup 3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:58:42 compute-0 systemd[1]: libpod-conmon-3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497.scope: Deactivated successfully.
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.611 186548 DEBUG nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.611 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.611 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.611 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.612 186548 DEBUG nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] No waiting events found dispatching network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.612 186548 WARNING nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received unexpected event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 for instance with vm_state active and task_state deleting.
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.612 186548 DEBUG nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-unplugged-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.612 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.613 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.613 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.613 186548 DEBUG nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] No waiting events found dispatching network-vif-unplugged-99430038-cd5a-4b80-870d-a311ce070406 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.613 186548 DEBUG nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-unplugged-99430038-cd5a-4b80-870d-a311ce070406 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.613 186548 DEBUG nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.613 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.614 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.614 186548 DEBUG oslo_concurrency.lockutils [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.614 186548 DEBUG nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] No waiting events found dispatching network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.614 186548 WARNING nova.compute.manager [req-8af3b98f-cf9d-404e-bc02-7dcb4b28efda req-97a6dde7-90f4-4877-b112-8356244265df 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received unexpected event network-vif-plugged-99430038-cd5a-4b80-870d-a311ce070406 for instance with vm_state active and task_state deleting.
Nov 22 07:58:42 compute-0 podman[226842]: 2025-11-22 07:58:42.724525334 +0000 UTC m=+0.294654600 container remove 3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.729 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[de42a7fd-7b81-432b-8696-5c03bcf70272]: (4, ('Sat Nov 22 07:58:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 (3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497)\n3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497\nSat Nov 22 07:58:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 (3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497)\n3fef7c9ab5025a9896bdc211f07e6e7861148a7d018a6a28440da5475abca497\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.731 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[038fd231-32e0-40e3-9717-485762ffaf54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.732 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2eb8d1c0-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.734 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:42 compute-0 kernel: tap2eb8d1c0-a0: left promiscuous mode
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.747 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.749 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[334aa723-99a2-47a8-9237-f2b6c2426171]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.770 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2022cc72-42e7-4b09-83cd-3121e10d73c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.771 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[171efca1-acb9-4491-a951-e5141c8535bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.787 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c043294-70aa-47d9-ac12-fa9941ceb1a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506103, 'reachable_time': 33836, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226856, 'error': None, 'target': 'ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.789 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2eb8d1c0-aa5f-4b93-a282-8895c4dfc512 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:58:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:42.789 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ab66f6e6-f375-4a5f-b5bc-527851a8f29a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:42 compute-0 systemd[1]: run-netns-ovnmeta\x2d2eb8d1c0\x2daa5f\x2d4b93\x2da282\x2d8895c4dfc512.mount: Deactivated successfully.
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.852 186548 DEBUG nova.network.neutron [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.873 186548 INFO nova.compute.manager [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Took 0.87 seconds to deallocate network for instance.
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.935 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.935 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:42 compute-0 nova_compute[186544]: 2025-11-22 07:58:42.972 186548 DEBUG nova.compute.manager [req-dc7b4b1c-f7b0-4b61-9033-0901f0fa38b9 req-e3d07d77-bba3-41c6-8cb3-842340bfdfe7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Received event network-vif-deleted-99430038-cd5a-4b80-870d-a311ce070406 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:43 compute-0 nova_compute[186544]: 2025-11-22 07:58:43.101 186548 DEBUG nova.compute.provider_tree [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:58:43 compute-0 nova_compute[186544]: 2025-11-22 07:58:43.120 186548 DEBUG nova.scheduler.client.report [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:58:43 compute-0 nova_compute[186544]: 2025-11-22 07:58:43.149 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:43 compute-0 nova_compute[186544]: 2025-11-22 07:58:43.259 186548 INFO nova.scheduler.client.report [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Deleted allocations for instance e0454e3f-2082-4375-8eea-951bbcda9d18
Nov 22 07:58:43 compute-0 nova_compute[186544]: 2025-11-22 07:58:43.327 186548 DEBUG oslo_concurrency.lockutils [None req-30bbab12-b126-4092-85b2-de26e759d37c 567c8b1d5a18457a8101063d7929927c 2e831305a33c4281b08d9d9a531f0f05 - - default default] Lock "e0454e3f-2082-4375-8eea-951bbcda9d18" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.164 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.165 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.186 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.187 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.298 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.299 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.304 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.304 186548 INFO nova.compute.claims [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.371 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.372 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5653MB free_disk=73.31484985351562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.372 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.421 186548 DEBUG nova.compute.provider_tree [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.434 186548 DEBUG nova.scheduler.client.report [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.453 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.453 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.456 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.526 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.526 186548 DEBUG nova.network.neutron [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.542 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 967dbe07-d575-4894-aabd-483767dcd760 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.543 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.543 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.546 186548 INFO nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.574 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.591 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.612 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.632 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.632 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.675 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.676 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.677 186548 INFO nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Creating image(s)
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.677 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.677 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.678 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.690 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.750 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.752 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.752 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.765 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.819 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:44 compute-0 nova_compute[186544]: 2025-11-22 07:58:44.820 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.059 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk 1073741824" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.060 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.061 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.118 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.119 186548 DEBUG nova.virt.disk.api [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Checking if we can resize image /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.119 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.189 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.190 186548 DEBUG nova.virt.disk.api [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Cannot resize image /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.190 186548 DEBUG nova.objects.instance [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'migration_context' on Instance uuid 967dbe07-d575-4894-aabd-483767dcd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.202 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.202 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Ensure instance console log exists: /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.203 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.203 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.203 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.268 186548 DEBUG nova.policy [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd77b927940494160bce27934c565fda7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.580 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.631 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.632 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.656 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.880 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.881 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.898 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.982 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.983 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.989 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:58:45 compute-0 nova_compute[186544]: 2025-11-22 07:58:45.989 186548 INFO nova.compute.claims [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.161 186548 DEBUG nova.compute.provider_tree [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.179 186548 DEBUG nova.scheduler.client.report [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.203 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.204 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.207 186548 DEBUG nova.network.neutron [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Successfully created port: e3b10401-9ef6-4135-93b2-8c19486b866e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.271 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.271 186548 DEBUG nova.network.neutron [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.287 186548 INFO nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.305 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.412 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.413 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.413 186548 INFO nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Creating image(s)
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.414 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.414 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.414 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.427 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:46 compute-0 podman[226878]: 2025-11-22 07:58:46.452395281 +0000 UTC m=+0.088846169 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:58:46 compute-0 podman[226877]: 2025-11-22 07:58:46.452720069 +0000 UTC m=+0.091683409 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.502 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.503 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.504 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.517 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.580 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.582 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.603 186548 DEBUG nova.policy [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0d84421d986b40f481c0caef764443e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.726 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk 1073741824" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.728 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.729 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.794 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.795 186548 DEBUG nova.virt.disk.api [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Checking if we can resize image /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.795 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.849 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.850 186548 DEBUG nova.virt.disk.api [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Cannot resize image /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.851 186548 DEBUG nova.objects.instance [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'migration_context' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.863 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.863 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Ensure instance console log exists: /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.864 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.864 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.864 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:46 compute-0 nova_compute[186544]: 2025-11-22 07:58:46.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.308 186548 DEBUG nova.network.neutron [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Successfully updated port: e3b10401-9ef6-4135-93b2-8c19486b866e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.320 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.320 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquired lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.321 186548 DEBUG nova.network.neutron [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.411 186548 DEBUG nova.compute.manager [req-b74a4d33-bafe-490f-aac7-e2995a572a51 req-9dcd6068-5cba-46e5-89ea-4435b67f506c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-changed-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.412 186548 DEBUG nova.compute.manager [req-b74a4d33-bafe-490f-aac7-e2995a572a51 req-9dcd6068-5cba-46e5-89ea-4435b67f506c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Refreshing instance network info cache due to event network-changed-e3b10401-9ef6-4135-93b2-8c19486b866e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.412 186548 DEBUG oslo_concurrency.lockutils [req-b74a4d33-bafe-490f-aac7-e2995a572a51 req-9dcd6068-5cba-46e5-89ea-4435b67f506c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.502 186548 DEBUG nova.network.neutron [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Successfully created port: 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:58:47 compute-0 nova_compute[186544]: 2025-11-22 07:58:47.517 186548 DEBUG nova.network.neutron [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:58:48 compute-0 nova_compute[186544]: 2025-11-22 07:58:48.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:49 compute-0 nova_compute[186544]: 2025-11-22 07:58:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.140 186548 DEBUG nova.network.neutron [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updating instance_info_cache with network_info: [{"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.162 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Releasing lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.163 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Instance network_info: |[{"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.164 186548 DEBUG oslo_concurrency.lockutils [req-b74a4d33-bafe-490f-aac7-e2995a572a51 req-9dcd6068-5cba-46e5-89ea-4435b67f506c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.165 186548 DEBUG nova.network.neutron [req-b74a4d33-bafe-490f-aac7-e2995a572a51 req-9dcd6068-5cba-46e5-89ea-4435b67f506c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Refreshing network info cache for port e3b10401-9ef6-4135-93b2-8c19486b866e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.170 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Start _get_guest_xml network_info=[{"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.176 186548 WARNING nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.181 186548 DEBUG nova.virt.libvirt.host [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.182 186548 DEBUG nova.virt.libvirt.host [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.191 186548 DEBUG nova.virt.libvirt.host [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.192 186548 DEBUG nova.virt.libvirt.host [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.194 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.194 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.194 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.195 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.195 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.195 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.195 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.195 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.196 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.196 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.196 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.196 186548 DEBUG nova.virt.hardware [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.201 186548 DEBUG nova.virt.libvirt.vif [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-274106407',display_name='tempest-SecurityGroupsTestJSON-server-274106407',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-274106407',id=84,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-jtzr0xmm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-SecurityGroupsTestJSON-2135176549-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:44Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=967dbe07-d575-4894-aabd-483767dcd760,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.201 186548 DEBUG nova.network.os_vif_util [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.202 186548 DEBUG nova.network.os_vif_util [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.203 186548 DEBUG nova.objects.instance [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'pci_devices' on Instance uuid 967dbe07-d575-4894-aabd-483767dcd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.219 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <uuid>967dbe07-d575-4894-aabd-483767dcd760</uuid>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <name>instance-00000054</name>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <nova:name>tempest-SecurityGroupsTestJSON-server-274106407</nova:name>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:58:50</nova:creationTime>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:58:50 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:58:50 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:58:50 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:58:50 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:58:50 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:58:50 compute-0 nova_compute[186544]:         <nova:user uuid="d77b927940494160bce27934c565fda7">tempest-SecurityGroupsTestJSON-2135176549-project-member</nova:user>
Nov 22 07:58:50 compute-0 nova_compute[186544]:         <nova:project uuid="d2bcbcf3720f46be9fea7fc4685dfecd">tempest-SecurityGroupsTestJSON-2135176549</nova:project>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:58:50 compute-0 nova_compute[186544]:         <nova:port uuid="e3b10401-9ef6-4135-93b2-8c19486b866e">
Nov 22 07:58:50 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <system>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <entry name="serial">967dbe07-d575-4894-aabd-483767dcd760</entry>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <entry name="uuid">967dbe07-d575-4894-aabd-483767dcd760</entry>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </system>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <os>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   </os>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <features>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   </features>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.config"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:dd:c3:78"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <target dev="tape3b10401-9e"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/console.log" append="off"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <video>
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </video>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:58:50 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:58:50 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:58:50 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:58:50 compute-0 nova_compute[186544]: </domain>
Nov 22 07:58:50 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.221 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Preparing to wait for external event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.221 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.222 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.222 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.223 186548 DEBUG nova.virt.libvirt.vif [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-274106407',display_name='tempest-SecurityGroupsTestJSON-server-274106407',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-274106407',id=84,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-jtzr0xmm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-SecurityGroupsTestJSON-2135176549-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:44Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=967dbe07-d575-4894-aabd-483767dcd760,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.223 186548 DEBUG nova.network.os_vif_util [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.224 186548 DEBUG nova.network.os_vif_util [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.224 186548 DEBUG os_vif [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.225 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.225 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.226 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.229 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.229 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3b10401-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.229 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape3b10401-9e, col_values=(('external_ids', {'iface-id': 'e3b10401-9ef6-4135-93b2-8c19486b866e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:c3:78', 'vm-uuid': '967dbe07-d575-4894-aabd-483767dcd760'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:50 compute-0 NetworkManager[55036]: <info>  [1763798330.2315] manager: (tape3b10401-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.230 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.234 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.236 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.238 186548 INFO os_vif [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e')
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.258 186548 DEBUG nova.network.neutron [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Successfully updated port: 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.276 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.277 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.277 186548 DEBUG nova.network.neutron [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.474 186548 DEBUG nova.network.neutron [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.581 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.622 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.623 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.623 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] No VIF found with MAC fa:16:3e:dd:c3:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.623 186548 INFO nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Using config drive
Nov 22 07:58:50 compute-0 nova_compute[186544]: 2025-11-22 07:58:50.940 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.052 186548 INFO nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Creating config drive at /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.config
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.057 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmz8xi6r3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.182 186548 DEBUG oslo_concurrency.processutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmz8xi6r3" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:51 compute-0 kernel: tape3b10401-9e: entered promiscuous mode
Nov 22 07:58:51 compute-0 NetworkManager[55036]: <info>  [1763798331.2641] manager: (tape3b10401-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Nov 22 07:58:51 compute-0 ovn_controller[94843]: 2025-11-22T07:58:51Z|00340|binding|INFO|Claiming lport e3b10401-9ef6-4135-93b2-8c19486b866e for this chassis.
Nov 22 07:58:51 compute-0 ovn_controller[94843]: 2025-11-22T07:58:51Z|00341|binding|INFO|e3b10401-9ef6-4135-93b2-8c19486b866e: Claiming fa:16:3e:dd:c3:78 10.100.0.7
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.265 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.272 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.281 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:c3:78 10.100.0.7'], port_security=['fa:16:3e:dd:c3:78 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '967dbe07-d575-4894-aabd-483767dcd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '726ed215-2cc1-4cd0-860c-0d95ad883b6b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63d51e5f-a087-4eb1-a0c4-4a9ee7856c37, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e3b10401-9ef6-4135-93b2-8c19486b866e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.282 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e3b10401-9ef6-4135-93b2-8c19486b866e in datapath 9f740f05-d312-4e00-a27d-4d2a45e526b6 bound to our chassis
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.283 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f740f05-d312-4e00-a27d-4d2a45e526b6
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.299 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c1196b-809b-4344-8f61-3d1313206a7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.300 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f740f05-d1 in ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:58:51 compute-0 systemd-udevd[226967]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:58:51 compute-0 systemd-machined[152872]: New machine qemu-41-instance-00000054.
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.304 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f740f05-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.304 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9e012c34-082e-4136-80b4-a4d08dcc1855]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.305 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ee25ea91-ee4f-4b11-b01f-9bfc096c87d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.316 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[8c29c3fc-2ce1-452f-b10d-0e3ad8384099]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 NetworkManager[55036]: <info>  [1763798331.3189] device (tape3b10401-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:58:51 compute-0 NetworkManager[55036]: <info>  [1763798331.3201] device (tape3b10401-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.327 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 systemd[1]: Started Virtual Machine qemu-41-instance-00000054.
Nov 22 07:58:51 compute-0 ovn_controller[94843]: 2025-11-22T07:58:51Z|00342|binding|INFO|Setting lport e3b10401-9ef6-4135-93b2-8c19486b866e ovn-installed in OVS
Nov 22 07:58:51 compute-0 ovn_controller[94843]: 2025-11-22T07:58:51Z|00343|binding|INFO|Setting lport e3b10401-9ef6-4135-93b2-8c19486b866e up in Southbound
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.340 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2054091e-d99f-4237-a167-2d06652b0753]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.342 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 podman[226949]: 2025-11-22 07:58:51.343660089 +0000 UTC m=+0.088246605 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.365 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1f85b658-9ab5-481c-a8ba-b594e39b6a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 systemd-udevd[226974]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:58:51 compute-0 NetworkManager[55036]: <info>  [1763798331.3724] manager: (tap9f740f05-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/160)
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.371 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0ec932-93c6-4079-a628-f994d6b41f20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.405 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[01703339-86ee-4a30-9139-e62ab82c705b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.408 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ac61b221-14a3-4891-a180-bb1d8b07d8c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 NetworkManager[55036]: <info>  [1763798331.4317] device (tap9f740f05-d0): carrier: link connected
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.437 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5e777879-e601-41d4-9fbe-196ae973e5ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.456 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9e67df3d-9071-4eba-b68a-7c3e2d5c25c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f740f05-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:0d:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507206, 'reachable_time': 20723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227010, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.470 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f688e806-f802-446e-94a5-6a03e322beb3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe56:d1d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 507206, 'tstamp': 507206}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227011, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.487 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1cb4f2b3-5f7c-42b0-b0c2-f92d65450553]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f740f05-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:0d:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507206, 'reachable_time': 20723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227012, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.517 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[80ec95bc-2fe3-4a2f-a7b8-523cbd733353]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.569 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9b669ddc-0432-480d-bd09-4d68527d6dab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.571 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f740f05-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.571 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.571 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f740f05-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:51 compute-0 kernel: tap9f740f05-d0: entered promiscuous mode
Nov 22 07:58:51 compute-0 NetworkManager[55036]: <info>  [1763798331.5746] manager: (tap9f740f05-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.573 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.576 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f740f05-d0, col_values=(('external_ids', {'iface-id': 'a92e4d0c-d7b2-40f9-9251-db8a7ccb6b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:51 compute-0 ovn_controller[94843]: 2025-11-22T07:58:51Z|00344|binding|INFO|Releasing lport a92e4d0c-d7b2-40f9-9251-db8a7ccb6b31 from this chassis (sb_readonly=0)
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.591 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.592 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.593 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a6cbbb-920d-4974-9439-bc830badd636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.594 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-9f740f05-d312-4e00-a27d-4d2a45e526b6
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 9f740f05-d312-4e00-a27d-4d2a45e526b6
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:58:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:51.595 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'env', 'PROCESS_TAG=haproxy-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f740f05-d312-4e00-a27d-4d2a45e526b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.622 186548 DEBUG nova.compute.manager [req-546cb7cd-2372-40be-8843-9c8cb374ab15 req-6807a40d-b71e-43e3-97f5-721943f32fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.623 186548 DEBUG oslo_concurrency.lockutils [req-546cb7cd-2372-40be-8843-9c8cb374ab15 req-6807a40d-b71e-43e3-97f5-721943f32fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.623 186548 DEBUG oslo_concurrency.lockutils [req-546cb7cd-2372-40be-8843-9c8cb374ab15 req-6807a40d-b71e-43e3-97f5-721943f32fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.623 186548 DEBUG oslo_concurrency.lockutils [req-546cb7cd-2372-40be-8843-9c8cb374ab15 req-6807a40d-b71e-43e3-97f5-721943f32fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.623 186548 DEBUG nova.compute.manager [req-546cb7cd-2372-40be-8843-9c8cb374ab15 req-6807a40d-b71e-43e3-97f5-721943f32fe6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Processing event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.685 186548 DEBUG nova.network.neutron [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Updating instance_info_cache with network_info: [{"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.696 186548 DEBUG nova.network.neutron [req-b74a4d33-bafe-490f-aac7-e2995a572a51 req-9dcd6068-5cba-46e5-89ea-4435b67f506c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updated VIF entry in instance network info cache for port e3b10401-9ef6-4135-93b2-8c19486b866e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.697 186548 DEBUG nova.network.neutron [req-b74a4d33-bafe-490f-aac7-e2995a572a51 req-9dcd6068-5cba-46e5-89ea-4435b67f506c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updating instance_info_cache with network_info: [{"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.701 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.702 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance network_info: |[{"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.704 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Start _get_guest_xml network_info=[{"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.709 186548 DEBUG oslo_concurrency.lockutils [req-b74a4d33-bafe-490f-aac7-e2995a572a51 req-9dcd6068-5cba-46e5-89ea-4435b67f506c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.711 186548 WARNING nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.716 186548 DEBUG nova.virt.libvirt.host [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.716 186548 DEBUG nova.virt.libvirt.host [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.720 186548 DEBUG nova.virt.libvirt.host [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.721 186548 DEBUG nova.virt.libvirt.host [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.723 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.723 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.724 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.724 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.724 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.725 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.725 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.725 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.725 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.726 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.726 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.726 186548 DEBUG nova.virt.hardware [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.729 186548 DEBUG nova.virt.libvirt.vif [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-517382055',display_name='tempest-ServerStableDeviceRescueTest-server-517382055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-517382055',id=85,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-iqw87cv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:46Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=823521f6-eb37-4b78-982c-526e463c834f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.730 186548 DEBUG nova.network.os_vif_util [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.731 186548 DEBUG nova.network.os_vif_util [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:6a:75,bridge_name='br-int',has_traffic_filtering=True,id=1abe3f57-9262-4dc9-a0a8-4465e0cd0702,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1abe3f57-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.732 186548 DEBUG nova.objects.instance [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'pci_devices' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.743 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <uuid>823521f6-eb37-4b78-982c-526e463c834f</uuid>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <name>instance-00000055</name>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-517382055</nova:name>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:58:51</nova:creationTime>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:58:51 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:58:51 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:58:51 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:58:51 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:58:51 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:58:51 compute-0 nova_compute[186544]:         <nova:user uuid="0d84421d986b40f481c0caef764443e2">tempest-ServerStableDeviceRescueTest-455223381-project-member</nova:user>
Nov 22 07:58:51 compute-0 nova_compute[186544]:         <nova:project uuid="fd33c7e49baa4c7f9575824b348a0f23">tempest-ServerStableDeviceRescueTest-455223381</nova:project>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:58:51 compute-0 nova_compute[186544]:         <nova:port uuid="1abe3f57-9262-4dc9-a0a8-4465e0cd0702">
Nov 22 07:58:51 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <system>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <entry name="serial">823521f6-eb37-4b78-982c-526e463c834f</entry>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <entry name="uuid">823521f6-eb37-4b78-982c-526e463c834f</entry>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </system>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <os>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   </os>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <features>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   </features>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:27:6a:75"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <target dev="tap1abe3f57-92"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/console.log" append="off"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <video>
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </video>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:58:51 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:58:51 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:58:51 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:58:51 compute-0 nova_compute[186544]: </domain>
Nov 22 07:58:51 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.748 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Preparing to wait for external event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.749 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.749 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.749 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.750 186548 DEBUG nova.virt.libvirt.vif [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-517382055',display_name='tempest-ServerStableDeviceRescueTest-server-517382055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-517382055',id=85,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-iqw87cv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:58:46Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=823521f6-eb37-4b78-982c-526e463c834f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.750 186548 DEBUG nova.network.os_vif_util [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.751 186548 DEBUG nova.network.os_vif_util [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:6a:75,bridge_name='br-int',has_traffic_filtering=True,id=1abe3f57-9262-4dc9-a0a8-4465e0cd0702,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1abe3f57-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.752 186548 DEBUG os_vif [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:6a:75,bridge_name='br-int',has_traffic_filtering=True,id=1abe3f57-9262-4dc9-a0a8-4465e0cd0702,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1abe3f57-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.752 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.753 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.753 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.755 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.755 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1abe3f57-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.756 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1abe3f57-92, col_values=(('external_ids', {'iface-id': '1abe3f57-9262-4dc9-a0a8-4465e0cd0702', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:6a:75', 'vm-uuid': '823521f6-eb37-4b78-982c-526e463c834f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:51 compute-0 NetworkManager[55036]: <info>  [1763798331.7587] manager: (tap1abe3f57-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.758 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.764 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.765 186548 INFO os_vif [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:6a:75,bridge_name='br-int',has_traffic_filtering=True,id=1abe3f57-9262-4dc9-a0a8-4465e0cd0702,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1abe3f57-92')
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.812 186548 DEBUG nova.compute.manager [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-changed-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.813 186548 DEBUG nova.compute.manager [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Refreshing instance network info cache due to event network-changed-1abe3f57-9262-4dc9-a0a8-4465e0cd0702. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.813 186548 DEBUG oslo_concurrency.lockutils [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.814 186548 DEBUG oslo_concurrency.lockutils [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.814 186548 DEBUG nova.network.neutron [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Refreshing network info cache for port 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.973 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.973 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.973 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No VIF found with MAC fa:16:3e:27:6a:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:58:51 compute-0 nova_compute[186544]: 2025-11-22 07:58:51.974 186548 INFO nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Using config drive
Nov 22 07:58:52 compute-0 podman[227045]: 2025-11-22 07:58:51.920718506 +0000 UTC m=+0.020911166 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.498 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798332.4979944, 967dbe07-d575-4894-aabd-483767dcd760 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.499 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] VM Started (Lifecycle Event)
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.501 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.504 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.507 186548 INFO nova.virt.libvirt.driver [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Instance spawned successfully.
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.508 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.521 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.528 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.531 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.532 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.532 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.533 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.533 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.534 186548 DEBUG nova.virt.libvirt.driver [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.554 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.555 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798332.4981332, 967dbe07-d575-4894-aabd-483767dcd760 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.555 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] VM Paused (Lifecycle Event)
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.579 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.581 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798332.5034952, 967dbe07-d575-4894-aabd-483767dcd760 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.582 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] VM Resumed (Lifecycle Event)
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.585 186548 INFO nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Creating config drive at /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.590 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp76c6_j2i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.612 186548 INFO nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Took 7.94 seconds to spawn the instance on the hypervisor.
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.613 186548 DEBUG nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.640 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.644 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.665 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.687 186548 INFO nova.compute.manager [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Took 8.43 seconds to build instance.
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.702 186548 DEBUG oslo_concurrency.lockutils [None req-cd88b2ad-75f0-4e10-94f5-94a4385ccb20 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.717 186548 DEBUG oslo_concurrency.processutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp76c6_j2i" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:52 compute-0 kernel: tap1abe3f57-92: entered promiscuous mode
Nov 22 07:58:52 compute-0 ovn_controller[94843]: 2025-11-22T07:58:52Z|00345|binding|INFO|Claiming lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for this chassis.
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.782 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:52 compute-0 ovn_controller[94843]: 2025-11-22T07:58:52Z|00346|binding|INFO|1abe3f57-9262-4dc9-a0a8-4465e0cd0702: Claiming fa:16:3e:27:6a:75 10.100.0.6
Nov 22 07:58:52 compute-0 NetworkManager[55036]: <info>  [1763798332.7840] manager: (tap1abe3f57-92): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Nov 22 07:58:52 compute-0 systemd-udevd[226991]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:58:52 compute-0 NetworkManager[55036]: <info>  [1763798332.7995] device (tap1abe3f57-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:58:52 compute-0 NetworkManager[55036]: <info>  [1763798332.8006] device (tap1abe3f57-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:58:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:52.803 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:6a:75 10.100.0.6'], port_security=['fa:16:3e:27:6a:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '823521f6-eb37-4b78-982c-526e463c834f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '2', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1abe3f57-9262-4dc9-a0a8-4465e0cd0702) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:58:52 compute-0 systemd-machined[152872]: New machine qemu-42-instance-00000055.
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.839 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:52 compute-0 ovn_controller[94843]: 2025-11-22T07:58:52Z|00347|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 ovn-installed in OVS
Nov 22 07:58:52 compute-0 ovn_controller[94843]: 2025-11-22T07:58:52Z|00348|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 up in Southbound
Nov 22 07:58:52 compute-0 nova_compute[186544]: 2025-11-22 07:58:52.846 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:52 compute-0 systemd[1]: Started Virtual Machine qemu-42-instance-00000055.
Nov 22 07:58:53 compute-0 podman[227045]: 2025-11-22 07:58:53.150781609 +0000 UTC m=+1.250974249 container create a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:58:53 compute-0 systemd[1]: Started libpod-conmon-a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078.scope.
Nov 22 07:58:53 compute-0 podman[227071]: 2025-11-22 07:58:53.241940594 +0000 UTC m=+0.484381734 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 07:58:53 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:58:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c393a0a35db332ad2cb56372b68732057ab1abd5f77f5929790fbc457ab2b8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:58:53 compute-0 podman[227045]: 2025-11-22 07:58:53.376670313 +0000 UTC m=+1.476862983 container init a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:58:53 compute-0 podman[227045]: 2025-11-22 07:58:53.383456151 +0000 UTC m=+1.483648791 container start a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:58:53 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227113]: [NOTICE]   (227126) : New worker (227129) forked
Nov 22 07:58:53 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227113]: [NOTICE]   (227126) : Loading success.
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.441 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798333.4407656, 823521f6-eb37-4b78-982c-526e463c834f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.442 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] VM Started (Lifecycle Event)
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.460 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.465 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798333.4419897, 823521f6-eb37-4b78-982c-526e463c834f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.465 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] VM Paused (Lifecycle Event)
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.479 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.482 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.505 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.619 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.621 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.635 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[155adf78-8145-46cb-87e1-908b522a95e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.637 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap06e0f3a5-91 in ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.640 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap06e0f3a5-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.641 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4cdd3ad2-3331-4531-9d49-6ebde954dbd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.642 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb15252-cd04-4e73-b738-86b9b824d04c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.654 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ed76911c-f60e-4eae-89b9-ce8382fd8e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.670 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5ccf2fe7-53a4-4666-8230-ef77d81c3f9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.702 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[16c93a50-c40e-4a41-88e0-13d01f0eaf89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.708 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dc0824fb-9374-45b5-8950-b10f3552bdf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 NetworkManager[55036]: <info>  [1763798333.7095] manager: (tap06e0f3a5-90): new Veth device (/org/freedesktop/NetworkManager/Devices/164)
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.707 186548 DEBUG nova.compute.manager [req-a527037b-9aa3-48cf-9027-c0b835af36b2 req-5b95709b-3d2b-4880-8d0d-cb86929c5004 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.710 186548 DEBUG oslo_concurrency.lockutils [req-a527037b-9aa3-48cf-9027-c0b835af36b2 req-5b95709b-3d2b-4880-8d0d-cb86929c5004 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.711 186548 DEBUG oslo_concurrency.lockutils [req-a527037b-9aa3-48cf-9027-c0b835af36b2 req-5b95709b-3d2b-4880-8d0d-cb86929c5004 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.711 186548 DEBUG oslo_concurrency.lockutils [req-a527037b-9aa3-48cf-9027-c0b835af36b2 req-5b95709b-3d2b-4880-8d0d-cb86929c5004 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.711 186548 DEBUG nova.compute.manager [req-a527037b-9aa3-48cf-9027-c0b835af36b2 req-5b95709b-3d2b-4880-8d0d-cb86929c5004 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] No waiting events found dispatching network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.711 186548 WARNING nova.compute.manager [req-a527037b-9aa3-48cf-9027-c0b835af36b2 req-5b95709b-3d2b-4880-8d0d-cb86929c5004 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received unexpected event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e for instance with vm_state active and task_state None.
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.736 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[736900c0-8f82-45ef-9c95-37ed0f27f2fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.739 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[71b8abbc-f625-46f4-9a4f-8b15c16fa377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 NetworkManager[55036]: <info>  [1763798333.7603] device (tap06e0f3a5-90): carrier: link connected
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.765 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f52050-8eec-4bc5-a978-bf44f58cc8ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.784 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[117eaf90-831b-4325-b9fe-e0ba5e4d56e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507439, 'reachable_time': 24270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227148, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.805 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[422783f6-242b-4930-87b5-5aa966c48790]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:b7bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 507439, 'tstamp': 507439}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227149, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.822 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4e7a35-7d96-4366-ab9f-d2804a0af9f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507439, 'reachable_time': 24270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227150, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.849 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fd714646-2f45-4a61-aa92-4551af276c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.890 186548 DEBUG nova.compute.manager [req-14ad6adc-9d07-4141-b41d-3d525d74e815 req-b322113c-9756-4077-9b6e-624b9029e395 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.890 186548 DEBUG oslo_concurrency.lockutils [req-14ad6adc-9d07-4141-b41d-3d525d74e815 req-b322113c-9756-4077-9b6e-624b9029e395 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.891 186548 DEBUG oslo_concurrency.lockutils [req-14ad6adc-9d07-4141-b41d-3d525d74e815 req-b322113c-9756-4077-9b6e-624b9029e395 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.891 186548 DEBUG oslo_concurrency.lockutils [req-14ad6adc-9d07-4141-b41d-3d525d74e815 req-b322113c-9756-4077-9b6e-624b9029e395 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.891 186548 DEBUG nova.compute.manager [req-14ad6adc-9d07-4141-b41d-3d525d74e815 req-b322113c-9756-4077-9b6e-624b9029e395 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Processing event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.892 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.896 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798333.8959417, 823521f6-eb37-4b78-982c-526e463c834f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.897 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] VM Resumed (Lifecycle Event)
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.899 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.903 186548 INFO nova.virt.libvirt.driver [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance spawned successfully.
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.904 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.913 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[65c05570-af92-4b16-a368-2d6853993384]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.914 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.915 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.915 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.915 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:53 compute-0 NetworkManager[55036]: <info>  [1763798333.9186] manager: (tap06e0f3a5-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Nov 22 07:58:53 compute-0 kernel: tap06e0f3a5-90: entered promiscuous mode
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.920 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.921 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:58:53 compute-0 ovn_controller[94843]: 2025-11-22T07:58:53Z|00349|binding|INFO|Releasing lport 465da2c0-9a1c-41a9-be9a-d10bcbd7a813 from this chassis (sb_readonly=0)
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.923 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.934 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.938 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.939 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9513a5a1-e66d-44ff-af19-f8a3a43e37b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.940 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.940 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.940 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.941 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.941 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.942 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:58:53.942 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'env', 'PROCESS_TAG=haproxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.942 186548 DEBUG nova.virt.libvirt.driver [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:58:53 compute-0 nova_compute[186544]: 2025-11-22 07:58:53.961 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:58:54 compute-0 nova_compute[186544]: 2025-11-22 07:58:54.046 186548 INFO nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Took 7.63 seconds to spawn the instance on the hypervisor.
Nov 22 07:58:54 compute-0 nova_compute[186544]: 2025-11-22 07:58:54.047 186548 DEBUG nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:54 compute-0 nova_compute[186544]: 2025-11-22 07:58:54.192 186548 INFO nova.compute.manager [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Took 8.23 seconds to build instance.
Nov 22 07:58:54 compute-0 nova_compute[186544]: 2025-11-22 07:58:54.214 186548 DEBUG oslo_concurrency.lockutils [None req-2dddcf5d-9e42-4df2-afae-37fd0e64c8a8 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:54 compute-0 podman[227182]: 2025-11-22 07:58:54.335118376 +0000 UTC m=+0.028493713 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:58:54 compute-0 podman[227182]: 2025-11-22 07:58:54.438483592 +0000 UTC m=+0.131858929 container create 6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 07:58:54 compute-0 systemd[1]: Started libpod-conmon-6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f.scope.
Nov 22 07:58:54 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:58:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/026e5abe48735cd4ac24a9340e911192d2cb399737c74f3129666361318f3e21/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:58:54 compute-0 podman[227182]: 2025-11-22 07:58:54.577183699 +0000 UTC m=+0.270559056 container init 6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:58:54 compute-0 podman[227182]: 2025-11-22 07:58:54.584241323 +0000 UTC m=+0.277616660 container start 6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 07:58:54 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227197]: [NOTICE]   (227202) : New worker (227204) forked
Nov 22 07:58:54 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227197]: [NOTICE]   (227202) : Loading success.
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.276 186548 DEBUG nova.network.neutron [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Updated VIF entry in instance network info cache for port 1abe3f57-9262-4dc9-a0a8-4465e0cd0702. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.277 186548 DEBUG nova.network.neutron [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Updating instance_info_cache with network_info: [{"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.300 186548 DEBUG oslo_concurrency.lockutils [req-09c2c71b-f033-418c-aa75-abe53d50bbce req-d3ac9b4e-cb86-4e6c-860d-8a4bdfdb64f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.584 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.988 186548 DEBUG nova.compute.manager [req-b8fec8f9-b086-4c55-b842-3a71234e6b66 req-49016b26-5089-44a7-81c0-bff9aa01b47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.988 186548 DEBUG oslo_concurrency.lockutils [req-b8fec8f9-b086-4c55-b842-3a71234e6b66 req-49016b26-5089-44a7-81c0-bff9aa01b47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.988 186548 DEBUG oslo_concurrency.lockutils [req-b8fec8f9-b086-4c55-b842-3a71234e6b66 req-49016b26-5089-44a7-81c0-bff9aa01b47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.989 186548 DEBUG oslo_concurrency.lockutils [req-b8fec8f9-b086-4c55-b842-3a71234e6b66 req-49016b26-5089-44a7-81c0-bff9aa01b47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.989 186548 DEBUG nova.compute.manager [req-b8fec8f9-b086-4c55-b842-3a71234e6b66 req-49016b26-5089-44a7-81c0-bff9aa01b47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:58:55 compute-0 nova_compute[186544]: 2025-11-22 07:58:55.989 186548 WARNING nova.compute.manager [req-b8fec8f9-b086-4c55-b842-3a71234e6b66 req-49016b26-5089-44a7-81c0-bff9aa01b47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state active and task_state None.
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.179 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.464 186548 DEBUG nova.compute.manager [req-9283d923-ad11-4dde-a930-eaff5b1a642b req-9e94ba44-6491-4e24-986e-a9b76276ae41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-changed-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.464 186548 DEBUG nova.compute.manager [req-9283d923-ad11-4dde-a930-eaff5b1a642b req-9e94ba44-6491-4e24-986e-a9b76276ae41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Refreshing instance network info cache due to event network-changed-e3b10401-9ef6-4135-93b2-8c19486b866e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.464 186548 DEBUG oslo_concurrency.lockutils [req-9283d923-ad11-4dde-a930-eaff5b1a642b req-9e94ba44-6491-4e24-986e-a9b76276ae41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.464 186548 DEBUG oslo_concurrency.lockutils [req-9283d923-ad11-4dde-a930-eaff5b1a642b req-9e94ba44-6491-4e24-986e-a9b76276ae41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.465 186548 DEBUG nova.network.neutron [req-9283d923-ad11-4dde-a930-eaff5b1a642b req-9e94ba44-6491-4e24-986e-a9b76276ae41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Refreshing network info cache for port e3b10401-9ef6-4135-93b2-8c19486b866e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.577 186548 DEBUG nova.compute.manager [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.674 186548 INFO nova.compute.manager [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] instance snapshotting
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.757 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.912 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798321.9116442, e0454e3f-2082-4375-8eea-951bbcda9d18 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.913 186548 INFO nova.compute.manager [-] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] VM Stopped (Lifecycle Event)
Nov 22 07:58:56 compute-0 nova_compute[186544]: 2025-11-22 07:58:56.931 186548 DEBUG nova.compute.manager [None req-369a77e7-a98f-4180-b0c5-8fcc59aaa888 - - - - - -] [instance: e0454e3f-2082-4375-8eea-951bbcda9d18] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.020 186548 INFO nova.virt.libvirt.driver [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Beginning live snapshot process
Nov 22 07:58:57 compute-0 virtqemud[186092]: invalid argument: disk vda does not have an active block job
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.277 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.340 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.342 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.364 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.365 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.365 186548 INFO nova.compute.manager [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Rebooting instance
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.378 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.404 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.418 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.477 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.478 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpqzt7t6zj/21058e0d5377472dbb15ed636e5c08cd.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.518 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpqzt7t6zj/21058e0d5377472dbb15ed636e5c08cd.delta 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.519 186548 INFO nova.virt.libvirt.driver [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.581 186548 DEBUG nova.virt.libvirt.guest [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.585 186548 INFO nova.virt.libvirt.driver [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.639 186548 DEBUG nova.privsep.utils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:58:57 compute-0 nova_compute[186544]: 2025-11-22 07:58:57.640 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpqzt7t6zj/21058e0d5377472dbb15ed636e5c08cd.delta /var/lib/nova/instances/snapshots/tmpqzt7t6zj/21058e0d5377472dbb15ed636e5c08cd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:58:58 compute-0 nova_compute[186544]: 2025-11-22 07:58:58.399 186548 DEBUG nova.network.neutron [req-9283d923-ad11-4dde-a930-eaff5b1a642b req-9e94ba44-6491-4e24-986e-a9b76276ae41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updated VIF entry in instance network info cache for port e3b10401-9ef6-4135-93b2-8c19486b866e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:58:58 compute-0 nova_compute[186544]: 2025-11-22 07:58:58.400 186548 DEBUG nova.network.neutron [req-9283d923-ad11-4dde-a930-eaff5b1a642b req-9e94ba44-6491-4e24-986e-a9b76276ae41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updating instance_info_cache with network_info: [{"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:58:58 compute-0 nova_compute[186544]: 2025-11-22 07:58:58.484 186548 DEBUG oslo_concurrency.lockutils [req-9283d923-ad11-4dde-a930-eaff5b1a642b req-9e94ba44-6491-4e24-986e-a9b76276ae41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:58:58 compute-0 nova_compute[186544]: 2025-11-22 07:58:58.488 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquired lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:58:58 compute-0 nova_compute[186544]: 2025-11-22 07:58:58.489 186548 DEBUG nova.network.neutron [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:58:58 compute-0 nova_compute[186544]: 2025-11-22 07:58:58.583 186548 DEBUG oslo_concurrency.processutils [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpqzt7t6zj/21058e0d5377472dbb15ed636e5c08cd.delta /var/lib/nova/instances/snapshots/tmpqzt7t6zj/21058e0d5377472dbb15ed636e5c08cd" returned: 0 in 0.943s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:58:58 compute-0 nova_compute[186544]: 2025-11-22 07:58:58.584 186548 INFO nova.virt.libvirt.driver [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Snapshot extracted, beginning image upload
Nov 22 07:58:59 compute-0 podman[227239]: 2025-11-22 07:58:59.413132725 +0000 UTC m=+0.055184571 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.335 186548 DEBUG nova.network.neutron [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updating instance_info_cache with network_info: [{"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.348 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Releasing lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.356 186548 DEBUG nova.compute.manager [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.586 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:00 compute-0 kernel: tape3b10401-9e (unregistering): left promiscuous mode
Nov 22 07:59:00 compute-0 NetworkManager[55036]: <info>  [1763798340.6725] device (tape3b10401-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:59:00 compute-0 ovn_controller[94843]: 2025-11-22T07:59:00Z|00350|binding|INFO|Releasing lport e3b10401-9ef6-4135-93b2-8c19486b866e from this chassis (sb_readonly=0)
Nov 22 07:59:00 compute-0 ovn_controller[94843]: 2025-11-22T07:59:00Z|00351|binding|INFO|Setting lport e3b10401-9ef6-4135-93b2-8c19486b866e down in Southbound
Nov 22 07:59:00 compute-0 ovn_controller[94843]: 2025-11-22T07:59:00Z|00352|binding|INFO|Removing iface tape3b10401-9e ovn-installed in OVS
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.681 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.693 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.695 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.701 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:c3:78 10.100.0.7'], port_security=['fa:16:3e:dd:c3:78 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '967dbe07-d575-4894-aabd-483767dcd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '358cb1ee-ee2e-4598-85d3-617cef2413d3 726ed215-2cc1-4cd0-860c-0d95ad883b6b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63d51e5f-a087-4eb1-a0c4-4a9ee7856c37, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e3b10401-9ef6-4135-93b2-8c19486b866e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.703 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e3b10401-9ef6-4135-93b2-8c19486b866e in datapath 9f740f05-d312-4e00-a27d-4d2a45e526b6 unbound from our chassis
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.704 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f740f05-d312-4e00-a27d-4d2a45e526b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.705 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2f38ff-9ff9-4119-8dbc-f81442427d3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.705 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 namespace which is not needed anymore
Nov 22 07:59:00 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000054.scope: Deactivated successfully.
Nov 22 07:59:00 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000054.scope: Consumed 9.229s CPU time.
Nov 22 07:59:00 compute-0 systemd-machined[152872]: Machine qemu-41-instance-00000054 terminated.
Nov 22 07:59:00 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227113]: [NOTICE]   (227126) : haproxy version is 2.8.14-c23fe91
Nov 22 07:59:00 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227113]: [NOTICE]   (227126) : path to executable is /usr/sbin/haproxy
Nov 22 07:59:00 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227113]: [WARNING]  (227126) : Exiting Master process...
Nov 22 07:59:00 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227113]: [ALERT]    (227126) : Current worker (227129) exited with code 143 (Terminated)
Nov 22 07:59:00 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227113]: [WARNING]  (227126) : All workers exited. Exiting... (0)
Nov 22 07:59:00 compute-0 systemd[1]: libpod-a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078.scope: Deactivated successfully.
Nov 22 07:59:00 compute-0 conmon[227113]: conmon a178f49609f63f502044 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078.scope/container/memory.events
Nov 22 07:59:00 compute-0 podman[227280]: 2025-11-22 07:59:00.835486105 +0000 UTC m=+0.043245616 container died a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 07:59:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078-userdata-shm.mount: Deactivated successfully.
Nov 22 07:59:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-9c393a0a35db332ad2cb56372b68732057ab1abd5f77f5929790fbc457ab2b8d-merged.mount: Deactivated successfully.
Nov 22 07:59:00 compute-0 podman[227280]: 2025-11-22 07:59:00.884423511 +0000 UTC m=+0.092183012 container cleanup a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 07:59:00 compute-0 systemd[1]: libpod-conmon-a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078.scope: Deactivated successfully.
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.897 186548 DEBUG nova.compute.manager [req-a33f20aa-9c60-44c3-9acb-04e0d9d4c9b3 req-41da3cf7-54a4-4b68-b293-b033c4f84686 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-unplugged-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.898 186548 DEBUG oslo_concurrency.lockutils [req-a33f20aa-9c60-44c3-9acb-04e0d9d4c9b3 req-41da3cf7-54a4-4b68-b293-b033c4f84686 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.898 186548 DEBUG oslo_concurrency.lockutils [req-a33f20aa-9c60-44c3-9acb-04e0d9d4c9b3 req-41da3cf7-54a4-4b68-b293-b033c4f84686 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.899 186548 DEBUG oslo_concurrency.lockutils [req-a33f20aa-9c60-44c3-9acb-04e0d9d4c9b3 req-41da3cf7-54a4-4b68-b293-b033c4f84686 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.899 186548 DEBUG nova.compute.manager [req-a33f20aa-9c60-44c3-9acb-04e0d9d4c9b3 req-41da3cf7-54a4-4b68-b293-b033c4f84686 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] No waiting events found dispatching network-vif-unplugged-e3b10401-9ef6-4135-93b2-8c19486b866e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.899 186548 WARNING nova.compute.manager [req-a33f20aa-9c60-44c3-9acb-04e0d9d4c9b3 req-41da3cf7-54a4-4b68-b293-b033c4f84686 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received unexpected event network-vif-unplugged-e3b10401-9ef6-4135-93b2-8c19486b866e for instance with vm_state active and task_state reboot_started_hard.
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.913 186548 INFO nova.virt.libvirt.driver [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Instance destroyed successfully.
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.914 186548 DEBUG nova.objects.instance [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'resources' on Instance uuid 967dbe07-d575-4894-aabd-483767dcd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.924 186548 DEBUG nova.virt.libvirt.vif [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-274106407',display_name='tempest-SecurityGroupsTestJSON-server-274106407',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-274106407',id=84,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-jtzr0xmm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-SecurityGroupsTestJSON-2135176549-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:59:00Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=967dbe07-d575-4894-aabd-483767dcd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.925 186548 DEBUG nova.network.os_vif_util [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.926 186548 DEBUG nova.network.os_vif_util [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.926 186548 DEBUG os_vif [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.929 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.929 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3b10401-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.936 186548 INFO os_vif [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e')
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.942 186548 DEBUG nova.virt.libvirt.driver [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Start _get_guest_xml network_info=[{"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.946 186548 WARNING nova.virt.libvirt.driver [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.955 186548 DEBUG nova.virt.libvirt.host [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.956 186548 DEBUG nova.virt.libvirt.host [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.960 186548 DEBUG nova.virt.libvirt.host [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.960 186548 DEBUG nova.virt.libvirt.host [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.962 186548 DEBUG nova.virt.libvirt.driver [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.962 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.963 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.963 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.963 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.963 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.964 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:59:00 compute-0 podman[227319]: 2025-11-22 07:59:00.964111644 +0000 UTC m=+0.049104951 container remove a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.964 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.965 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.965 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.965 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.966 186548 DEBUG nova.virt.hardware [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.966 186548 DEBUG nova.objects.instance [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'vcpu_model' on Instance uuid 967dbe07-d575-4894-aabd-483767dcd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.972 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[053810ea-da42-4c4c-9dbe-4b399d13202b]: (4, ('Sat Nov 22 07:59:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 (a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078)\na178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078\nSat Nov 22 07:59:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 (a178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078)\na178f49609f63f50204416423f23b53fce7926f73eccb1c0db659d8f7fcee078\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.974 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[585e83df-602f-41a7-bd17-24d514126306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.975 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f740f05-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:00 compute-0 kernel: tap9f740f05-d0: left promiscuous mode
Nov 22 07:59:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:00.992 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[17b48876-95b4-4872-bc71-1fd0487d213d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.991 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:00 compute-0 nova_compute[186544]: 2025-11-22 07:59:00.996 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.007 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f37340e0-f850-4546-a8b3-288ce5ff3994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.011 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5c792f4f-28ff-4550-9c89-30ba74b58568]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.023 186548 INFO nova.virt.libvirt.driver [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Snapshot image upload complete
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.024 186548 INFO nova.compute.manager [None req-44b6893b-56ef-4f87-b998-da152c147fb0 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Took 4.34 seconds to snapshot the instance on the hypervisor.
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.027 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[92c6f05e-3ce8-4f55-b5ae-1ef26e4f2007]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507199, 'reachable_time': 18725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227337, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d9f740f05\x2dd312\x2d4e00\x2da27d\x2d4d2a45e526b6.mount: Deactivated successfully.
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.033 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.033 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[74c6d596-7c49-49cd-bcdf-ceb158e674f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.057 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.057 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.058 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.059 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.060 186548 DEBUG nova.virt.libvirt.vif [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-274106407',display_name='tempest-SecurityGroupsTestJSON-server-274106407',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-274106407',id=84,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-jtzr0xmm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-SecurityGroupsTestJSON-2135176549-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:59:00Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=967dbe07-d575-4894-aabd-483767dcd760,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.060 186548 DEBUG nova.network.os_vif_util [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.061 186548 DEBUG nova.network.os_vif_util [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.062 186548 DEBUG nova.objects.instance [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'pci_devices' on Instance uuid 967dbe07-d575-4894-aabd-483767dcd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.123 186548 DEBUG nova.virt.libvirt.driver [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <uuid>967dbe07-d575-4894-aabd-483767dcd760</uuid>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <name>instance-00000054</name>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <nova:name>tempest-SecurityGroupsTestJSON-server-274106407</nova:name>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:59:00</nova:creationTime>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:59:01 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:59:01 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:59:01 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:59:01 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:59:01 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:59:01 compute-0 nova_compute[186544]:         <nova:user uuid="d77b927940494160bce27934c565fda7">tempest-SecurityGroupsTestJSON-2135176549-project-member</nova:user>
Nov 22 07:59:01 compute-0 nova_compute[186544]:         <nova:project uuid="d2bcbcf3720f46be9fea7fc4685dfecd">tempest-SecurityGroupsTestJSON-2135176549</nova:project>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:59:01 compute-0 nova_compute[186544]:         <nova:port uuid="e3b10401-9ef6-4135-93b2-8c19486b866e">
Nov 22 07:59:01 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <system>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <entry name="serial">967dbe07-d575-4894-aabd-483767dcd760</entry>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <entry name="uuid">967dbe07-d575-4894-aabd-483767dcd760</entry>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </system>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <os>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   </os>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <features>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   </features>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk.config"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:dd:c3:78"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <target dev="tape3b10401-9e"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/console.log" append="off"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <video>
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </video>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:59:01 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:59:01 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:59:01 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:59:01 compute-0 nova_compute[186544]: </domain>
Nov 22 07:59:01 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.124 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.180 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.181 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.198 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.236 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.237 186548 DEBUG nova.objects.instance [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'trusted_certs' on Instance uuid 967dbe07-d575-4894-aabd-483767dcd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.248 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.302 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.303 186548 DEBUG nova.virt.disk.api [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Checking if we can resize image /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.304 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.363 186548 DEBUG oslo_concurrency.processutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.364 186548 DEBUG nova.virt.disk.api [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Cannot resize image /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.365 186548 DEBUG nova.objects.instance [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'migration_context' on Instance uuid 967dbe07-d575-4894-aabd-483767dcd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.386 186548 DEBUG nova.virt.libvirt.vif [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-274106407',display_name='tempest-SecurityGroupsTestJSON-server-274106407',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-274106407',id=84,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-jtzr0xmm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-SecurityGroupsTestJSON-2135176549-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:00Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=967dbe07-d575-4894-aabd-483767dcd760,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.386 186548 DEBUG nova.network.os_vif_util [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.387 186548 DEBUG nova.network.os_vif_util [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.388 186548 DEBUG os_vif [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.389 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.390 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.392 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.392 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3b10401-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.393 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape3b10401-9e, col_values=(('external_ids', {'iface-id': 'e3b10401-9ef6-4135-93b2-8c19486b866e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:c3:78', 'vm-uuid': '967dbe07-d575-4894-aabd-483767dcd760'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.394 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 NetworkManager[55036]: <info>  [1763798341.3956] manager: (tape3b10401-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.398 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.399 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.400 186548 INFO os_vif [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e')
Nov 22 07:59:01 compute-0 systemd-udevd[227262]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:59:01 compute-0 kernel: tape3b10401-9e: entered promiscuous mode
Nov 22 07:59:01 compute-0 NetworkManager[55036]: <info>  [1763798341.4746] manager: (tape3b10401-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 ovn_controller[94843]: 2025-11-22T07:59:01Z|00353|binding|INFO|Claiming lport e3b10401-9ef6-4135-93b2-8c19486b866e for this chassis.
Nov 22 07:59:01 compute-0 ovn_controller[94843]: 2025-11-22T07:59:01Z|00354|binding|INFO|e3b10401-9ef6-4135-93b2-8c19486b866e: Claiming fa:16:3e:dd:c3:78 10.100.0.7
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.482 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:c3:78 10.100.0.7'], port_security=['fa:16:3e:dd:c3:78 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '967dbe07-d575-4894-aabd-483767dcd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '358cb1ee-ee2e-4598-85d3-617cef2413d3 726ed215-2cc1-4cd0-860c-0d95ad883b6b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63d51e5f-a087-4eb1-a0c4-4a9ee7856c37, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e3b10401-9ef6-4135-93b2-8c19486b866e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.484 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e3b10401-9ef6-4135-93b2-8c19486b866e in datapath 9f740f05-d312-4e00-a27d-4d2a45e526b6 bound to our chassis
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.485 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f740f05-d312-4e00-a27d-4d2a45e526b6
Nov 22 07:59:01 compute-0 ovn_controller[94843]: 2025-11-22T07:59:01Z|00355|binding|INFO|Setting lport e3b10401-9ef6-4135-93b2-8c19486b866e ovn-installed in OVS
Nov 22 07:59:01 compute-0 NetworkManager[55036]: <info>  [1763798341.4879] device (tape3b10401-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:59:01 compute-0 ovn_controller[94843]: 2025-11-22T07:59:01Z|00356|binding|INFO|Setting lport e3b10401-9ef6-4135-93b2-8c19486b866e up in Southbound
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.488 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 NetworkManager[55036]: <info>  [1763798341.4895] device (tape3b10401-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.491 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.495 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c0863c2e-b249-4b20-985c-091742dbe71e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.496 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f740f05-d1 in ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.499 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f740f05-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.499 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b20d245c-b351-4f57-b8b6-2a41290b06db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.500 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9335e3f0-42e5-4fd4-b7e9-305f22e1f07b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.511 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[07103047-41e3-46e1-afc2-daa0228d590f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 systemd-machined[152872]: New machine qemu-43-instance-00000054.
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.527 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c94521-d7ff-4d81-be62-3e14e4827986]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 systemd[1]: Started Virtual Machine qemu-43-instance-00000054.
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.555 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ed150c02-2291-4b8c-8001-d849894512a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.560 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[03f6fb0b-229d-4b13-bcd5-c083c2093ccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 NetworkManager[55036]: <info>  [1763798341.5619] manager: (tap9f740f05-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/168)
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.606 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[79e8d660-66e2-44f0-918a-0370bbab6a2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.609 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b7a52e-7e57-4757-aaed-e085d87f1be7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 NetworkManager[55036]: <info>  [1763798341.6328] device (tap9f740f05-d0): carrier: link connected
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.638 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6642f2ef-3fec-48fb-ad67-97de32cc6b55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.655 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[46f60947-61c5-4185-9e31-3edebe8220ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f740f05-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:0d:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508227, 'reachable_time': 43352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227399, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.667 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ba1326-5b79-4c27-9676-11bf9332b0ed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe56:d1d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508227, 'tstamp': 508227}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227400, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.683 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[971a2f59-c6a1-4fb0-99d2-cf195d993279]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f740f05-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:56:0d:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508227, 'reachable_time': 43352, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227401, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.710 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[caacf9c9-3bce-4583-b339-b4eec7fd7ed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.762 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7835093d-bd79-4fd1-927c-3ed6349d004d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.764 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f740f05-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.764 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.764 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f740f05-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.766 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 NetworkManager[55036]: <info>  [1763798341.7670] manager: (tap9f740f05-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Nov 22 07:59:01 compute-0 kernel: tap9f740f05-d0: entered promiscuous mode
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.769 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f740f05-d0, col_values=(('external_ids', {'iface-id': 'a92e4d0c-d7b2-40f9-9251-db8a7ccb6b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.771 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 ovn_controller[94843]: 2025-11-22T07:59:01Z|00357|binding|INFO|Releasing lport a92e4d0c-d7b2-40f9-9251-db8a7ccb6b31 from this chassis (sb_readonly=0)
Nov 22 07:59:01 compute-0 nova_compute[186544]: 2025-11-22 07:59:01.782 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.783 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.784 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[da28696b-ea41-43e6-8627-5002f8d76506]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.785 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-9f740f05-d312-4e00-a27d-4d2a45e526b6
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/9f740f05-d312-4e00-a27d-4d2a45e526b6.pid.haproxy
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 9f740f05-d312-4e00-a27d-4d2a45e526b6
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:59:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:01.787 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'env', 'PROCESS_TAG=haproxy-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f740f05-d312-4e00-a27d-4d2a45e526b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.056 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 967dbe07-d575-4894-aabd-483767dcd760 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.057 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798342.0558531, 967dbe07-d575-4894-aabd-483767dcd760 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.057 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] VM Resumed (Lifecycle Event)
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.059 186548 DEBUG nova.compute.manager [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.063 186548 INFO nova.virt.libvirt.driver [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Instance rebooted successfully.
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.063 186548 DEBUG nova.compute.manager [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.086 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.090 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.123 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.124 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798342.0568254, 967dbe07-d575-4894-aabd-483767dcd760 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.124 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] VM Started (Lifecycle Event)
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.138 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.140 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:02 compute-0 podman[227440]: 2025-11-22 07:59:02.147000315 +0000 UTC m=+0.064287785 container create dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 07:59:02 compute-0 systemd[1]: Started libpod-conmon-dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b.scope.
Nov 22 07:59:02 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.203 186548 DEBUG oslo_concurrency.lockutils [None req-7669d9df-bce4-41d0-99db-3d135204847b d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 4.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a50312c1e40fe87fc41d0358c8ef83519ebd4be313b31e02940ead55ef335e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:59:02 compute-0 podman[227440]: 2025-11-22 07:59:02.114781451 +0000 UTC m=+0.032068941 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:59:02 compute-0 podman[227440]: 2025-11-22 07:59:02.225036787 +0000 UTC m=+0.142324277 container init dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 07:59:02 compute-0 podman[227440]: 2025-11-22 07:59:02.2308319 +0000 UTC m=+0.148119370 container start dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:59:02 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227455]: [NOTICE]   (227459) : New worker (227461) forked
Nov 22 07:59:02 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227455]: [NOTICE]   (227459) : Loading success.
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.436 186548 INFO nova.compute.manager [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Rescuing
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.437 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.437 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.438 186548 DEBUG nova.network.neutron [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.989 186548 DEBUG nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.989 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.989 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.990 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.990 186548 DEBUG nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] No waiting events found dispatching network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.990 186548 WARNING nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received unexpected event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e for instance with vm_state active and task_state None.
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.991 186548 DEBUG nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.991 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.991 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.991 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.992 186548 DEBUG nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] No waiting events found dispatching network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.992 186548 WARNING nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received unexpected event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e for instance with vm_state active and task_state None.
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.992 186548 DEBUG nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.992 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.993 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.993 186548 DEBUG oslo_concurrency.lockutils [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.993 186548 DEBUG nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] No waiting events found dispatching network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:02 compute-0 nova_compute[186544]: 2025-11-22 07:59:02.994 186548 WARNING nova.compute.manager [req-b24b4b02-15db-43e0-85b5-63928a8659cd req-848e91b9-f17e-4cf3-b09c-4fb8e713700f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received unexpected event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e for instance with vm_state active and task_state None.
Nov 22 07:59:03 compute-0 podman[227470]: 2025-11-22 07:59:03.410356309 +0000 UTC m=+0.056177655 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 07:59:03 compute-0 nova_compute[186544]: 2025-11-22 07:59:03.435 186548 DEBUG nova.network.neutron [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Updating instance_info_cache with network_info: [{"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:03 compute-0 nova_compute[186544]: 2025-11-22 07:59:03.468 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:59:03 compute-0 nova_compute[186544]: 2025-11-22 07:59:03.691 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 07:59:05 compute-0 podman[227492]: 2025-11-22 07:59:05.430186077 +0000 UTC m=+0.080809611 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, 
container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 07:59:05 compute-0 nova_compute[186544]: 2025-11-22 07:59:05.587 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.340 186548 DEBUG nova.compute.manager [req-50d2866f-be2d-425a-ba13-39674ad96eaf req-6cf4193d-3568-4990-a0e6-8753641a9e5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-changed-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.340 186548 DEBUG nova.compute.manager [req-50d2866f-be2d-425a-ba13-39674ad96eaf req-6cf4193d-3568-4990-a0e6-8753641a9e5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Refreshing instance network info cache due to event network-changed-e3b10401-9ef6-4135-93b2-8c19486b866e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.341 186548 DEBUG oslo_concurrency.lockutils [req-50d2866f-be2d-425a-ba13-39674ad96eaf req-6cf4193d-3568-4990-a0e6-8753641a9e5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.341 186548 DEBUG oslo_concurrency.lockutils [req-50d2866f-be2d-425a-ba13-39674ad96eaf req-6cf4193d-3568-4990-a0e6-8753641a9e5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.341 186548 DEBUG nova.network.neutron [req-50d2866f-be2d-425a-ba13-39674ad96eaf req-6cf4193d-3568-4990-a0e6-8753641a9e5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Refreshing network info cache for port e3b10401-9ef6-4135-93b2-8c19486b866e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.395 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.470 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.471 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.472 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.473 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.473 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.486 186548 INFO nova.compute.manager [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Terminating instance
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.493 186548 DEBUG nova.compute.manager [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:59:06 compute-0 kernel: tape3b10401-9e (unregistering): left promiscuous mode
Nov 22 07:59:06 compute-0 NetworkManager[55036]: <info>  [1763798346.5140] device (tape3b10401-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:59:06 compute-0 ovn_controller[94843]: 2025-11-22T07:59:06Z|00358|binding|INFO|Releasing lport e3b10401-9ef6-4135-93b2-8c19486b866e from this chassis (sb_readonly=0)
Nov 22 07:59:06 compute-0 ovn_controller[94843]: 2025-11-22T07:59:06Z|00359|binding|INFO|Setting lport e3b10401-9ef6-4135-93b2-8c19486b866e down in Southbound
Nov 22 07:59:06 compute-0 ovn_controller[94843]: 2025-11-22T07:59:06Z|00360|binding|INFO|Removing iface tape3b10401-9e ovn-installed in OVS
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.525 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.530 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:c3:78 10.100.0.7'], port_security=['fa:16:3e:dd:c3:78 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '967dbe07-d575-4894-aabd-483767dcd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd2bcbcf3720f46be9fea7fc4685dfecd', 'neutron:revision_number': '8', 'neutron:security_group_ids': '358cb1ee-ee2e-4598-85d3-617cef2413d3 726ed215-2cc1-4cd0-860c-0d95ad883b6b e19cca6e-a70c-4a97-a2b9-f7c71d5b1755', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63d51e5f-a087-4eb1-a0c4-4a9ee7856c37, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e3b10401-9ef6-4135-93b2-8c19486b866e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.531 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e3b10401-9ef6-4135-93b2-8c19486b866e in datapath 9f740f05-d312-4e00-a27d-4d2a45e526b6 unbound from our chassis
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.532 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f740f05-d312-4e00-a27d-4d2a45e526b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.533 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0aadbae7-418c-424d-99dc-e94b85ef65c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.534 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 namespace which is not needed anymore
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.537 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000054.scope: Deactivated successfully.
Nov 22 07:59:06 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000054.scope: Consumed 5.068s CPU time.
Nov 22 07:59:06 compute-0 systemd-machined[152872]: Machine qemu-43-instance-00000054 terminated.
Nov 22 07:59:06 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227455]: [NOTICE]   (227459) : haproxy version is 2.8.14-c23fe91
Nov 22 07:59:06 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227455]: [NOTICE]   (227459) : path to executable is /usr/sbin/haproxy
Nov 22 07:59:06 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227455]: [WARNING]  (227459) : Exiting Master process...
Nov 22 07:59:06 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227455]: [ALERT]    (227459) : Current worker (227461) exited with code 143 (Terminated)
Nov 22 07:59:06 compute-0 neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6[227455]: [WARNING]  (227459) : All workers exited. Exiting... (0)
Nov 22 07:59:06 compute-0 systemd[1]: libpod-dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b.scope: Deactivated successfully.
Nov 22 07:59:06 compute-0 podman[227547]: 2025-11-22 07:59:06.659699467 +0000 UTC m=+0.044556878 container died dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.716 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.718 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b-userdata-shm.mount: Deactivated successfully.
Nov 22 07:59:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-0a50312c1e40fe87fc41d0358c8ef83519ebd4be313b31e02940ead55ef335e9-merged.mount: Deactivated successfully.
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.756 186548 INFO nova.virt.libvirt.driver [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Instance destroyed successfully.
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.756 186548 DEBUG nova.objects.instance [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lazy-loading 'resources' on Instance uuid 967dbe07-d575-4894-aabd-483767dcd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:06 compute-0 podman[227547]: 2025-11-22 07:59:06.761435034 +0000 UTC m=+0.146292445 container cleanup dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:59:06 compute-0 systemd[1]: libpod-conmon-dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b.scope: Deactivated successfully.
Nov 22 07:59:06 compute-0 podman[227596]: 2025-11-22 07:59:06.830561636 +0000 UTC m=+0.046413944 container remove dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.835 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6b61a792-2db0-452d-af92-b996e9c6f4b3]: (4, ('Sat Nov 22 07:59:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 (dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b)\ndc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b\nSat Nov 22 07:59:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 (dc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b)\ndc1a36cffbae8c0900a7b08f3f74ece9465f0da60771b52bdb7a40978de3b15b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.837 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ca70a97a-578c-4f07-ab2b-a64ba702f90c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.838 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f740f05-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.840 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 kernel: tap9f740f05-d0: left promiscuous mode
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.857 186548 DEBUG nova.compute.manager [req-16cf2e75-6c0e-4770-863a-5d7051416ea3 req-d293fe1c-c8a1-4703-aa29-32a556034699 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-unplugged-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.857 186548 DEBUG oslo_concurrency.lockutils [req-16cf2e75-6c0e-4770-863a-5d7051416ea3 req-d293fe1c-c8a1-4703-aa29-32a556034699 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.857 186548 DEBUG oslo_concurrency.lockutils [req-16cf2e75-6c0e-4770-863a-5d7051416ea3 req-d293fe1c-c8a1-4703-aa29-32a556034699 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.857 186548 DEBUG oslo_concurrency.lockutils [req-16cf2e75-6c0e-4770-863a-5d7051416ea3 req-d293fe1c-c8a1-4703-aa29-32a556034699 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.858 186548 DEBUG nova.compute.manager [req-16cf2e75-6c0e-4770-863a-5d7051416ea3 req-d293fe1c-c8a1-4703-aa29-32a556034699 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] No waiting events found dispatching network-vif-unplugged-e3b10401-9ef6-4135-93b2-8c19486b866e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.858 186548 DEBUG nova.compute.manager [req-16cf2e75-6c0e-4770-863a-5d7051416ea3 req-d293fe1c-c8a1-4703-aa29-32a556034699 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-unplugged-e3b10401-9ef6-4135-93b2-8c19486b866e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.858 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.858 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f9e427-35bf-4750-95d6-44771efd79de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.873 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[db8f0937-2763-4b98-a5eb-b3214b9c2179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.876 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7d36304e-5161-4797-92dc-c8c26d762b9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.893 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2d42d7-f154-400d-aa37-13b1bcf80e8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508218, 'reachable_time': 28000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227614, 'error': None, 'target': 'ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.896 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f740f05-d312-4e00-a27d-4d2a45e526b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:59:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:06.896 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[dac65f25-ab74-47a4-955c-125eed67b6a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d9f740f05\x2dd312\x2d4e00\x2da27d\x2d4d2a45e526b6.mount: Deactivated successfully.
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.923 186548 DEBUG nova.virt.libvirt.vif [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-274106407',display_name='tempest-SecurityGroupsTestJSON-server-274106407',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-274106407',id=84,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d2bcbcf3720f46be9fea7fc4685dfecd',ramdisk_id='',reservation_id='r-jtzr0xmm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-2135176549',owner_user_name='tempest-SecurityGroupsTestJSON-2135176549-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:59:02Z,user_data=None,user_id='d77b927940494160bce27934c565fda7',uuid=967dbe07-d575-4894-aabd-483767dcd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.924 186548 DEBUG nova.network.os_vif_util [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converting VIF {"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.924 186548 DEBUG nova.network.os_vif_util [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.925 186548 DEBUG os_vif [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.926 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.927 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3b10401-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.930 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.933 186548 INFO os_vif [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:c3:78,bridge_name='br-int',has_traffic_filtering=True,id=e3b10401-9ef6-4135-93b2-8c19486b866e,network=Network(9f740f05-d312-4e00-a27d-4d2a45e526b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3b10401-9e')
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.933 186548 INFO nova.virt.libvirt.driver [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Deleting instance files /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760_del
Nov 22 07:59:06 compute-0 nova_compute[186544]: 2025-11-22 07:59:06.934 186548 INFO nova.virt.libvirt.driver [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Deletion of /var/lib/nova/instances/967dbe07-d575-4894-aabd-483767dcd760_del complete
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.126 186548 INFO nova.compute.manager [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Took 0.63 seconds to destroy the instance on the hypervisor.
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.126 186548 DEBUG oslo.service.loopingcall [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.126 186548 DEBUG nova.compute.manager [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.127 186548 DEBUG nova.network.neutron [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:59:07 compute-0 ovn_controller[94843]: 2025-11-22T07:59:07Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:27:6a:75 10.100.0.6
Nov 22 07:59:07 compute-0 ovn_controller[94843]: 2025-11-22T07:59:07Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:6a:75 10.100.0.6
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.845 186548 DEBUG nova.network.neutron [req-50d2866f-be2d-425a-ba13-39674ad96eaf req-6cf4193d-3568-4990-a0e6-8753641a9e5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updated VIF entry in instance network info cache for port e3b10401-9ef6-4135-93b2-8c19486b866e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.846 186548 DEBUG nova.network.neutron [req-50d2866f-be2d-425a-ba13-39674ad96eaf req-6cf4193d-3568-4990-a0e6-8753641a9e5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updating instance_info_cache with network_info: [{"id": "e3b10401-9ef6-4135-93b2-8c19486b866e", "address": "fa:16:3e:dd:c3:78", "network": {"id": "9f740f05-d312-4e00-a27d-4d2a45e526b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1530130691-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d2bcbcf3720f46be9fea7fc4685dfecd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3b10401-9e", "ovs_interfaceid": "e3b10401-9ef6-4135-93b2-8c19486b866e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.868 186548 DEBUG oslo_concurrency.lockutils [req-50d2866f-be2d-425a-ba13-39674ad96eaf req-6cf4193d-3568-4990-a0e6-8753641a9e5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-967dbe07-d575-4894-aabd-483767dcd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.877 186548 DEBUG nova.network.neutron [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.892 186548 INFO nova.compute.manager [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Took 0.77 seconds to deallocate network for instance.
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.986 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:07 compute-0 nova_compute[186544]: 2025-11-22 07:59:07.986 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:08 compute-0 nova_compute[186544]: 2025-11-22 07:59:08.104 186548 DEBUG nova.compute.provider_tree [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:59:08 compute-0 nova_compute[186544]: 2025-11-22 07:59:08.119 186548 DEBUG nova.scheduler.client.report [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:59:08 compute-0 nova_compute[186544]: 2025-11-22 07:59:08.140 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:08 compute-0 nova_compute[186544]: 2025-11-22 07:59:08.184 186548 INFO nova.scheduler.client.report [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Deleted allocations for instance 967dbe07-d575-4894-aabd-483767dcd760
Nov 22 07:59:08 compute-0 nova_compute[186544]: 2025-11-22 07:59:08.265 186548 DEBUG oslo_concurrency.lockutils [None req-3c2ec661-bb25-4db5-a292-98acbf8ebca0 d77b927940494160bce27934c565fda7 d2bcbcf3720f46be9fea7fc4685dfecd - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:09 compute-0 nova_compute[186544]: 2025-11-22 07:59:09.246 186548 DEBUG nova.compute.manager [req-15ea851e-194b-4363-91d4-e14466731609 req-7e86ca4e-9a18-4959-a3e6-39f5e4402a51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:09 compute-0 nova_compute[186544]: 2025-11-22 07:59:09.247 186548 DEBUG oslo_concurrency.lockutils [req-15ea851e-194b-4363-91d4-e14466731609 req-7e86ca4e-9a18-4959-a3e6-39f5e4402a51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "967dbe07-d575-4894-aabd-483767dcd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:09 compute-0 nova_compute[186544]: 2025-11-22 07:59:09.248 186548 DEBUG oslo_concurrency.lockutils [req-15ea851e-194b-4363-91d4-e14466731609 req-7e86ca4e-9a18-4959-a3e6-39f5e4402a51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:09 compute-0 nova_compute[186544]: 2025-11-22 07:59:09.248 186548 DEBUG oslo_concurrency.lockutils [req-15ea851e-194b-4363-91d4-e14466731609 req-7e86ca4e-9a18-4959-a3e6-39f5e4402a51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "967dbe07-d575-4894-aabd-483767dcd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:09 compute-0 nova_compute[186544]: 2025-11-22 07:59:09.248 186548 DEBUG nova.compute.manager [req-15ea851e-194b-4363-91d4-e14466731609 req-7e86ca4e-9a18-4959-a3e6-39f5e4402a51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] No waiting events found dispatching network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:09 compute-0 nova_compute[186544]: 2025-11-22 07:59:09.249 186548 WARNING nova.compute.manager [req-15ea851e-194b-4363-91d4-e14466731609 req-7e86ca4e-9a18-4959-a3e6-39f5e4402a51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received unexpected event network-vif-plugged-e3b10401-9ef6-4135-93b2-8c19486b866e for instance with vm_state deleted and task_state None.
Nov 22 07:59:09 compute-0 nova_compute[186544]: 2025-11-22 07:59:09.249 186548 DEBUG nova.compute.manager [req-15ea851e-194b-4363-91d4-e14466731609 req-7e86ca4e-9a18-4959-a3e6-39f5e4402a51 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Received event network-vif-deleted-e3b10401-9ef6-4135-93b2-8c19486b866e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:10 compute-0 nova_compute[186544]: 2025-11-22 07:59:10.589 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.468 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "45b35bfb-92b8-4947-bca0-dca87e484f28" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.469 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.489 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.590 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.590 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.596 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.597 186548 INFO nova.compute.claims [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.757 186548 DEBUG nova.compute.provider_tree [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.770 186548 DEBUG nova.scheduler.client.report [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.791 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.792 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.873 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.874 186548 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.938 186548 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:59:11 compute-0 nova_compute[186544]: 2025-11-22 07:59:11.979 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.107 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.109 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.109 186548 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Creating image(s)
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.109 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "/var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.110 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "/var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.110 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "/var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.122 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.181 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.182 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.182 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.193 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.249 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.250 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.285 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.286 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.287 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.341 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.343 186548 DEBUG nova.virt.disk.api [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Checking if we can resize image /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.343 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.399 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.400 186548 DEBUG nova.virt.disk.api [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Cannot resize image /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.400 186548 DEBUG nova.objects.instance [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'migration_context' on Instance uuid 45b35bfb-92b8-4947-bca0-dca87e484f28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.413 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.414 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Ensure instance console log exists: /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.414 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.414 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.415 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:12 compute-0 nova_compute[186544]: 2025-11-22 07:59:12.436 186548 DEBUG nova.policy [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf1790780fd64791b117114d170d6d90', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 07:59:13 compute-0 nova_compute[186544]: 2025-11-22 07:59:13.736 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 07:59:14 compute-0 nova_compute[186544]: 2025-11-22 07:59:14.475 186548 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Successfully created port: 55f8379a-37c3-44ee-87da-59e3f28fcabb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 07:59:15 compute-0 nova_compute[186544]: 2025-11-22 07:59:15.592 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:15 compute-0 kernel: tap1abe3f57-92 (unregistering): left promiscuous mode
Nov 22 07:59:15 compute-0 NetworkManager[55036]: <info>  [1763798355.9310] device (tap1abe3f57-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:59:15 compute-0 ovn_controller[94843]: 2025-11-22T07:59:15Z|00361|binding|INFO|Releasing lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 from this chassis (sb_readonly=0)
Nov 22 07:59:15 compute-0 ovn_controller[94843]: 2025-11-22T07:59:15Z|00362|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 down in Southbound
Nov 22 07:59:15 compute-0 nova_compute[186544]: 2025-11-22 07:59:15.942 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:15 compute-0 ovn_controller[94843]: 2025-11-22T07:59:15Z|00363|binding|INFO|Removing iface tap1abe3f57-92 ovn-installed in OVS
Nov 22 07:59:15 compute-0 nova_compute[186544]: 2025-11-22 07:59:15.944 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:15.955 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:6a:75 10.100.0.6'], port_security=['fa:16:3e:27:6a:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '823521f6-eb37-4b78-982c-526e463c834f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '4', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1abe3f57-9262-4dc9-a0a8-4465e0cd0702) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:15.956 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis
Nov 22 07:59:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:15.957 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:59:15 compute-0 nova_compute[186544]: 2025-11-22 07:59:15.958 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:15.959 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7c8f5252-313d-4cc9-8020-d6f6ade5a994]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:15.960 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace which is not needed anymore
Nov 22 07:59:15 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000055.scope: Deactivated successfully.
Nov 22 07:59:15 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000055.scope: Consumed 14.522s CPU time.
Nov 22 07:59:16 compute-0 systemd-machined[152872]: Machine qemu-42-instance-00000055 terminated.
Nov 22 07:59:16 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227197]: [NOTICE]   (227202) : haproxy version is 2.8.14-c23fe91
Nov 22 07:59:16 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227197]: [NOTICE]   (227202) : path to executable is /usr/sbin/haproxy
Nov 22 07:59:16 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227197]: [WARNING]  (227202) : Exiting Master process...
Nov 22 07:59:16 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227197]: [WARNING]  (227202) : Exiting Master process...
Nov 22 07:59:16 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227197]: [ALERT]    (227202) : Current worker (227204) exited with code 143 (Terminated)
Nov 22 07:59:16 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227197]: [WARNING]  (227202) : All workers exited. Exiting... (0)
Nov 22 07:59:16 compute-0 systemd[1]: libpod-6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f.scope: Deactivated successfully.
Nov 22 07:59:16 compute-0 podman[227654]: 2025-11-22 07:59:16.150866945 +0000 UTC m=+0.104955367 container died 6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.163 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.168 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f-userdata-shm.mount: Deactivated successfully.
Nov 22 07:59:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-026e5abe48735cd4ac24a9340e911192d2cb399737c74f3129666361318f3e21-merged.mount: Deactivated successfully.
Nov 22 07:59:16 compute-0 podman[227654]: 2025-11-22 07:59:16.192737026 +0000 UTC m=+0.146825448 container cleanup 6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.196 186548 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Successfully updated port: 55f8379a-37c3-44ee-87da-59e3f28fcabb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 07:59:16 compute-0 systemd[1]: libpod-conmon-6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f.scope: Deactivated successfully.
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.212 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "refresh_cache-45b35bfb-92b8-4947-bca0-dca87e484f28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.214 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquired lock "refresh_cache-45b35bfb-92b8-4947-bca0-dca87e484f28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.214 186548 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:59:16 compute-0 podman[227697]: 2025-11-22 07:59:16.272120752 +0000 UTC m=+0.051626483 container remove 6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.278 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2f4487d1-20ce-4f5c-8154-6dbe19753c85]: (4, ('Sat Nov 22 07:59:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f)\n6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f\nSat Nov 22 07:59:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f)\n6d8d43ddb02f5a48f501eaffa850305c588b4a5d598d59cf5cfe1407b045dc4f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.281 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e018a6ee-94d8-4594-8b48-e79ed6ea7451]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.282 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.284 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:16 compute-0 kernel: tap06e0f3a5-90: left promiscuous mode
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.300 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.304 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4f44cee7-a8c4-40e3-803d-142533d72b88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.316 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[45815ddf-1346-4ab4-b0ad-5a46899d5bac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.318 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[21ded470-5dcd-4c3b-a02b-e1ba453e7b00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.333 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c167362-495c-4a65-a61e-2e74484d41d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 507433, 'reachable_time': 19820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227718, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.335 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:59:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:16.335 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb490d6-889c-41dc-879d-9654912c9ef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d06e0f3a5\x2d911a\x2d4244\x2dbd9c\x2d8cb4fa4c4794.mount: Deactivated successfully.
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.454 186548 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.749 186548 INFO nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance shutdown successfully after 13 seconds.
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.754 186548 INFO nova.virt.libvirt.driver [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance destroyed successfully.
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.755 186548 DEBUG nova.objects.instance [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'numa_topology' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.766 186548 INFO nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Attempting a stable device rescue
Nov 22 07:59:16 compute-0 nova_compute[186544]: 2025-11-22 07:59:16.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.013 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.017 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.018 186548 INFO nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Creating image(s)
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.019 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.019 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.020 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.020 186548 DEBUG nova.objects.instance [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.030 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "a8f717718e805875f8335c6cd87b43bdb39ec5d3" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.031 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "a8f717718e805875f8335c6cd87b43bdb39ec5d3" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.426 186548 DEBUG nova.compute.manager [req-e54d4800-f7da-4dfe-9dfa-58c26026bc4e req-0961b72d-aaf1-4e8f-95cc-f495313386c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.428 186548 DEBUG oslo_concurrency.lockutils [req-e54d4800-f7da-4dfe-9dfa-58c26026bc4e req-0961b72d-aaf1-4e8f-95cc-f495313386c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.429 186548 DEBUG oslo_concurrency.lockutils [req-e54d4800-f7da-4dfe-9dfa-58c26026bc4e req-0961b72d-aaf1-4e8f-95cc-f495313386c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.429 186548 DEBUG oslo_concurrency.lockutils [req-e54d4800-f7da-4dfe-9dfa-58c26026bc4e req-0961b72d-aaf1-4e8f-95cc-f495313386c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.429 186548 DEBUG nova.compute.manager [req-e54d4800-f7da-4dfe-9dfa-58c26026bc4e req-0961b72d-aaf1-4e8f-95cc-f495313386c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:17 compute-0 nova_compute[186544]: 2025-11-22 07:59:17.429 186548 WARNING nova.compute.manager [req-e54d4800-f7da-4dfe-9dfa-58c26026bc4e req-0961b72d-aaf1-4e8f-95cc-f495313386c8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state active and task_state rescuing.
Nov 22 07:59:17 compute-0 podman[227720]: 2025-11-22 07:59:17.433702867 +0000 UTC m=+0.077350586 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_controller)
Nov 22 07:59:17 compute-0 podman[227719]: 2025-11-22 07:59:17.440842673 +0000 UTC m=+0.084240026 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.011 186548 DEBUG nova.compute.manager [req-184d21d0-c761-4c6b-a327-eb059f66e3db req-f582b508-a372-47eb-9aba-eff834f75c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received event network-changed-55f8379a-37c3-44ee-87da-59e3f28fcabb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.012 186548 DEBUG nova.compute.manager [req-184d21d0-c761-4c6b-a327-eb059f66e3db req-f582b508-a372-47eb-9aba-eff834f75c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Refreshing instance network info cache due to event network-changed-55f8379a-37c3-44ee-87da-59e3f28fcabb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.012 186548 DEBUG oslo_concurrency.lockutils [req-184d21d0-c761-4c6b-a327-eb059f66e3db req-f582b508-a372-47eb-9aba-eff834f75c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-45b35bfb-92b8-4947-bca0-dca87e484f28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.303 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.365 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3.part --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.367 186548 DEBUG nova.virt.images [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] 74516a1d-b693-42b1-9090-70e086e7142f was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.370 186548 DEBUG nova.privsep.utils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.370 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3.part /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.546 186548 DEBUG nova.network.neutron [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Updating instance_info_cache with network_info: [{"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.590 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Releasing lock "refresh_cache-45b35bfb-92b8-4947-bca0-dca87e484f28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.590 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Instance network_info: |[{"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.591 186548 DEBUG oslo_concurrency.lockutils [req-184d21d0-c761-4c6b-a327-eb059f66e3db req-f582b508-a372-47eb-9aba-eff834f75c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-45b35bfb-92b8-4947-bca0-dca87e484f28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.591 186548 DEBUG nova.network.neutron [req-184d21d0-c761-4c6b-a327-eb059f66e3db req-f582b508-a372-47eb-9aba-eff834f75c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Refreshing network info cache for port 55f8379a-37c3-44ee-87da-59e3f28fcabb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.594 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Start _get_guest_xml network_info=[{"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.598 186548 WARNING nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.604 186548 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.605 186548 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.614 186548 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.614 186548 DEBUG nova.virt.libvirt.host [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.616 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.616 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.617 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.617 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.617 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.617 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.617 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.618 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.618 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.618 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.618 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.619 186548 DEBUG nova.virt.hardware [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.623 186548 DEBUG nova.virt.libvirt.vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:59:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-3',id=88,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_user_name='tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:12Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=45b35bfb-92b8-4947-bca0-dca87e484f28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.623 186548 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.624 186548 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:02:bc,bridge_name='br-int',has_traffic_filtering=True,id=55f8379a-37c3-44ee-87da-59e3f28fcabb,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55f8379a-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.625 186548 DEBUG nova.objects.instance [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'pci_devices' on Instance uuid 45b35bfb-92b8-4947-bca0-dca87e484f28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.638 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <uuid>45b35bfb-92b8-4947-bca0-dca87e484f28</uuid>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <name>instance-00000058</name>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <nova:name>tempest-ListServersNegativeTestJSON-server-1317504943-3</nova:name>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:59:18</nova:creationTime>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:59:18 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:59:18 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:59:18 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:59:18 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:59:18 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:59:18 compute-0 nova_compute[186544]:         <nova:user uuid="cf1790780fd64791b117114d170d6d90">tempest-ListServersNegativeTestJSON-1715955177-project-member</nova:user>
Nov 22 07:59:18 compute-0 nova_compute[186544]:         <nova:project uuid="16ccb24424c54ae1a1b0d7eef6f7d690">tempest-ListServersNegativeTestJSON-1715955177</nova:project>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:59:18 compute-0 nova_compute[186544]:         <nova:port uuid="55f8379a-37c3-44ee-87da-59e3f28fcabb">
Nov 22 07:59:18 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <system>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <entry name="serial">45b35bfb-92b8-4947-bca0-dca87e484f28</entry>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <entry name="uuid">45b35bfb-92b8-4947-bca0-dca87e484f28</entry>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </system>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <os>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   </os>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <features>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   </features>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk.config"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:5d:02:bc"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <target dev="tap55f8379a-37"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/console.log" append="off"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <video>
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </video>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:59:18 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:59:18 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:59:18 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:59:18 compute-0 nova_compute[186544]: </domain>
Nov 22 07:59:18 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.640 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Preparing to wait for external event network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.640 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.640 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.641 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.641 186548 DEBUG nova.virt.libvirt.vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T07:59:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-3',id=88,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_user_name='tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:12Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=45b35bfb-92b8-4947-bca0-dca87e484f28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.642 186548 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.643 186548 DEBUG nova.network.os_vif_util [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:02:bc,bridge_name='br-int',has_traffic_filtering=True,id=55f8379a-37c3-44ee-87da-59e3f28fcabb,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55f8379a-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.643 186548 DEBUG os_vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:02:bc,bridge_name='br-int',has_traffic_filtering=True,id=55f8379a-37c3-44ee-87da-59e3f28fcabb,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55f8379a-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.644 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.644 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.645 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.648 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.648 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap55f8379a-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.649 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap55f8379a-37, col_values=(('external_ids', {'iface-id': '55f8379a-37c3-44ee-87da-59e3f28fcabb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:02:bc', 'vm-uuid': '45b35bfb-92b8-4947-bca0-dca87e484f28'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.651 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:18 compute-0 NetworkManager[55036]: <info>  [1763798358.6531] manager: (tap55f8379a-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.654 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.661 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.662 186548 INFO os_vif [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:02:bc,bridge_name='br-int',has_traffic_filtering=True,id=55f8379a-37c3-44ee-87da-59e3f28fcabb,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55f8379a-37')
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.677 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3.part /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3.converted" returned: 0 in 0.307s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.682 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.749 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.750 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.750 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] No VIF found with MAC fa:16:3e:5d:02:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.751 186548 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Using config drive
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.754 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3.converted --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.755 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "a8f717718e805875f8335c6cd87b43bdb39ec5d3" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.781 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "a8f717718e805875f8335c6cd87b43bdb39ec5d3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.782 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "a8f717718e805875f8335c6cd87b43bdb39ec5d3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.795 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.856 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.857 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3,backing_fmt=raw /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.909 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3,backing_fmt=raw /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.rescue" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.911 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "a8f717718e805875f8335c6cd87b43bdb39ec5d3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.911 186548 DEBUG nova.objects.instance [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'migration_context' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.930 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.933 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Start _get_guest_xml network_info=[{"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:27:6a:75"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '74516a1d-b693-42b1-9090-70e086e7142f', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.933 186548 DEBUG nova.objects.instance [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'resources' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.946 186548 WARNING nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.952 186548 DEBUG nova.virt.libvirt.host [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.953 186548 DEBUG nova.virt.libvirt.host [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.956 186548 DEBUG nova.virt.libvirt.host [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.956 186548 DEBUG nova.virt.libvirt.host [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.957 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.957 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.958 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.958 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.958 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.958 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.958 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.959 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.959 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.959 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.959 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.959 186548 DEBUG nova.virt.hardware [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.960 186548 DEBUG nova.objects.instance [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:18 compute-0 nova_compute[186544]: 2025-11-22 07:59:18.977 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.039 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.040 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.041 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.042 186548 DEBUG oslo_concurrency.lockutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.043 186548 DEBUG nova.virt.libvirt.vif [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-517382055',display_name='tempest-ServerStableDeviceRescueTest-server-517382055',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-517382055',id=85,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-iqw87cv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T07:59:01Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=823521f6-eb37-4b78-982c-526e463c834f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:27:6a:75"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.044 186548 DEBUG nova.network.os_vif_util [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "vif_mac": "fa:16:3e:27:6a:75"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.045 186548 DEBUG nova.network.os_vif_util [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:6a:75,bridge_name='br-int',has_traffic_filtering=True,id=1abe3f57-9262-4dc9-a0a8-4465e0cd0702,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1abe3f57-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.046 186548 DEBUG nova.objects.instance [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'pci_devices' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.062 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <uuid>823521f6-eb37-4b78-982c-526e463c834f</uuid>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <name>instance-00000055</name>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-517382055</nova:name>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:59:18</nova:creationTime>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:59:19 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:59:19 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:59:19 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:59:19 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:59:19 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:59:19 compute-0 nova_compute[186544]:         <nova:user uuid="0d84421d986b40f481c0caef764443e2">tempest-ServerStableDeviceRescueTest-455223381-project-member</nova:user>
Nov 22 07:59:19 compute-0 nova_compute[186544]:         <nova:project uuid="fd33c7e49baa4c7f9575824b348a0f23">tempest-ServerStableDeviceRescueTest-455223381</nova:project>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 07:59:19 compute-0 nova_compute[186544]:         <nova:port uuid="1abe3f57-9262-4dc9-a0a8-4465e0cd0702">
Nov 22 07:59:19 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <system>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <entry name="serial">823521f6-eb37-4b78-982c-526e463c834f</entry>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <entry name="uuid">823521f6-eb37-4b78-982c-526e463c834f</entry>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </system>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <os>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   </os>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <features>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   </features>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.rescue"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <target dev="vdb" bus="virtio"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <boot order="1"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:27:6a:75"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <target dev="tap1abe3f57-92"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </interface>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/console.log" append="off"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <video>
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </video>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:59:19 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:59:19 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:59:19 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:59:19 compute-0 nova_compute[186544]: </domain>
Nov 22 07:59:19 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.070 186548 INFO nova.virt.libvirt.driver [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance destroyed successfully.
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.126 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.128 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.128 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.128 186548 DEBUG nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] No VIF found with MAC fa:16:3e:27:6a:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.129 186548 INFO nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Using config drive
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.143 186548 DEBUG nova.objects.instance [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:19 compute-0 nova_compute[186544]: 2025-11-22 07:59:19.167 186548 DEBUG nova.objects.instance [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'keypairs' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.465 186548 INFO nova.virt.libvirt.driver [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Creating config drive at /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config.rescue
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.470 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmx3ktvs2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.542 186548 INFO nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Creating config drive at /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk.config
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.548 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplyk9_jek execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.594 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.597 186548 DEBUG oslo_concurrency.processutils [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmx3ktvs2" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.639 186548 DEBUG nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.639 186548 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.639 186548 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.640 186548 DEBUG oslo_concurrency.lockutils [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.640 186548 DEBUG nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.640 186548 WARNING nova.compute.manager [req-862893c4-e31b-48b1-b8af-c608ca1e8312 req-db4c68a3-3f74-473d-9d6d-ce753b6403ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state active and task_state rescuing.
Nov 22 07:59:20 compute-0 kernel: tap1abe3f57-92: entered promiscuous mode
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.6647] manager: (tap1abe3f57-92): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00364|binding|INFO|Claiming lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for this chassis.
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00365|binding|INFO|1abe3f57-9262-4dc9-a0a8-4465e0cd0702: Claiming fa:16:3e:27:6a:75 10.100.0.6
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.666 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.675 186548 DEBUG oslo_concurrency.processutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplyk9_jek" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.677 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:6a:75 10.100.0.6'], port_security=['fa:16:3e:27:6a:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '823521f6-eb37-4b78-982c-526e463c834f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '5', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1abe3f57-9262-4dc9-a0a8-4465e0cd0702) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.678 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 bound to our chassis
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.680 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00366|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 ovn-installed in OVS
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00367|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 up in Southbound
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.684 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.689 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.690 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae9a6f9-85f8-41f5-9cd1-1a7daba5fc91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.691 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap06e0f3a5-91 in ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.692 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap06e0f3a5-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.692 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fcfb9c7c-1b87-4486-ae92-7b10152a8e55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.694 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[20a3ea57-3202-420e-a46e-c0f9a6ec8342]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 systemd-machined[152872]: New machine qemu-44-instance-00000055.
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.704 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[76f1f26c-22cd-457c-b674-378ec468c3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 systemd[1]: Started Virtual Machine qemu-44-instance-00000055.
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.726 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7624f979-1125-4698-8c4f-05a6ca398ed0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 systemd-udevd[227826]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.7403] device (tap1abe3f57-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.7413] device (tap1abe3f57-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.7436] manager: (tap55f8379a-37): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Nov 22 07:59:20 compute-0 kernel: tap55f8379a-37: entered promiscuous mode
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.745 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00368|binding|INFO|Claiming lport 55f8379a-37c3-44ee-87da-59e3f28fcabb for this chassis.
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00369|binding|INFO|55f8379a-37c3-44ee-87da-59e3f28fcabb: Claiming fa:16:3e:5d:02:bc 10.100.0.5
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.749 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.7535] device (tap55f8379a-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.7545] device (tap55f8379a-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.758 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbf05ec-ceba-4870-bc8b-1c23f1d00eb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.762 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:02:bc 10.100.0.5'], port_security=['fa:16:3e:5d:02:bc 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '45b35bfb-92b8-4947-bca0-dca87e484f28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4820a7f-a658-410a-b393-c754d89b7982', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ac2bec8-4c70-4af1-8a46-6da94edec63d, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=55f8379a-37c3-44ee-87da-59e3f28fcabb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.7801] manager: (tap06e0f3a5-90): new Veth device (/org/freedesktop/NetworkManager/Devices/173)
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.779 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a823dba4-535f-49b1-85e9-948c8acf7c9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.807 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 systemd-machined[152872]: New machine qemu-45-instance-00000058.
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.817 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[245cb7e6-0661-4716-b9ba-0b7882adca1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.820 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.820 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf9ead7-bae7-40c9-85a6-e577e970e23d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.8419] device (tap06e0f3a5-90): carrier: link connected
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.847 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[3d12501e-81c6-4322-986a-51909d8bd833]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 systemd[1]: Started Virtual Machine qemu-45-instance-00000058.
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00370|binding|INFO|Setting lport 55f8379a-37c3-44ee-87da-59e3f28fcabb ovn-installed in OVS
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00371|binding|INFO|Setting lport 55f8379a-37c3-44ee-87da-59e3f28fcabb up in Southbound
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.863 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.864 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9c29ec79-9e6d-49d7-94c3-6ecb614413e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510147, 'reachable_time': 27558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227864, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.882 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e182e69d-3d65-491b-8404-634abb6e604e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:b7bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510147, 'tstamp': 510147}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227865, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.901 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[16823264-349a-437d-b913-ec1afd7ebde0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 114], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510147, 'reachable_time': 27558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227870, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.932 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[52287d09-0502-4206-a0e3-fa671e522e43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.984 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6fd584ca-1b68-4e36-aaef-e9334f8e697d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.985 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.985 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.986 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.987 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:20 compute-0 NetworkManager[55036]: <info>  [1763798360.9883] manager: (tap06e0f3a5-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Nov 22 07:59:20 compute-0 kernel: tap06e0f3a5-90: entered promiscuous mode
Nov 22 07:59:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:20.991 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:20 compute-0 ovn_controller[94843]: 2025-11-22T07:59:20Z|00372|binding|INFO|Releasing lport 465da2c0-9a1c-41a9-be9a-d10bcbd7a813 from this chassis (sb_readonly=0)
Nov 22 07:59:20 compute-0 nova_compute[186544]: 2025-11-22 07:59:20.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.005 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.006 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[674eb0f4-c83b-4c0f-8a2e-fe7cc4878921]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.007 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.008 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'env', 'PROCESS_TAG=haproxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.104 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 823521f6-eb37-4b78-982c-526e463c834f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.105 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798361.1044898, 823521f6-eb37-4b78-982c-526e463c834f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.106 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] VM Resumed (Lifecycle Event)
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.117 186548 DEBUG nova.compute.manager [None req-10503764-4dc9-4e06-8d1f-a5d7e5e5c0cc 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.138 186548 DEBUG nova.compute.manager [req-40b41989-352e-4a03-b505-78bb601a5263 req-94909aed-5eb8-44ca-806e-eb2876eae8d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received event network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.139 186548 DEBUG oslo_concurrency.lockutils [req-40b41989-352e-4a03-b505-78bb601a5263 req-94909aed-5eb8-44ca-806e-eb2876eae8d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.139 186548 DEBUG oslo_concurrency.lockutils [req-40b41989-352e-4a03-b505-78bb601a5263 req-94909aed-5eb8-44ca-806e-eb2876eae8d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.139 186548 DEBUG oslo_concurrency.lockutils [req-40b41989-352e-4a03-b505-78bb601a5263 req-94909aed-5eb8-44ca-806e-eb2876eae8d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.140 186548 DEBUG nova.compute.manager [req-40b41989-352e-4a03-b505-78bb601a5263 req-94909aed-5eb8-44ca-806e-eb2876eae8d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Processing event network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.147 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.151 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.176 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] During sync_power_state the instance has a pending task (rescuing). Skip.
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.176 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798361.1054325, 823521f6-eb37-4b78-982c-526e463c834f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.177 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] VM Started (Lifecycle Event)
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.195 186548 DEBUG nova.network.neutron [req-184d21d0-c761-4c6b-a327-eb059f66e3db req-f582b508-a372-47eb-9aba-eff834f75c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Updated VIF entry in instance network info cache for port 55f8379a-37c3-44ee-87da-59e3f28fcabb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.196 186548 DEBUG nova.network.neutron [req-184d21d0-c761-4c6b-a327-eb059f66e3db req-f582b508-a372-47eb-9aba-eff834f75c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Updating instance_info_cache with network_info: [{"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.201 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.207 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.223 186548 DEBUG oslo_concurrency.lockutils [req-184d21d0-c761-4c6b-a327-eb059f66e3db req-f582b508-a372-47eb-9aba-eff834f75c0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-45b35bfb-92b8-4947-bca0-dca87e484f28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.258 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.260 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.375 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798361.3754222, 45b35bfb-92b8-4947-bca0-dca87e484f28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.377 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] VM Started (Lifecycle Event)
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.380 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.384 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.388 186548 INFO nova.virt.libvirt.driver [-] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Instance spawned successfully.
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.388 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.403 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.407 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.410 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.411 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.411 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.412 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.412 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.413 186548 DEBUG nova.virt.libvirt.driver [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:21 compute-0 podman[227917]: 2025-11-22 07:59:21.443166093 +0000 UTC m=+0.089494116 container create 5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.445 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.446 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798361.375562, 45b35bfb-92b8-4947-bca0-dca87e484f28 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.446 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] VM Paused (Lifecycle Event)
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.469 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.472 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798361.384684, 45b35bfb-92b8-4947-bca0-dca87e484f28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.473 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] VM Resumed (Lifecycle Event)
Nov 22 07:59:21 compute-0 podman[227917]: 2025-11-22 07:59:21.386064816 +0000 UTC m=+0.032392869 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:59:21 compute-0 systemd[1]: Started libpod-conmon-5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5.scope.
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.500 186548 INFO nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Took 9.39 seconds to spawn the instance on the hypervisor.
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.501 186548 DEBUG nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.501 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.507 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:21 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:59:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5645f6418fda7eb67d6c9ccbe129e879eb71ff4621e1506fd3d863238b3fc592/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:59:21 compute-0 podman[227929]: 2025-11-22 07:59:21.541906595 +0000 UTC m=+0.058950483 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.544 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:59:21 compute-0 podman[227917]: 2025-11-22 07:59:21.585867508 +0000 UTC m=+0.232195551 container init 5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.590 186548 INFO nova.compute.manager [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Took 10.03 seconds to build instance.
Nov 22 07:59:21 compute-0 podman[227917]: 2025-11-22 07:59:21.595910095 +0000 UTC m=+0.242238118 container start 5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.604 186548 DEBUG oslo_concurrency.lockutils [None req-a112cd18-e285-4d32-8ea2-7f345558da0b cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:21 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227942]: [NOTICE]   (227957) : New worker (227959) forked
Nov 22 07:59:21 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227942]: [NOTICE]   (227957) : Loading success.
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.664 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 55f8379a-37c3-44ee-87da-59e3f28fcabb in datapath d6148823-d007-4a7e-be44-4329f8ecc6e5 unbound from our chassis
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.666 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6148823-d007-4a7e-be44-4329f8ecc6e5
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.678 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[310162dd-c884-4a91-85aa-a10d16cc0b8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.679 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6148823-d1 in ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.681 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6148823-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.681 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4cf8ffb9-4fbe-4ef6-b492-cf3671ab8a4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.682 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d0dab562-dc8d-41eb-a310-f952ca1fa1c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.692 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[e66a4307-1439-4abf-b1cd-fcc1d7ef692d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.717 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1a84ec4b-276f-4bd0-9b51-4228cf514f70]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.744 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[824d1355-e316-4ead-951b-17342bc3dfc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.754 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798346.7537742, 967dbe07-d575-4894-aabd-483767dcd760 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.754 186548 INFO nova.compute.manager [-] [instance: 967dbe07-d575-4894-aabd-483767dcd760] VM Stopped (Lifecycle Event)
Nov 22 07:59:21 compute-0 NetworkManager[55036]: <info>  [1763798361.7581] manager: (tapd6148823-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/175)
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.759 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[65f2d1ef-fe69-4b4a-94f2-fdf8d53b4e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 nova_compute[186544]: 2025-11-22 07:59:21.772 186548 DEBUG nova.compute.manager [None req-3b21ecba-edf0-459c-9ab3-a0f8e2a37ea3 - - - - - -] [instance: 967dbe07-d575-4894-aabd-483767dcd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.803 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[444af707-8b63-442c-b94e-2cdf80937874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.806 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b6340685-0e68-49b1-b84a-8cb518fa2331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 NetworkManager[55036]: <info>  [1763798361.8288] device (tapd6148823-d0): carrier: link connected
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.836 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[40f45b2c-1836-462d-86c6-303790e28459]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.857 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa0ba06-9ca1-47e7-96c3-1ff38f727146]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6148823-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:f2:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510246, 'reachable_time': 22549, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227978, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.877 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[574c317b-1925-44b0-ae51-901d76c758c8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:f2ef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510246, 'tstamp': 510246}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227979, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.898 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f542cdcd-9458-45d7-aefe-be6385a960b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6148823-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:f2:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510246, 'reachable_time': 22549, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227980, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:21.941 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd05e31-258e-4c92-a346-593cf141dce9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.003 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2450f8-016f-403d-847b-a79b3a5ab8bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.005 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6148823-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.005 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.005 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6148823-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.008 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:22 compute-0 NetworkManager[55036]: <info>  [1763798362.0087] manager: (tapd6148823-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Nov 22 07:59:22 compute-0 kernel: tapd6148823-d0: entered promiscuous mode
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.010 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.011 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6148823-d0, col_values=(('external_ids', {'iface-id': '2f86d506-522f-4def-915e-a14693535092'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.012 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:22 compute-0 ovn_controller[94843]: 2025-11-22T07:59:22Z|00373|binding|INFO|Releasing lport 2f86d506-522f-4def-915e-a14693535092 from this chassis (sb_readonly=0)
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.023 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.024 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.025 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[028b8752-a159-48e6-bf50-34e719a54996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.026 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-d6148823-d007-4a7e-be44-4329f8ecc6e5
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/d6148823-d007-4a7e-be44-4329f8ecc6e5.pid.haproxy
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID d6148823-d007-4a7e-be44-4329f8ecc6e5
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.026 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'env', 'PROCESS_TAG=haproxy-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6148823-d007-4a7e-be44-4329f8ecc6e5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:59:22 compute-0 podman[228010]: 2025-11-22 07:59:22.39248642 +0000 UTC m=+0.024563597 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:59:22 compute-0 podman[228010]: 2025-11-22 07:59:22.532853687 +0000 UTC m=+0.164930834 container create 1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 07:59:22 compute-0 systemd[1]: Started libpod-conmon-1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30.scope.
Nov 22 07:59:22 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:59:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a033426b7292f6409be502fc3049a4074e4837a31cece28bc19b1c43dc8fc2f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:59:22 compute-0 podman[228010]: 2025-11-22 07:59:22.62670549 +0000 UTC m=+0.258782657 container init 1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 07:59:22 compute-0 podman[228010]: 2025-11-22 07:59:22.633305802 +0000 UTC m=+0.265382949 container start 1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:59:22 compute-0 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[228025]: [NOTICE]   (228029) : New worker (228031) forked
Nov 22 07:59:22 compute-0 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[228025]: [NOTICE]   (228029) : Loading success.
Nov 22 07:59:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:22.697 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.731 186548 DEBUG nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.731 186548 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.731 186548 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.731 186548 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.732 186548 DEBUG nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.732 186548 WARNING nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state rescued and task_state unrescuing.
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.732 186548 DEBUG nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.732 186548 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.732 186548 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.733 186548 DEBUG oslo_concurrency.lockutils [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.733 186548 DEBUG nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.733 186548 WARNING nova.compute.manager [req-c25e3457-8f8f-4756-8473-4c90034eb7d4 req-b2179934-3f85-41f1-9d23-84c439e4ede0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state rescued and task_state unrescuing.
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.761 186548 INFO nova.compute.manager [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Unrescuing
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.761 186548 DEBUG oslo_concurrency.lockutils [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.762 186548 DEBUG oslo_concurrency.lockutils [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquired lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 07:59:22 compute-0 nova_compute[186544]: 2025-11-22 07:59:22.762 186548 DEBUG nova.network.neutron [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 07:59:23 compute-0 podman[228040]: 2025-11-22 07:59:23.407360532 +0000 UTC m=+0.059370574 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:59:23 compute-0 nova_compute[186544]: 2025-11-22 07:59:23.652 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:23 compute-0 nova_compute[186544]: 2025-11-22 07:59:23.662 186548 DEBUG nova.compute.manager [req-762445c5-c735-47bf-acd9-249b1f47f77c req-9c414a21-3ba9-4cd3-9cfa-d1e55e02e62b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received event network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:23 compute-0 nova_compute[186544]: 2025-11-22 07:59:23.662 186548 DEBUG oslo_concurrency.lockutils [req-762445c5-c735-47bf-acd9-249b1f47f77c req-9c414a21-3ba9-4cd3-9cfa-d1e55e02e62b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:23 compute-0 nova_compute[186544]: 2025-11-22 07:59:23.662 186548 DEBUG oslo_concurrency.lockutils [req-762445c5-c735-47bf-acd9-249b1f47f77c req-9c414a21-3ba9-4cd3-9cfa-d1e55e02e62b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:23 compute-0 nova_compute[186544]: 2025-11-22 07:59:23.662 186548 DEBUG oslo_concurrency.lockutils [req-762445c5-c735-47bf-acd9-249b1f47f77c req-9c414a21-3ba9-4cd3-9cfa-d1e55e02e62b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:23 compute-0 nova_compute[186544]: 2025-11-22 07:59:23.663 186548 DEBUG nova.compute.manager [req-762445c5-c735-47bf-acd9-249b1f47f77c req-9c414a21-3ba9-4cd3-9cfa-d1e55e02e62b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] No waiting events found dispatching network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:23 compute-0 nova_compute[186544]: 2025-11-22 07:59:23.663 186548 WARNING nova.compute.manager [req-762445c5-c735-47bf-acd9-249b1f47f77c req-9c414a21-3ba9-4cd3-9cfa-d1e55e02e62b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received unexpected event network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb for instance with vm_state active and task_state None.
Nov 22 07:59:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:24.699 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:24 compute-0 nova_compute[186544]: 2025-11-22 07:59:24.903 186548 DEBUG nova.network.neutron [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Updating instance_info_cache with network_info: [{"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:24 compute-0 nova_compute[186544]: 2025-11-22 07:59:24.921 186548 DEBUG oslo_concurrency.lockutils [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Releasing lock "refresh_cache-823521f6-eb37-4b78-982c-526e463c834f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 07:59:24 compute-0 nova_compute[186544]: 2025-11-22 07:59:24.922 186548 DEBUG nova.objects.instance [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'flavor' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:24 compute-0 kernel: tap1abe3f57-92 (unregistering): left promiscuous mode
Nov 22 07:59:24 compute-0 NetworkManager[55036]: <info>  [1763798364.9664] device (tap1abe3f57-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:59:24 compute-0 ovn_controller[94843]: 2025-11-22T07:59:24Z|00374|binding|INFO|Releasing lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 from this chassis (sb_readonly=0)
Nov 22 07:59:24 compute-0 nova_compute[186544]: 2025-11-22 07:59:24.981 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:24 compute-0 ovn_controller[94843]: 2025-11-22T07:59:24Z|00375|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 down in Southbound
Nov 22 07:59:24 compute-0 ovn_controller[94843]: 2025-11-22T07:59:24Z|00376|binding|INFO|Removing iface tap1abe3f57-92 ovn-installed in OVS
Nov 22 07:59:24 compute-0 nova_compute[186544]: 2025-11-22 07:59:24.984 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:24.995 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:6a:75 10.100.0.6'], port_security=['fa:16:3e:27:6a:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '823521f6-eb37-4b78-982c-526e463c834f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '6', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1abe3f57-9262-4dc9-a0a8-4465e0cd0702) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:24.996 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis
Nov 22 07:59:24 compute-0 nova_compute[186544]: 2025-11-22 07:59:24.996 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:24.998 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:59:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:24.999 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[57c19c2e-1197-49b6-97ff-7fba3ee50393]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:24.999 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace which is not needed anymore
Nov 22 07:59:25 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000055.scope: Deactivated successfully.
Nov 22 07:59:25 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000055.scope: Consumed 4.108s CPU time.
Nov 22 07:59:25 compute-0 systemd-machined[152872]: Machine qemu-44-instance-00000055 terminated.
Nov 22 07:59:25 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227942]: [NOTICE]   (227957) : haproxy version is 2.8.14-c23fe91
Nov 22 07:59:25 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227942]: [NOTICE]   (227957) : path to executable is /usr/sbin/haproxy
Nov 22 07:59:25 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227942]: [WARNING]  (227957) : Exiting Master process...
Nov 22 07:59:25 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227942]: [ALERT]    (227957) : Current worker (227959) exited with code 143 (Terminated)
Nov 22 07:59:25 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[227942]: [WARNING]  (227957) : All workers exited. Exiting... (0)
Nov 22 07:59:25 compute-0 systemd[1]: libpod-5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5.scope: Deactivated successfully.
Nov 22 07:59:25 compute-0 podman[228081]: 2025-11-22 07:59:25.154238356 +0000 UTC m=+0.057768294 container died 5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.234 186548 INFO nova.virt.libvirt.driver [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance destroyed successfully.
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.235 186548 DEBUG nova.objects.instance [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'numa_topology' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5-userdata-shm.mount: Deactivated successfully.
Nov 22 07:59:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-5645f6418fda7eb67d6c9ccbe129e879eb71ff4621e1506fd3d863238b3fc592-merged.mount: Deactivated successfully.
Nov 22 07:59:25 compute-0 systemd-udevd[228061]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 07:59:25 compute-0 kernel: tap1abe3f57-92: entered promiscuous mode
Nov 22 07:59:25 compute-0 NetworkManager[55036]: <info>  [1763798365.3340] manager: (tap1abe3f57-92): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Nov 22 07:59:25 compute-0 ovn_controller[94843]: 2025-11-22T07:59:25Z|00377|binding|INFO|Claiming lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for this chassis.
Nov 22 07:59:25 compute-0 ovn_controller[94843]: 2025-11-22T07:59:25Z|00378|binding|INFO|1abe3f57-9262-4dc9-a0a8-4465e0cd0702: Claiming fa:16:3e:27:6a:75 10.100.0.6
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.337 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 NetworkManager[55036]: <info>  [1763798365.3434] device (tap1abe3f57-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 07:59:25 compute-0 NetworkManager[55036]: <info>  [1763798365.3443] device (tap1abe3f57-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.350 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.349 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:6a:75 10.100.0.6'], port_security=['fa:16:3e:27:6a:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '823521f6-eb37-4b78-982c-526e463c834f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '6', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1abe3f57-9262-4dc9-a0a8-4465e0cd0702) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:25 compute-0 ovn_controller[94843]: 2025-11-22T07:59:25Z|00379|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 ovn-installed in OVS
Nov 22 07:59:25 compute-0 ovn_controller[94843]: 2025-11-22T07:59:25Z|00380|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 up in Southbound
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.353 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 podman[228081]: 2025-11-22 07:59:25.358470308 +0000 UTC m=+0.262000226 container cleanup 5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 07:59:25 compute-0 systemd[1]: libpod-conmon-5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5.scope: Deactivated successfully.
Nov 22 07:59:25 compute-0 systemd-machined[152872]: New machine qemu-46-instance-00000055.
Nov 22 07:59:25 compute-0 systemd[1]: Started Virtual Machine qemu-46-instance-00000055.
Nov 22 07:59:25 compute-0 podman[228145]: 2025-11-22 07:59:25.46329964 +0000 UTC m=+0.075312096 container remove 5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.469 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d96bdc6e-df3f-443a-a78f-b993902c6072]: (4, ('Sat Nov 22 07:59:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5)\n5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5\nSat Nov 22 07:59:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5)\n5afc48cd0ad8ca48f51339ff9597eca5055fb9be144db15ef2e368bb067515e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.471 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0e9a4465-2a3e-4e51-a145-6de229fd86e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.472 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 kernel: tap06e0f3a5-90: left promiscuous mode
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.476 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.479 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2d74ef19-bb95-4e3c-9141-789311214790]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.489 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.503 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c6bfb4-85ce-4cd0-8ce2-51d2f6304509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.505 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4224f31d-0ca8-4a02-a73e-ce9ee1a74b4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.525 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6fb9d54a-2684-4652-809c-4b3f7d7cda8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510139, 'reachable_time': 37289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228166, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 systemd[1]: run-netns-ovnmeta\x2d06e0f3a5\x2d911a\x2d4244\x2dbd9c\x2d8cb4fa4c4794.mount: Deactivated successfully.
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.533 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.533 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[b7315af5-3a7b-4744-979d-e330a9c0d304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.535 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.538 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.556 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6f50f84b-ae3a-4e90-9cb3-77bb70f07ee8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.558 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap06e0f3a5-91 in ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.560 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap06e0f3a5-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.560 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6407cc2a-6a47-4ff7-ba57-d251ff6f5352]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.561 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3af0e11f-75bf-4900-a4f8-1eb7ec921eee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.568 186548 DEBUG nova.compute.manager [req-d01ca65f-1c32-4e3c-a93d-2cb6d9ab9e1c req-8e81dc7b-c875-41d9-b25d-ad6c634d118d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.568 186548 DEBUG oslo_concurrency.lockutils [req-d01ca65f-1c32-4e3c-a93d-2cb6d9ab9e1c req-8e81dc7b-c875-41d9-b25d-ad6c634d118d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.569 186548 DEBUG oslo_concurrency.lockutils [req-d01ca65f-1c32-4e3c-a93d-2cb6d9ab9e1c req-8e81dc7b-c875-41d9-b25d-ad6c634d118d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.569 186548 DEBUG oslo_concurrency.lockutils [req-d01ca65f-1c32-4e3c-a93d-2cb6d9ab9e1c req-8e81dc7b-c875-41d9-b25d-ad6c634d118d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.569 186548 DEBUG nova.compute.manager [req-d01ca65f-1c32-4e3c-a93d-2cb6d9ab9e1c req-8e81dc7b-c875-41d9-b25d-ad6c634d118d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.570 186548 WARNING nova.compute.manager [req-d01ca65f-1c32-4e3c-a93d-2cb6d9ab9e1c req-8e81dc7b-c875-41d9-b25d-ad6c634d118d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state rescued and task_state unrescuing.
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.573 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[881fa9d1-5617-4522-96f3-6f2b4716772b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.585 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9344db74-8134-4ea4-8658-0fd772b2cd27]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.597 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.615 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[302fc7c9-a5ec-43b0-b72d-b0f861b5d369]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 NetworkManager[55036]: <info>  [1763798365.6229] manager: (tap06e0f3a5-90): new Veth device (/org/freedesktop/NetworkManager/Devices/178)
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.621 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[322974cb-3699-4632-911d-28e8e7ad841b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.650 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ec61dfb0-83a4-4197-a516-ef8cecba9529]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.653 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c01755-af1d-4b9b-9df6-bd62709ecbd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 NetworkManager[55036]: <info>  [1763798365.6750] device (tap06e0f3a5-90): carrier: link connected
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.680 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[320d1c28-0d0d-4c2d-86ab-2413bb317d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.698 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6abab7-a201-4ff0-b010-39f8d36d9461]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510631, 'reachable_time': 40959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228190, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.715 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[22c87422-07b8-421f-af3b-8e6a7dbf410f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:b7bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510631, 'tstamp': 510631}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228191, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.730 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[08f9f1b7-ed3a-442b-bd10-ecf7464d5cf5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap06e0f3a5-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:b7:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 118], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510631, 'reachable_time': 40959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228192, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.757 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a11833-bcaf-4bbd-a532-1b1daf7c6e22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.815 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4d80785f-7d1e-4f98-8235-fe0bbc8876d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.817 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.817 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.818 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e0f3a5-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:25 compute-0 NetworkManager[55036]: <info>  [1763798365.8205] manager: (tap06e0f3a5-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.820 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 kernel: tap06e0f3a5-90: entered promiscuous mode
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.822 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap06e0f3a5-90, col_values=(('external_ids', {'iface-id': '465da2c0-9a1c-41a9-be9a-d10bcbd7a813'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:25 compute-0 ovn_controller[94843]: 2025-11-22T07:59:25Z|00381|binding|INFO|Releasing lport 465da2c0-9a1c-41a9-be9a-d10bcbd7a813 from this chassis (sb_readonly=0)
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.824 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.836 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.837 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.838 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0713f023-c2f4-41c8-a597-ab6838b61adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.839 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: global
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.pid.haproxy
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 07:59:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:25.839 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'env', 'PROCESS_TAG=haproxy-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/06e0f3a5-911a-4244-bd9c-8cb4fa4c4794.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.893 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 823521f6-eb37-4b78-982c-526e463c834f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.893 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798365.8926516, 823521f6-eb37-4b78-982c-526e463c834f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.893 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] VM Resumed (Lifecycle Event)
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.897 186548 DEBUG nova.compute.manager [None req-7e923dcf-f33a-4929-a7d5-99d9355c796f 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.927 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.931 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.946 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] During sync_power_state the instance has a pending task (unrescuing). Skip.
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.946 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798365.8928192, 823521f6-eb37-4b78-982c-526e463c834f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.946 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] VM Started (Lifecycle Event)
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.968 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:25 compute-0 nova_compute[186544]: 2025-11-22 07:59:25.971 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:26 compute-0 podman[228232]: 2025-11-22 07:59:26.170305788 +0000 UTC m=+0.024698920 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 07:59:26 compute-0 podman[228232]: 2025-11-22 07:59:26.477152047 +0000 UTC m=+0.331545159 container create 8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 22 07:59:26 compute-0 systemd[1]: Started libpod-conmon-8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de.scope.
Nov 22 07:59:26 compute-0 systemd[1]: Started libcrun container.
Nov 22 07:59:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b655380b215dcb768db6e45470da5e483cf1a2b2cee8a66cd3ac10dbc9a055/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 07:59:26 compute-0 podman[228232]: 2025-11-22 07:59:26.653996124 +0000 UTC m=+0.508389246 container init 8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:59:26 compute-0 podman[228232]: 2025-11-22 07:59:26.664527623 +0000 UTC m=+0.518920735 container start 8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 07:59:26 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[228247]: [NOTICE]   (228251) : New worker (228253) forked
Nov 22 07:59:26 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[228247]: [NOTICE]   (228251) : Loading success.
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.647 186548 DEBUG nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.647 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.648 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.648 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.649 186548 DEBUG nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.649 186548 WARNING nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state active and task_state None.
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.649 186548 DEBUG nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.649 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.649 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.650 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.650 186548 DEBUG nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.650 186548 WARNING nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state active and task_state None.
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.650 186548 DEBUG nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.650 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.651 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.651 186548 DEBUG oslo_concurrency.lockutils [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.651 186548 DEBUG nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:27 compute-0 nova_compute[186544]: 2025-11-22 07:59:27.651 186548 WARNING nova.compute.manager [req-6df5c7d7-ed87-4ff9-b932-4c71bad617c0 req-60cb9528-3a67-4d6f-9841-f864d9022502 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state active and task_state None.
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.658 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.767 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.767 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.768 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.768 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.768 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.776 186548 INFO nova.compute.manager [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Terminating instance
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.782 186548 DEBUG nova.compute.manager [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:59:28 compute-0 kernel: tap1abe3f57-92 (unregistering): left promiscuous mode
Nov 22 07:59:28 compute-0 NetworkManager[55036]: <info>  [1763798368.8249] device (tap1abe3f57-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:59:28 compute-0 ovn_controller[94843]: 2025-11-22T07:59:28Z|00382|binding|INFO|Releasing lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 from this chassis (sb_readonly=0)
Nov 22 07:59:28 compute-0 ovn_controller[94843]: 2025-11-22T07:59:28Z|00383|binding|INFO|Setting lport 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 down in Southbound
Nov 22 07:59:28 compute-0 ovn_controller[94843]: 2025-11-22T07:59:28Z|00384|binding|INFO|Removing iface tap1abe3f57-92 ovn-installed in OVS
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.832 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:28.839 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:6a:75 10.100.0.6'], port_security=['fa:16:3e:27:6a:75 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '823521f6-eb37-4b78-982c-526e463c834f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd33c7e49baa4c7f9575824b348a0f23', 'neutron:revision_number': '8', 'neutron:security_group_ids': '40e7d412-78c2-4966-b2d0-76294ef96b0a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f082361-600e-461c-8c13-2c91c0ff7f77, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1abe3f57-9262-4dc9-a0a8-4465e0cd0702) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:28.841 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1abe3f57-9262-4dc9-a0a8-4465e0cd0702 in datapath 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 unbound from our chassis
Nov 22 07:59:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:28.844 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:59:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:28.845 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ee1987c0-756c-4476-85f0-eb5623cef872]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:28.846 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 namespace which is not needed anymore
Nov 22 07:59:28 compute-0 nova_compute[186544]: 2025-11-22 07:59:28.848 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:28 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000055.scope: Deactivated successfully.
Nov 22 07:59:28 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000055.scope: Consumed 3.323s CPU time.
Nov 22 07:59:28 compute-0 systemd-machined[152872]: Machine qemu-46-instance-00000055 terminated.
Nov 22 07:59:28 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[228247]: [NOTICE]   (228251) : haproxy version is 2.8.14-c23fe91
Nov 22 07:59:28 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[228247]: [NOTICE]   (228251) : path to executable is /usr/sbin/haproxy
Nov 22 07:59:28 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[228247]: [WARNING]  (228251) : Exiting Master process...
Nov 22 07:59:28 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[228247]: [ALERT]    (228251) : Current worker (228253) exited with code 143 (Terminated)
Nov 22 07:59:28 compute-0 neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794[228247]: [WARNING]  (228251) : All workers exited. Exiting... (0)
Nov 22 07:59:28 compute-0 systemd[1]: libpod-8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de.scope: Deactivated successfully.
Nov 22 07:59:28 compute-0 podman[228285]: 2025-11-22 07:59:28.984729542 +0000 UTC m=+0.052632047 container died 8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.012 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de-userdata-shm.mount: Deactivated successfully.
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.017 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-f0b655380b215dcb768db6e45470da5e483cf1a2b2cee8a66cd3ac10dbc9a055-merged.mount: Deactivated successfully.
Nov 22 07:59:29 compute-0 podman[228285]: 2025-11-22 07:59:29.044115566 +0000 UTC m=+0.112018071 container cleanup 8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 07:59:29 compute-0 systemd[1]: libpod-conmon-8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de.scope: Deactivated successfully.
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.060 186548 INFO nova.virt.libvirt.driver [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Instance destroyed successfully.
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.062 186548 DEBUG nova.objects.instance [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lazy-loading 'resources' on Instance uuid 823521f6-eb37-4b78-982c-526e463c834f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.074 186548 DEBUG nova.virt.libvirt.vif [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-517382055',display_name='tempest-ServerStableDeviceRescueTest-server-517382055',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-517382055',id=85,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:59:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd33c7e49baa4c7f9575824b348a0f23',ramdisk_id='',reservation_id='r-iqw87cv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-455223381',owner_user_name='tempest-ServerStableDeviceRescueTest-455223381-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:59:25Z,user_data=None,user_id='0d84421d986b40f481c0caef764443e2',uuid=823521f6-eb37-4b78-982c-526e463c834f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.074 186548 DEBUG nova.network.os_vif_util [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converting VIF {"id": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "address": "fa:16:3e:27:6a:75", "network": {"id": "06e0f3a5-911a-4244-bd9c-8cb4fa4c4794", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-960378838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd33c7e49baa4c7f9575824b348a0f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1abe3f57-92", "ovs_interfaceid": "1abe3f57-9262-4dc9-a0a8-4465e0cd0702", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.075 186548 DEBUG nova.network.os_vif_util [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:6a:75,bridge_name='br-int',has_traffic_filtering=True,id=1abe3f57-9262-4dc9-a0a8-4465e0cd0702,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1abe3f57-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.076 186548 DEBUG os_vif [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:6a:75,bridge_name='br-int',has_traffic_filtering=True,id=1abe3f57-9262-4dc9-a0a8-4465e0cd0702,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1abe3f57-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.077 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.078 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1abe3f57-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.081 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.083 186548 INFO os_vif [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:6a:75,bridge_name='br-int',has_traffic_filtering=True,id=1abe3f57-9262-4dc9-a0a8-4465e0cd0702,network=Network(06e0f3a5-911a-4244-bd9c-8cb4fa4c4794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1abe3f57-92')
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.084 186548 INFO nova.virt.libvirt.driver [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Deleting instance files /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f_del
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.085 186548 INFO nova.virt.libvirt.driver [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Deletion of /var/lib/nova/instances/823521f6-eb37-4b78-982c-526e463c834f_del complete
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.147 186548 INFO nova.compute.manager [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.148 186548 DEBUG oslo.service.loopingcall [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.148 186548 DEBUG nova.compute.manager [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.149 186548 DEBUG nova.network.neutron [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:59:29 compute-0 podman[228330]: 2025-11-22 07:59:29.478317872 +0000 UTC m=+0.411860827 container remove 8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.483 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f1eccb-acb4-4aea-944b-2c944d02e1f0]: (4, ('Sat Nov 22 07:59:28 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de)\n8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de\nSat Nov 22 07:59:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 (8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de)\n8d3d8d0205557744f239918b1a475a9ab1c670e2453b093046c6f1e8be16e7de\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.486 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1ba81a49-8205-4596-9119-d0dfed8a3acb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.487 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e0f3a5-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.489 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:29 compute-0 kernel: tap06e0f3a5-90: left promiscuous mode
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.503 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.507 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca5b4e3-2dd4-4a85-9926-c79ace111bbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.521 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[de3cf849-5a09-4443-afce-cf31e864607c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.524 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc87a70-4742-44aa-bea7-2380d6952f00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.540 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[77cd5f69-d7c4-48b1-bc1d-7b4fe1519716]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510624, 'reachable_time': 20647, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228348, 'error': None, 'target': 'ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:29 compute-0 systemd[1]: run-netns-ovnmeta\x2d06e0f3a5\x2d911a\x2d4244\x2dbd9c\x2d8cb4fa4c4794.mount: Deactivated successfully.
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.546 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-06e0f3a5-911a-4244-bd9c-8cb4fa4c4794 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:59:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:29.546 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[6d87374e-1037-40e8-98e7-85cd11631a33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:29 compute-0 podman[228340]: 2025-11-22 07:59:29.58134309 +0000 UTC m=+0.061948197 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.777 186548 DEBUG nova.compute.manager [req-a09daf79-5780-4b65-a5b5-ca973f53e102 req-e427230f-26c5-4eeb-95d9-1a81e06a576c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.777 186548 DEBUG oslo_concurrency.lockutils [req-a09daf79-5780-4b65-a5b5-ca973f53e102 req-e427230f-26c5-4eeb-95d9-1a81e06a576c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.777 186548 DEBUG oslo_concurrency.lockutils [req-a09daf79-5780-4b65-a5b5-ca973f53e102 req-e427230f-26c5-4eeb-95d9-1a81e06a576c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.778 186548 DEBUG oslo_concurrency.lockutils [req-a09daf79-5780-4b65-a5b5-ca973f53e102 req-e427230f-26c5-4eeb-95d9-1a81e06a576c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.778 186548 DEBUG nova.compute.manager [req-a09daf79-5780-4b65-a5b5-ca973f53e102 req-e427230f-26c5-4eeb-95d9-1a81e06a576c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:29 compute-0 nova_compute[186544]: 2025-11-22 07:59:29.779 186548 DEBUG nova.compute.manager [req-a09daf79-5780-4b65-a5b5-ca973f53e102 req-e427230f-26c5-4eeb-95d9-1a81e06a576c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-unplugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.600 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.673 186548 DEBUG nova.network.neutron [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.687 186548 INFO nova.compute.manager [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Took 1.54 seconds to deallocate network for instance.
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.737 186548 DEBUG nova.compute.manager [req-5ff9aeb9-255f-45c4-938b-d62ecc9e9fc2 req-e4110f50-ee5e-45cf-8e61-526b12e263b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-deleted-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.762 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.763 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.840 186548 DEBUG nova.compute.provider_tree [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.854 186548 DEBUG nova.scheduler.client.report [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.870 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.900 186548 INFO nova.scheduler.client.report [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Deleted allocations for instance 823521f6-eb37-4b78-982c-526e463c834f
Nov 22 07:59:30 compute-0 nova_compute[186544]: 2025-11-22 07:59:30.981 186548 DEBUG oslo_concurrency.lockutils [None req-8072e1dd-bb07-4954-bc40-aeb1e5b23c5b 0d84421d986b40f481c0caef764443e2 fd33c7e49baa4c7f9575824b348a0f23 - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.603 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "45b35bfb-92b8-4947-bca0-dca87e484f28" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.604 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.604 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.604 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.605 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.613 186548 INFO nova.compute.manager [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Terminating instance
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.620 186548 DEBUG nova.compute.manager [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 07:59:31 compute-0 kernel: tap55f8379a-37 (unregistering): left promiscuous mode
Nov 22 07:59:31 compute-0 NetworkManager[55036]: <info>  [1763798371.6403] device (tap55f8379a-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 07:59:31 compute-0 ovn_controller[94843]: 2025-11-22T07:59:31Z|00385|binding|INFO|Releasing lport 55f8379a-37c3-44ee-87da-59e3f28fcabb from this chassis (sb_readonly=0)
Nov 22 07:59:31 compute-0 ovn_controller[94843]: 2025-11-22T07:59:31Z|00386|binding|INFO|Setting lport 55f8379a-37c3-44ee-87da-59e3f28fcabb down in Southbound
Nov 22 07:59:31 compute-0 ovn_controller[94843]: 2025-11-22T07:59:31Z|00387|binding|INFO|Removing iface tap55f8379a-37 ovn-installed in OVS
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.648 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.650 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.665 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:02:bc 10.100.0.5'], port_security=['fa:16:3e:5d:02:bc 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '45b35bfb-92b8-4947-bca0-dca87e484f28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16ccb24424c54ae1a1b0d7eef6f7d690', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4820a7f-a658-410a-b393-c754d89b7982', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ac2bec8-4c70-4af1-8a46-6da94edec63d, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=55f8379a-37c3-44ee-87da-59e3f28fcabb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.667 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.667 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 55f8379a-37c3-44ee-87da-59e3f28fcabb in datapath d6148823-d007-4a7e-be44-4329f8ecc6e5 unbound from our chassis
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.669 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6148823-d007-4a7e-be44-4329f8ecc6e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.670 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2fae0af6-90ed-4f6b-b9c6-46fe89c7c773]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.671 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 namespace which is not needed anymore
Nov 22 07:59:31 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000058.scope: Deactivated successfully.
Nov 22 07:59:31 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000058.scope: Consumed 10.773s CPU time.
Nov 22 07:59:31 compute-0 systemd-machined[152872]: Machine qemu-45-instance-00000058 terminated.
Nov 22 07:59:31 compute-0 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[228025]: [NOTICE]   (228029) : haproxy version is 2.8.14-c23fe91
Nov 22 07:59:31 compute-0 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[228025]: [NOTICE]   (228029) : path to executable is /usr/sbin/haproxy
Nov 22 07:59:31 compute-0 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[228025]: [WARNING]  (228029) : Exiting Master process...
Nov 22 07:59:31 compute-0 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[228025]: [ALERT]    (228029) : Current worker (228031) exited with code 143 (Terminated)
Nov 22 07:59:31 compute-0 neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5[228025]: [WARNING]  (228029) : All workers exited. Exiting... (0)
Nov 22 07:59:31 compute-0 systemd[1]: libpod-1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30.scope: Deactivated successfully.
Nov 22 07:59:31 compute-0 podman[228385]: 2025-11-22 07:59:31.820146814 +0000 UTC m=+0.062481510 container died 1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 07:59:31 compute-0 NetworkManager[55036]: <info>  [1763798371.8384] manager: (tap55f8379a-37): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Nov 22 07:59:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-0a033426b7292f6409be502fc3049a4074e4837a31cece28bc19b1c43dc8fc2f-merged.mount: Deactivated successfully.
Nov 22 07:59:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30-userdata-shm.mount: Deactivated successfully.
Nov 22 07:59:31 compute-0 podman[228385]: 2025-11-22 07:59:31.875054527 +0000 UTC m=+0.117389213 container cleanup 1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 07:59:31 compute-0 systemd[1]: libpod-conmon-1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30.scope: Deactivated successfully.
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.889 186548 INFO nova.virt.libvirt.driver [-] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Instance destroyed successfully.
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.889 186548 DEBUG nova.objects.instance [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lazy-loading 'resources' on Instance uuid 45b35bfb-92b8-4947-bca0-dca87e484f28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.938 186548 DEBUG nova.virt.libvirt.vif [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:59:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1317504943',display_name='tempest-ListServersNegativeTestJSON-server-1317504943-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1317504943-3',id=88,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-11-22T07:59:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16ccb24424c54ae1a1b0d7eef6f7d690',ramdisk_id='',reservation_id='r-ez5qyzoz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1715955177',owner_user_name='tempest-ListServersNegativeTestJSON-1715955177-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T07:59:21Z,user_data=None,user_id='cf1790780fd64791b117114d170d6d90',uuid=45b35bfb-92b8-4947-bca0-dca87e484f28,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.938 186548 DEBUG nova.network.os_vif_util [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converting VIF {"id": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "address": "fa:16:3e:5d:02:bc", "network": {"id": "d6148823-d007-4a7e-be44-4329f8ecc6e5", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1122516717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16ccb24424c54ae1a1b0d7eef6f7d690", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap55f8379a-37", "ovs_interfaceid": "55f8379a-37c3-44ee-87da-59e3f28fcabb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.938 186548 DEBUG nova.network.os_vif_util [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:02:bc,bridge_name='br-int',has_traffic_filtering=True,id=55f8379a-37c3-44ee-87da-59e3f28fcabb,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55f8379a-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.939 186548 DEBUG os_vif [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:02:bc,bridge_name='br-int',has_traffic_filtering=True,id=55f8379a-37c3-44ee-87da-59e3f28fcabb,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55f8379a-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.940 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.941 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap55f8379a-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.942 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 podman[228431]: 2025-11-22 07:59:31.942773535 +0000 UTC m=+0.044672811 container remove 1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.944 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.946 186548 INFO os_vif [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:02:bc,bridge_name='br-int',has_traffic_filtering=True,id=55f8379a-37c3-44ee-87da-59e3f28fcabb,network=Network(d6148823-d007-4a7e-be44-4329f8ecc6e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap55f8379a-37')
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.946 186548 INFO nova.virt.libvirt.driver [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Deleting instance files /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28_del
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.947 186548 INFO nova.virt.libvirt.driver [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Deletion of /var/lib/nova/instances/45b35bfb-92b8-4947-bca0-dca87e484f28_del complete
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.948 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0322f35a-56b8-4050-991c-d51de0099736]: (4, ('Sat Nov 22 07:59:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 (1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30)\n1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30\nSat Nov 22 07:59:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 (1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30)\n1dffa3633c876944929abec3009880aa5b3d3a2ebf101269cc3f2b3535792b30\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.949 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd250cc-e085-470a-ade1-e8b5001d0892]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.950 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6148823-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.952 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 kernel: tapd6148823-d0: left promiscuous mode
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.954 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.958 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[11e51547-bc96-4c38-a2ea-97047e3d8e93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.965 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.981 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[79c66df0-d924-47c9-964a-9d84d48aea29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:31.983 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5f9b76-efc2-4f98-807c-2a2682fd432e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.984 186548 DEBUG nova.compute.manager [req-e96aca9f-7d31-4ef7-a19a-ffc81f446ad3 req-97a2d076-9b28-46fc-a739-3a922a2a90e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received event network-vif-unplugged-55f8379a-37c3-44ee-87da-59e3f28fcabb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.985 186548 DEBUG oslo_concurrency.lockutils [req-e96aca9f-7d31-4ef7-a19a-ffc81f446ad3 req-97a2d076-9b28-46fc-a739-3a922a2a90e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.986 186548 DEBUG oslo_concurrency.lockutils [req-e96aca9f-7d31-4ef7-a19a-ffc81f446ad3 req-97a2d076-9b28-46fc-a739-3a922a2a90e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.986 186548 DEBUG oslo_concurrency.lockutils [req-e96aca9f-7d31-4ef7-a19a-ffc81f446ad3 req-97a2d076-9b28-46fc-a739-3a922a2a90e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.986 186548 DEBUG nova.compute.manager [req-e96aca9f-7d31-4ef7-a19a-ffc81f446ad3 req-97a2d076-9b28-46fc-a739-3a922a2a90e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] No waiting events found dispatching network-vif-unplugged-55f8379a-37c3-44ee-87da-59e3f28fcabb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:31 compute-0 nova_compute[186544]: 2025-11-22 07:59:31.986 186548 DEBUG nova.compute.manager [req-e96aca9f-7d31-4ef7-a19a-ffc81f446ad3 req-97a2d076-9b28-46fc-a739-3a922a2a90e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received event network-vif-unplugged-55f8379a-37c3-44ee-87da-59e3f28fcabb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 07:59:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:32.000 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2181ab8c-5d07-4036-8d24-5e9239555654]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510237, 'reachable_time': 28420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228446, 'error': None, 'target': 'ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:32.003 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6148823-d007-4a7e-be44-4329f8ecc6e5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 07:59:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:32.003 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[f3bdbbc3-95c9-4fb5-877d-7fe426c56f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 07:59:32 compute-0 systemd[1]: run-netns-ovnmeta\x2dd6148823\x2dd007\x2d4a7e\x2dbe44\x2d4329f8ecc6e5.mount: Deactivated successfully.
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.015 186548 DEBUG nova.compute.manager [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.016 186548 DEBUG oslo_concurrency.lockutils [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "823521f6-eb37-4b78-982c-526e463c834f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.016 186548 DEBUG oslo_concurrency.lockutils [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.016 186548 DEBUG oslo_concurrency.lockutils [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "823521f6-eb37-4b78-982c-526e463c834f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.017 186548 DEBUG nova.compute.manager [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] No waiting events found dispatching network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.017 186548 WARNING nova.compute.manager [req-42fcaf55-9373-46cd-a0dc-976702d63ef4 req-d8081fd2-310f-4037-ba56-00e10d198e76 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Received unexpected event network-vif-plugged-1abe3f57-9262-4dc9-a0a8-4465e0cd0702 for instance with vm_state deleted and task_state None.
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.062 186548 INFO nova.compute.manager [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Took 0.44 seconds to destroy the instance on the hypervisor.
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.062 186548 DEBUG oslo.service.loopingcall [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.062 186548 DEBUG nova.compute.manager [-] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.063 186548 DEBUG nova.network.neutron [-] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.835 186548 DEBUG nova.network.neutron [-] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.890 186548 INFO nova.compute.manager [-] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Took 0.83 seconds to deallocate network for instance.
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.970 186548 DEBUG nova.compute.manager [req-77d5fa6a-6657-4ef3-8046-670944b3712d req-2663e081-967f-43dc-8ff5-a8f12bf07ca4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received event network-vif-deleted-55f8379a-37c3-44ee-87da-59e3f28fcabb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.984 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:32 compute-0 nova_compute[186544]: 2025-11-22 07:59:32.985 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:33 compute-0 nova_compute[186544]: 2025-11-22 07:59:33.033 186548 DEBUG nova.compute.provider_tree [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:59:33 compute-0 nova_compute[186544]: 2025-11-22 07:59:33.044 186548 DEBUG nova.scheduler.client.report [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:59:33 compute-0 nova_compute[186544]: 2025-11-22 07:59:33.064 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:33 compute-0 nova_compute[186544]: 2025-11-22 07:59:33.093 186548 INFO nova.scheduler.client.report [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Deleted allocations for instance 45b35bfb-92b8-4947-bca0-dca87e484f28
Nov 22 07:59:33 compute-0 nova_compute[186544]: 2025-11-22 07:59:33.226 186548 DEBUG oslo_concurrency.lockutils [None req-7ea3a69c-eb10-4ed7-a684-b1af757699a2 cf1790780fd64791b117114d170d6d90 16ccb24424c54ae1a1b0d7eef6f7d690 - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:34 compute-0 nova_compute[186544]: 2025-11-22 07:59:34.107 186548 DEBUG nova.compute.manager [req-fa9a415e-619c-4983-b52d-bfb91ae9cc92 req-6e3b79d6-d5e4-42c2-ad5d-8b18a9031df4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received event network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 07:59:34 compute-0 nova_compute[186544]: 2025-11-22 07:59:34.108 186548 DEBUG oslo_concurrency.lockutils [req-fa9a415e-619c-4983-b52d-bfb91ae9cc92 req-6e3b79d6-d5e4-42c2-ad5d-8b18a9031df4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:34 compute-0 nova_compute[186544]: 2025-11-22 07:59:34.109 186548 DEBUG oslo_concurrency.lockutils [req-fa9a415e-619c-4983-b52d-bfb91ae9cc92 req-6e3b79d6-d5e4-42c2-ad5d-8b18a9031df4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:34 compute-0 nova_compute[186544]: 2025-11-22 07:59:34.109 186548 DEBUG oslo_concurrency.lockutils [req-fa9a415e-619c-4983-b52d-bfb91ae9cc92 req-6e3b79d6-d5e4-42c2-ad5d-8b18a9031df4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "45b35bfb-92b8-4947-bca0-dca87e484f28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:34 compute-0 nova_compute[186544]: 2025-11-22 07:59:34.110 186548 DEBUG nova.compute.manager [req-fa9a415e-619c-4983-b52d-bfb91ae9cc92 req-6e3b79d6-d5e4-42c2-ad5d-8b18a9031df4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] No waiting events found dispatching network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 07:59:34 compute-0 nova_compute[186544]: 2025-11-22 07:59:34.110 186548 WARNING nova.compute.manager [req-fa9a415e-619c-4983-b52d-bfb91ae9cc92 req-6e3b79d6-d5e4-42c2-ad5d-8b18a9031df4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Received unexpected event network-vif-plugged-55f8379a-37c3-44ee-87da-59e3f28fcabb for instance with vm_state deleted and task_state None.
Nov 22 07:59:34 compute-0 podman[228447]: 2025-11-22 07:59:34.401063347 +0000 UTC m=+0.051162063 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 07:59:35 compute-0 nova_compute[186544]: 2025-11-22 07:59:35.601 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:36 compute-0 podman[228472]: 2025-11-22 07:59:36.422971507 +0000 UTC m=+0.076669890 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 07:59:36 compute-0 nova_compute[186544]: 2025-11-22 07:59:36.942 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:37.327 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:37.328 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 07:59:37.328 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:37 compute-0 nova_compute[186544]: 2025-11-22 07:59:37.913 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:40 compute-0 nova_compute[186544]: 2025-11-22 07:59:40.603 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:41 compute-0 nova_compute[186544]: 2025-11-22 07:59:41.944 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:44 compute-0 nova_compute[186544]: 2025-11-22 07:59:44.059 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798369.0585604, 823521f6-eb37-4b78-982c-526e463c834f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:44 compute-0 nova_compute[186544]: 2025-11-22 07:59:44.060 186548 INFO nova.compute.manager [-] [instance: 823521f6-eb37-4b78-982c-526e463c834f] VM Stopped (Lifecycle Event)
Nov 22 07:59:44 compute-0 nova_compute[186544]: 2025-11-22 07:59:44.094 186548 DEBUG nova.compute.manager [None req-742ee698-7e84-4eed-bc20-d75610ccd790 - - - - - -] [instance: 823521f6-eb37-4b78-982c-526e463c834f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:44 compute-0 nova_compute[186544]: 2025-11-22 07:59:44.203 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:45 compute-0 nova_compute[186544]: 2025-11-22 07:59:45.604 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.187 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.371 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.372 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5633MB free_disk=73.28063201904297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.372 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.373 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.449 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.450 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.474 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.485 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.529 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.529 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.885 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798371.8844872, 45b35bfb-92b8-4947-bca0-dca87e484f28 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.886 186548 INFO nova.compute.manager [-] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] VM Stopped (Lifecycle Event)
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.911 186548 DEBUG nova.compute.manager [None req-c2d4dd1a-220b-47c0-b535-67e1bbe26443 - - - - - -] [instance: 45b35bfb-92b8-4947-bca0-dca87e484f28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:46 compute-0 nova_compute[186544]: 2025-11-22 07:59:46.945 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:47 compute-0 nova_compute[186544]: 2025-11-22 07:59:47.530 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:47 compute-0 nova_compute[186544]: 2025-11-22 07:59:47.530 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 07:59:47 compute-0 nova_compute[186544]: 2025-11-22 07:59:47.530 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 07:59:47 compute-0 nova_compute[186544]: 2025-11-22 07:59:47.541 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 07:59:48 compute-0 podman[228495]: 2025-11-22 07:59:48.415027677 +0000 UTC m=+0.059542845 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 07:59:48 compute-0 podman[228496]: 2025-11-22 07:59:48.437049922 +0000 UTC m=+0.081009607 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 07:59:49 compute-0 nova_compute[186544]: 2025-11-22 07:59:49.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:49 compute-0 nova_compute[186544]: 2025-11-22 07:59:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:49 compute-0 nova_compute[186544]: 2025-11-22 07:59:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:49 compute-0 nova_compute[186544]: 2025-11-22 07:59:49.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 07:59:50 compute-0 nova_compute[186544]: 2025-11-22 07:59:50.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:51 compute-0 nova_compute[186544]: 2025-11-22 07:59:51.947 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:52 compute-0 podman[228538]: 2025-11-22 07:59:52.403114475 +0000 UTC m=+0.054067230 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 07:59:53 compute-0 nova_compute[186544]: 2025-11-22 07:59:53.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:53 compute-0 nova_compute[186544]: 2025-11-22 07:59:53.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:54 compute-0 nova_compute[186544]: 2025-11-22 07:59:54.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 07:59:54 compute-0 podman[228560]: 2025-11-22 07:59:54.414336915 +0000 UTC m=+0.056864359 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 07:59:55 compute-0 nova_compute[186544]: 2025-11-22 07:59:55.609 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:56 compute-0 nova_compute[186544]: 2025-11-22 07:59:56.948 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.188 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.189 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.210 186548 DEBUG nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.297 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.297 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.314 186548 DEBUG nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.317 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.317 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.324 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.324 186548 INFO nova.compute.claims [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.472 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.510 186548 DEBUG nova.compute.provider_tree [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.526 186548 DEBUG nova.scheduler.client.report [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.546 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.547 186548 DEBUG nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.551 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.557 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.558 186548 INFO nova.compute.claims [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Claim successful on node compute-0.ctlplane.example.com
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.613 186548 DEBUG nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.661 186548 INFO nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.680 186548 DEBUG nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.686 186548 DEBUG nova.compute.provider_tree [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.714 186548 DEBUG nova.scheduler.client.report [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.744 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.745 186548 DEBUG nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.806 186548 DEBUG nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.807 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.807 186548 INFO nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Creating image(s)
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.808 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "/var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.808 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "/var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.809 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "/var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.825 186548 DEBUG nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.826 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.847 186548 INFO nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.864 186548 DEBUG nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.884 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.884 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.885 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.896 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.960 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.961 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.985 186548 DEBUG nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.987 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.988 186548 INFO nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Creating image(s)
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.989 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.989 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:57 compute-0 nova_compute[186544]: 2025-11-22 07:59:57.990 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.007 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.025 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk 1073741824" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.026 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.027 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.062 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.063 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.064 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.080 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.099 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.100 186548 DEBUG nova.virt.disk.api [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Checking if we can resize image /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.101 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.137 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.138 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.158 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.159 186548 DEBUG nova.virt.disk.api [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Cannot resize image /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.160 186548 DEBUG nova.objects.instance [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'migration_context' on Instance uuid e53bd443-98ed-4e38-bff6-3f43fe77ade8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.170 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.170 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Ensure instance console log exists: /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.171 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.171 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.171 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.173 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.176 186548 WARNING nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.182 186548 DEBUG nova.virt.libvirt.host [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.182 186548 DEBUG nova.virt.libvirt.host [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.185 186548 DEBUG nova.virt.libvirt.host [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.186 186548 DEBUG nova.virt.libvirt.host [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.187 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.187 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.187 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.188 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.188 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.188 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.188 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.188 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.189 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.189 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.189 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.189 186548 DEBUG nova.virt.hardware [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.193 186548 DEBUG nova.objects.instance [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'pci_devices' on Instance uuid e53bd443-98ed-4e38-bff6-3f43fe77ade8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.194 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.195 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.195 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.215 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <uuid>e53bd443-98ed-4e38-bff6-3f43fe77ade8</uuid>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <name>instance-00000059</name>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerShowV247Test-server-465401543</nova:name>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:59:58</nova:creationTime>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:user uuid="1e19acf488494c4184a911a745e39b4b">tempest-ServerShowV247Test-572024602-project-member</nova:user>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:project uuid="b82ac9be9aa74af28bd41198152daea3">tempest-ServerShowV247Test-572024602</nova:project>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <system>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="serial">e53bd443-98ed-4e38-bff6-3f43fe77ade8</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="uuid">e53bd443-98ed-4e38-bff6-3f43fe77ade8</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </system>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <os>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </os>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <features>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </features>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk.config"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/console.log" append="off"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <video>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </video>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:59:58 compute-0 nova_compute[186544]: </domain>
Nov 22 07:59:58 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.250 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.252 186548 DEBUG nova.virt.disk.api [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Checking if we can resize image /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.252 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.276 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.276 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.277 186548 INFO nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Using config drive
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.309 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.310 186548 DEBUG nova.virt.disk.api [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Cannot resize image /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.311 186548 DEBUG nova.objects.instance [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'migration_context' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.348 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.349 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Ensure instance console log exists: /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.349 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.350 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.350 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.352 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.356 186548 WARNING nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.359 186548 DEBUG nova.virt.libvirt.host [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.360 186548 DEBUG nova.virt.libvirt.host [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.363 186548 DEBUG nova.virt.libvirt.host [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.363 186548 DEBUG nova.virt.libvirt.host [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.364 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.365 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.365 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.366 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.366 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.366 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.366 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.367 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.367 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.367 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.368 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.368 186548 DEBUG nova.virt.hardware [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.371 186548 DEBUG nova.objects.instance [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'pci_devices' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.390 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] End _get_guest_xml xml=<domain type="kvm">
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <uuid>b6c62c3d-be2c-46d2-9bb1-26390c5f9593</uuid>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <name>instance-0000005a</name>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <metadata>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerShowV247Test-server-734969100</nova:name>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 07:59:58</nova:creationTime>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:user uuid="1e19acf488494c4184a911a745e39b4b">tempest-ServerShowV247Test-572024602-project-member</nova:user>
Nov 22 07:59:58 compute-0 nova_compute[186544]:         <nova:project uuid="b82ac9be9aa74af28bd41198152daea3">tempest-ServerShowV247Test-572024602</nova:project>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </metadata>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <system>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="serial">b6c62c3d-be2c-46d2-9bb1-26390c5f9593</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="uuid">b6c62c3d-be2c-46d2-9bb1-26390c5f9593</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </system>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <os>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </os>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <features>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <apic/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </features>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </clock>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </cpu>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   <devices>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.config"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </disk>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/console.log" append="off"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </serial>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <video>
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </video>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </rng>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 07:59:58 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 07:59:58 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 07:59:58 compute-0 nova_compute[186544]:   </devices>
Nov 22 07:59:58 compute-0 nova_compute[186544]: </domain>
Nov 22 07:59:58 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.468 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.469 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.469 186548 INFO nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Using config drive
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.472 186548 INFO nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Creating config drive at /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk.config
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.476 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfcd0c6sl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.600 186548 DEBUG oslo_concurrency.processutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfcd0c6sl" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.639 186548 INFO nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Creating config drive at /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.config
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.646 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplzfe4pn4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 07:59:58 compute-0 systemd-machined[152872]: New machine qemu-47-instance-00000059.
Nov 22 07:59:58 compute-0 systemd[1]: Started Virtual Machine qemu-47-instance-00000059.
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.773 186548 DEBUG oslo_concurrency.processutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplzfe4pn4" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 07:59:58 compute-0 systemd-machined[152872]: New machine qemu-48-instance-0000005a.
Nov 22 07:59:58 compute-0 systemd[1]: Started Virtual Machine qemu-48-instance-0000005a.
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.967 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798398.9666839, e53bd443-98ed-4e38-bff6-3f43fe77ade8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.967 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] VM Resumed (Lifecycle Event)
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.971 186548 DEBUG nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.971 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.977 186548 INFO nova.virt.libvirt.driver [-] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Instance spawned successfully.
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.977 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:59:58 compute-0 nova_compute[186544]: 2025-11-22 07:59:58.990 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.005 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.005 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.005 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.006 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.006 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.006 186548 DEBUG nova.virt.libvirt.driver [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.010 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.042 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.043 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798398.9669683, e53bd443-98ed-4e38-bff6-3f43fe77ade8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.043 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] VM Started (Lifecycle Event)
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.058 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.061 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.099 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.141 186548 INFO nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Took 1.34 seconds to spawn the instance on the hypervisor.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.142 186548 DEBUG nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.339 186548 INFO nova.compute.manager [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Took 2.06 seconds to build instance.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.407 186548 DEBUG oslo_concurrency.lockutils [None req-07de4265-83b2-4176-8810-6cb455fcca6c 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.471 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798399.4716468, b6c62c3d-be2c-46d2-9bb1-26390c5f9593 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.472 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] VM Resumed (Lifecycle Event)
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.474 186548 DEBUG nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.475 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.479 186548 INFO nova.virt.libvirt.driver [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance spawned successfully.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.479 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.491 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.493 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.501 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.502 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.502 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.503 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.503 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.503 186548 DEBUG nova.virt.libvirt.driver [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.506 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.507 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798399.4738648, b6c62c3d-be2c-46d2-9bb1-26390c5f9593 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.507 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] VM Started (Lifecycle Event)
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.528 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.531 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.557 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.705 186548 INFO nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Took 1.72 seconds to spawn the instance on the hypervisor.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.705 186548 DEBUG nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.907 186548 INFO nova.compute.manager [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Took 2.46 seconds to build instance.
Nov 22 07:59:59 compute-0 nova_compute[186544]: 2025-11-22 07:59:59.995 186548 DEBUG oslo_concurrency.lockutils [None req-23ee5ba2-31df-40b8-866f-c7d8bd43d847 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:00 compute-0 podman[228663]: 2025-11-22 08:00:00.415777775 +0000 UTC m=+0.064284163 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 08:00:00 compute-0 nova_compute[186544]: 2025-11-22 08:00:00.611 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:01 compute-0 nova_compute[186544]: 2025-11-22 08:00:01.950 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:02 compute-0 nova_compute[186544]: 2025-11-22 08:00:02.063 186548 INFO nova.compute.manager [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Rebuilding instance
Nov 22 08:00:02 compute-0 nova_compute[186544]: 2025-11-22 08:00:02.402 186548 DEBUG nova.compute.manager [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:02 compute-0 nova_compute[186544]: 2025-11-22 08:00:02.473 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'pci_requests' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:02 compute-0 nova_compute[186544]: 2025-11-22 08:00:02.488 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'pci_devices' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:02 compute-0 nova_compute[186544]: 2025-11-22 08:00:02.497 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'resources' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:02 compute-0 nova_compute[186544]: 2025-11-22 08:00:02.507 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'migration_context' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:02 compute-0 nova_compute[186544]: 2025-11-22 08:00:02.519 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:00:02 compute-0 nova_compute[186544]: 2025-11-22 08:00:02.522 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:00:05 compute-0 nova_compute[186544]: 2025-11-22 08:00:05.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:05 compute-0 podman[228683]: 2025-11-22 08:00:05.409105512 +0000 UTC m=+0.048737688 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:00:05 compute-0 nova_compute[186544]: 2025-11-22 08:00:05.612 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:06 compute-0 nova_compute[186544]: 2025-11-22 08:00:06.952 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:07 compute-0 podman[228707]: 2025-11-22 08:00:07.402175231 +0000 UTC m=+0.049312271 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Nov 22 08:00:10 compute-0 nova_compute[186544]: 2025-11-22 08:00:10.615 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:11 compute-0 nova_compute[186544]: 2025-11-22 08:00:11.954 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:12 compute-0 nova_compute[186544]: 2025-11-22 08:00:12.561 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 08:00:15 compute-0 nova_compute[186544]: 2025-11-22 08:00:15.616 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:16 compute-0 nova_compute[186544]: 2025-11-22 08:00:16.956 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:19 compute-0 podman[228768]: 2025-11-22 08:00:19.411201955 +0000 UTC m=+0.064233592 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:00:19 compute-0 podman[228769]: 2025-11-22 08:00:19.445051023 +0000 UTC m=+0.091430806 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:00:20 compute-0 nova_compute[186544]: 2025-11-22 08:00:20.617 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:21 compute-0 nova_compute[186544]: 2025-11-22 08:00:21.958 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:23 compute-0 podman[228815]: 2025-11-22 08:00:23.40320494 +0000 UTC m=+0.053588848 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:00:23 compute-0 nova_compute[186544]: 2025-11-22 08:00:23.603 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 08:00:25 compute-0 podman[228840]: 2025-11-22 08:00:25.400495645 +0000 UTC m=+0.052923872 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:00:25 compute-0 nova_compute[186544]: 2025-11-22 08:00:25.618 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:26 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Nov 22 08:00:26 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005a.scope: Consumed 16.111s CPU time.
Nov 22 08:00:26 compute-0 systemd-machined[152872]: Machine qemu-48-instance-0000005a terminated.
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.616 186548 INFO nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance shutdown successfully after 24 seconds.
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.622 186548 INFO nova.virt.libvirt.driver [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance destroyed successfully.
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.626 186548 INFO nova.virt.libvirt.driver [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance destroyed successfully.
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.626 186548 INFO nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Deleting instance files /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593_del
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.627 186548 INFO nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Deletion of /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593_del complete
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.845 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.846 186548 INFO nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Creating image(s)
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.846 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.847 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.848 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.865 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.933 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.935 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.935 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.951 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:26 compute-0 nova_compute[186544]: 2025-11-22 08:00:26.969 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.008 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.009 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.051 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.052 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.052 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.107 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.108 186548 DEBUG nova.virt.disk.api [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Checking if we can resize image /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.108 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.167 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.168 186548 DEBUG nova.virt.disk.api [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Cannot resize image /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.169 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.169 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Ensure instance console log exists: /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.169 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.170 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.170 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.171 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.176 186548 WARNING nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.190 186548 DEBUG nova.virt.libvirt.host [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.191 186548 DEBUG nova.virt.libvirt.host [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.194 186548 DEBUG nova.virt.libvirt.host [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.195 186548 DEBUG nova.virt.libvirt.host [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.196 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.196 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.197 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.197 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.197 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.197 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.198 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.198 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.198 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.198 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.198 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.199 186548 DEBUG nova.virt.hardware [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.199 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.215 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <uuid>b6c62c3d-be2c-46d2-9bb1-26390c5f9593</uuid>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <name>instance-0000005a</name>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerShowV247Test-server-734969100</nova:name>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:00:27</nova:creationTime>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:00:27 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:00:27 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:00:27 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:00:27 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:00:27 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:00:27 compute-0 nova_compute[186544]:         <nova:user uuid="1e19acf488494c4184a911a745e39b4b">tempest-ServerShowV247Test-572024602-project-member</nova:user>
Nov 22 08:00:27 compute-0 nova_compute[186544]:         <nova:project uuid="b82ac9be9aa74af28bd41198152daea3">tempest-ServerShowV247Test-572024602</nova:project>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <system>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <entry name="serial">b6c62c3d-be2c-46d2-9bb1-26390c5f9593</entry>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <entry name="uuid">b6c62c3d-be2c-46d2-9bb1-26390c5f9593</entry>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     </system>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <os>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   </os>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <features>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   </features>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.config"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/console.log" append="off"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <video>
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     </video>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:00:27 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:00:27 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:00:27 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:00:27 compute-0 nova_compute[186544]: </domain>
Nov 22 08:00:27 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.269 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.269 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.269 186548 INFO nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Using config drive
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.282 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.304 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'keypairs' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.468 186548 INFO nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Creating config drive at /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.config
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.473 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1i_idayb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:27 compute-0 nova_compute[186544]: 2025-11-22 08:00:27.597 186548 DEBUG oslo_concurrency.processutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1i_idayb" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:27 compute-0 systemd-machined[152872]: New machine qemu-49-instance-0000005a.
Nov 22 08:00:27 compute-0 systemd[1]: Started Virtual Machine qemu-49-instance-0000005a.
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.269 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for b6c62c3d-be2c-46d2-9bb1-26390c5f9593 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.269 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798428.2688057, b6c62c3d-be2c-46d2-9bb1-26390c5f9593 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.270 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] VM Resumed (Lifecycle Event)
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.273 186548 DEBUG nova.compute.manager [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.273 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.277 186548 INFO nova.virt.libvirt.driver [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance spawned successfully.
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.277 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.303 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.309 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.309 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.310 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.310 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.311 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.311 186548 DEBUG nova.virt.libvirt.driver [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.316 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.365 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.365 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798428.2725372, b6c62c3d-be2c-46d2-9bb1-26390c5f9593 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.365 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] VM Started (Lifecycle Event)
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.383 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.388 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.406 186548 DEBUG nova.compute.manager [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.414 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.484 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.484 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.485 186548 DEBUG nova.objects.instance [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:00:28 compute-0 nova_compute[186544]: 2025-11-22 08:00:28.562 186548 DEBUG oslo_concurrency.lockutils [None req-4fc6a0a0-14f0-4907-9986-63df580425e8 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.574 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.575 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.575 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.575 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.575 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.584 186548 INFO nova.compute.manager [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Terminating instance
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.590 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "refresh_cache-b6c62c3d-be2c-46d2-9bb1-26390c5f9593" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.591 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquired lock "refresh_cache-b6c62c3d-be2c-46d2-9bb1-26390c5f9593" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.591 186548 DEBUG nova.network.neutron [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:00:29 compute-0 nova_compute[186544]: 2025-11-22 08:00:29.732 186548 DEBUG nova.network.neutron [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.010 186548 DEBUG nova.network.neutron [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.021 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Releasing lock "refresh_cache-b6c62c3d-be2c-46d2-9bb1-26390c5f9593" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.022 186548 DEBUG nova.compute.manager [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:00:30 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Nov 22 08:00:30 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005a.scope: Consumed 2.328s CPU time.
Nov 22 08:00:30 compute-0 systemd-machined[152872]: Machine qemu-49-instance-0000005a terminated.
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.258 186548 INFO nova.virt.libvirt.driver [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance destroyed successfully.
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.259 186548 DEBUG nova.objects.instance [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'resources' on Instance uuid b6c62c3d-be2c-46d2-9bb1-26390c5f9593 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.268 186548 INFO nova.virt.libvirt.driver [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Deleting instance files /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593_del
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.268 186548 INFO nova.virt.libvirt.driver [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Deletion of /var/lib/nova/instances/b6c62c3d-be2c-46d2-9bb1-26390c5f9593_del complete
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.353 186548 INFO nova.compute.manager [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.353 186548 DEBUG oslo.service.loopingcall [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.354 186548 DEBUG nova.compute.manager [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.354 186548 DEBUG nova.network.neutron [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.458 186548 DEBUG nova.network.neutron [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.469 186548 DEBUG nova.network.neutron [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.482 186548 INFO nova.compute.manager [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Took 0.13 seconds to deallocate network for instance.
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.541 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.542 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.601 186548 DEBUG nova.compute.provider_tree [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.615 186548 DEBUG nova.scheduler.client.report [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.618 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.637 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.664 186548 INFO nova.scheduler.client.report [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Deleted allocations for instance b6c62c3d-be2c-46d2-9bb1-26390c5f9593
Nov 22 08:00:30 compute-0 nova_compute[186544]: 2025-11-22 08:00:30.743 186548 DEBUG oslo_concurrency.lockutils [None req-121462f8-fd6e-4cab-8342-669fc636854d 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "b6c62c3d-be2c-46d2-9bb1-26390c5f9593" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.109 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.110 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.110 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.110 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.110 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.117 186548 INFO nova.compute.manager [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Terminating instance
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.121 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "refresh_cache-e53bd443-98ed-4e38-bff6-3f43fe77ade8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.122 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquired lock "refresh_cache-e53bd443-98ed-4e38-bff6-3f43fe77ade8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.122 186548 DEBUG nova.network.neutron [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:00:31 compute-0 podman[228920]: 2025-11-22 08:00:31.419322956 +0000 UTC m=+0.061237768 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.435 186548 DEBUG nova.network.neutron [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.748 186548 DEBUG nova.network.neutron [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.762 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Releasing lock "refresh_cache-e53bd443-98ed-4e38-bff6-3f43fe77ade8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.762 186548 DEBUG nova.compute.manager [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:00:31 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000059.scope: Deactivated successfully.
Nov 22 08:00:31 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000059.scope: Consumed 15.364s CPU time.
Nov 22 08:00:31 compute-0 systemd-machined[152872]: Machine qemu-47-instance-00000059 terminated.
Nov 22 08:00:31 compute-0 nova_compute[186544]: 2025-11-22 08:00:31.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.005 186548 INFO nova.virt.libvirt.driver [-] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Instance destroyed successfully.
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.007 186548 DEBUG nova.objects.instance [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lazy-loading 'resources' on Instance uuid e53bd443-98ed-4e38-bff6-3f43fe77ade8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.019 186548 INFO nova.virt.libvirt.driver [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Deleting instance files /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8_del
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.020 186548 INFO nova.virt.libvirt.driver [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Deletion of /var/lib/nova/instances/e53bd443-98ed-4e38-bff6-3f43fe77ade8_del complete
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.085 186548 INFO nova.compute.manager [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Took 0.32 seconds to destroy the instance on the hypervisor.
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.085 186548 DEBUG oslo.service.loopingcall [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.085 186548 DEBUG nova.compute.manager [-] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.085 186548 DEBUG nova.network.neutron [-] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.221 186548 DEBUG nova.network.neutron [-] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.230 186548 DEBUG nova.network.neutron [-] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.247 186548 INFO nova.compute.manager [-] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Took 0.16 seconds to deallocate network for instance.
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.320 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.320 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.367 186548 DEBUG nova.compute.provider_tree [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.383 186548 DEBUG nova.scheduler.client.report [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.401 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.443 186548 INFO nova.scheduler.client.report [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Deleted allocations for instance e53bd443-98ed-4e38-bff6-3f43fe77ade8
Nov 22 08:00:32 compute-0 nova_compute[186544]: 2025-11-22 08:00:32.532 186548 DEBUG oslo_concurrency.lockutils [None req-b84ee836-a7ff-4676-8d8e-300908970fae 1e19acf488494c4184a911a745e39b4b b82ac9be9aa74af28bd41198152daea3 - - default default] Lock "e53bd443-98ed-4e38-bff6-3f43fe77ade8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:33 compute-0 nova_compute[186544]: 2025-11-22 08:00:33.707 186548 DEBUG nova.compute.manager [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Stashing vm_state: stopped _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 22 08:00:33 compute-0 nova_compute[186544]: 2025-11-22 08:00:33.854 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:33 compute-0 nova_compute[186544]: 2025-11-22 08:00:33.854 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:33 compute-0 nova_compute[186544]: 2025-11-22 08:00:33.885 186548 DEBUG nova.objects.instance [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_requests' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:33 compute-0 nova_compute[186544]: 2025-11-22 08:00:33.899 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:00:33 compute-0 nova_compute[186544]: 2025-11-22 08:00:33.900 186548 INFO nova.compute.claims [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:00:33 compute-0 nova_compute[186544]: 2025-11-22 08:00:33.900 186548 DEBUG nova.objects.instance [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'resources' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:33 compute-0 nova_compute[186544]: 2025-11-22 08:00:33.911 186548 DEBUG nova.objects.instance [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_devices' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:34 compute-0 nova_compute[186544]: 2025-11-22 08:00:34.011 186548 INFO nova.compute.resource_tracker [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating resource usage from migration 8a0018e5-ed9f-45b7-a5a4-16ecd560c356
Nov 22 08:00:34 compute-0 nova_compute[186544]: 2025-11-22 08:00:34.012 186548 DEBUG nova.compute.resource_tracker [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Starting to track incoming migration 8a0018e5-ed9f-45b7-a5a4-16ecd560c356 with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 22 08:00:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:00:34.064 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:00:34 compute-0 nova_compute[186544]: 2025-11-22 08:00:34.065 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:00:34.065 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:00:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:00:34.066 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:00:34 compute-0 nova_compute[186544]: 2025-11-22 08:00:34.078 186548 DEBUG nova.compute.provider_tree [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:00:34 compute-0 nova_compute[186544]: 2025-11-22 08:00:34.095 186548 DEBUG nova.scheduler.client.report [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:00:34 compute-0 nova_compute[186544]: 2025-11-22 08:00:34.126 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:34 compute-0 nova_compute[186544]: 2025-11-22 08:00:34.126 186548 INFO nova.compute.manager [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Migrating
Nov 22 08:00:35 compute-0 nova_compute[186544]: 2025-11-22 08:00:35.620 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:36 compute-0 podman[228950]: 2025-11-22 08:00:36.399572331 +0000 UTC m=+0.051815384 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.594 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:00:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:00:36 compute-0 nova_compute[186544]: 2025-11-22 08:00:36.973 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:37 compute-0 sshd-session[228974]: Accepted publickey for nova from 192.168.122.102 port 43122 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:00:37 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 08:00:37 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 08:00:37 compute-0 systemd-logind[821]: New session 45 of user nova.
Nov 22 08:00:37 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 08:00:37 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 08:00:37 compute-0 systemd[228978]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:00:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:00:37.328 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:00:37.329 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:00:37.330 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:37 compute-0 systemd[228978]: Queued start job for default target Main User Target.
Nov 22 08:00:37 compute-0 systemd[228978]: Created slice User Application Slice.
Nov 22 08:00:37 compute-0 systemd[228978]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 08:00:37 compute-0 systemd[228978]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 08:00:37 compute-0 systemd[228978]: Reached target Paths.
Nov 22 08:00:37 compute-0 systemd[228978]: Reached target Timers.
Nov 22 08:00:37 compute-0 systemd[228978]: Starting D-Bus User Message Bus Socket...
Nov 22 08:00:37 compute-0 systemd[228978]: Starting Create User's Volatile Files and Directories...
Nov 22 08:00:37 compute-0 systemd[228978]: Finished Create User's Volatile Files and Directories.
Nov 22 08:00:37 compute-0 systemd[228978]: Listening on D-Bus User Message Bus Socket.
Nov 22 08:00:37 compute-0 systemd[228978]: Reached target Sockets.
Nov 22 08:00:37 compute-0 systemd[228978]: Reached target Basic System.
Nov 22 08:00:37 compute-0 systemd[228978]: Reached target Main User Target.
Nov 22 08:00:37 compute-0 systemd[228978]: Startup finished in 133ms.
Nov 22 08:00:37 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 08:00:37 compute-0 systemd[1]: Started Session 45 of User nova.
Nov 22 08:00:37 compute-0 sshd-session[228974]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:00:37 compute-0 sshd-session[228994]: Received disconnect from 192.168.122.102 port 43122:11: disconnected by user
Nov 22 08:00:37 compute-0 sshd-session[228994]: Disconnected from user nova 192.168.122.102 port 43122
Nov 22 08:00:37 compute-0 sshd-session[228974]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:00:37 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Nov 22 08:00:37 compute-0 systemd-logind[821]: Session 45 logged out. Waiting for processes to exit.
Nov 22 08:00:37 compute-0 systemd-logind[821]: Removed session 45.
Nov 22 08:00:37 compute-0 podman[228993]: 2025-11-22 08:00:37.508463489 +0000 UTC m=+0.068483897 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 08:00:37 compute-0 sshd-session[229016]: Accepted publickey for nova from 192.168.122.102 port 43126 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:00:37 compute-0 systemd-logind[821]: New session 47 of user nova.
Nov 22 08:00:37 compute-0 systemd[1]: Started Session 47 of User nova.
Nov 22 08:00:37 compute-0 sshd-session[229016]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:00:37 compute-0 sshd-session[229019]: Received disconnect from 192.168.122.102 port 43126:11: disconnected by user
Nov 22 08:00:37 compute-0 sshd-session[229019]: Disconnected from user nova 192.168.122.102 port 43126
Nov 22 08:00:37 compute-0 sshd-session[229016]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:00:37 compute-0 systemd[1]: session-47.scope: Deactivated successfully.
Nov 22 08:00:37 compute-0 systemd-logind[821]: Session 47 logged out. Waiting for processes to exit.
Nov 22 08:00:37 compute-0 systemd-logind[821]: Removed session 47.
Nov 22 08:00:37 compute-0 nova_compute[186544]: 2025-11-22 08:00:37.788 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "4035e88f-0c79-4db5-a63d-7ae01c056339" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:37 compute-0 nova_compute[186544]: 2025-11-22 08:00:37.789 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "4035e88f-0c79-4db5-a63d-7ae01c056339" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:37 compute-0 nova_compute[186544]: 2025-11-22 08:00:37.865 186548 DEBUG nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:00:37 compute-0 nova_compute[186544]: 2025-11-22 08:00:37.963 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:37 compute-0 nova_compute[186544]: 2025-11-22 08:00:37.963 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:37 compute-0 nova_compute[186544]: 2025-11-22 08:00:37.972 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:00:37 compute-0 nova_compute[186544]: 2025-11-22 08:00:37.972 186548 INFO nova.compute.claims [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:00:38 compute-0 sshd-session[229021]: Accepted publickey for nova from 192.168.122.102 port 43138 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:00:38 compute-0 systemd-logind[821]: New session 48 of user nova.
Nov 22 08:00:38 compute-0 systemd[1]: Started Session 48 of User nova.
Nov 22 08:00:38 compute-0 sshd-session[229021]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.090 186548 DEBUG nova.compute.provider_tree [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.103 186548 DEBUG nova.scheduler.client.report [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.135 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.136 186548 DEBUG nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.184 186548 DEBUG nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.198 186548 INFO nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.215 186548 DEBUG nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.336 186548 DEBUG nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.337 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.337 186548 INFO nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Creating image(s)
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.338 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.338 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.338 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.350 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.405 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.407 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.407 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.418 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.474 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.475 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:38 compute-0 sshd-session[229024]: Received disconnect from 192.168.122.102 port 43138:11: disconnected by user
Nov 22 08:00:38 compute-0 sshd-session[229024]: Disconnected from user nova 192.168.122.102 port 43138
Nov 22 08:00:38 compute-0 sshd-session[229021]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:00:38 compute-0 systemd[1]: session-48.scope: Deactivated successfully.
Nov 22 08:00:38 compute-0 systemd-logind[821]: Session 48 logged out. Waiting for processes to exit.
Nov 22 08:00:38 compute-0 systemd-logind[821]: Removed session 48.
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.575 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk 1073741824" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.576 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.576 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.631 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.633 186548 DEBUG nova.virt.disk.api [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Checking if we can resize image /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.634 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:38 compute-0 sshd-session[229036]: Accepted publickey for nova from 192.168.122.102 port 43152 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:00:38 compute-0 systemd-logind[821]: New session 49 of user nova.
Nov 22 08:00:38 compute-0 systemd[1]: Started Session 49 of User nova.
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.706 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.707 186548 DEBUG nova.virt.disk.api [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Cannot resize image /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.707 186548 DEBUG nova.objects.instance [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'migration_context' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:38 compute-0 sshd-session[229036]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.718 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.719 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Ensure instance console log exists: /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.719 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.720 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.720 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.721 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.727 186548 WARNING nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.735 186548 DEBUG nova.virt.libvirt.host [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.736 186548 DEBUG nova.virt.libvirt.host [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.739 186548 DEBUG nova.virt.libvirt.host [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.740 186548 DEBUG nova.virt.libvirt.host [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.742 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.742 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.742 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.742 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.743 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.743 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.743 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.743 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.743 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.744 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.744 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.744 186548 DEBUG nova.virt.hardware [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.748 186548 DEBUG nova.objects.instance [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.758 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <uuid>4035e88f-0c79-4db5-a63d-7ae01c056339</uuid>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <name>instance-0000005e</name>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerShowV257Test-server-711605332</nova:name>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:00:38</nova:creationTime>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:00:38 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:00:38 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:00:38 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:00:38 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:00:38 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:00:38 compute-0 nova_compute[186544]:         <nova:user uuid="5cf0748f4d23460a90fe6f94e42ce0d3">tempest-ServerShowV257Test-549319000-project-member</nova:user>
Nov 22 08:00:38 compute-0 nova_compute[186544]:         <nova:project uuid="eca3d96a49be4f65a1d4fddc300c0346">tempest-ServerShowV257Test-549319000</nova:project>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <system>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <entry name="serial">4035e88f-0c79-4db5-a63d-7ae01c056339</entry>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <entry name="uuid">4035e88f-0c79-4db5-a63d-7ae01c056339</entry>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     </system>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <os>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   </os>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <features>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   </features>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.config"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/console.log" append="off"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <video>
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     </video>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:00:38 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:00:38 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:00:38 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:00:38 compute-0 nova_compute[186544]: </domain>
Nov 22 08:00:38 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:00:38 compute-0 sshd-session[229045]: Received disconnect from 192.168.122.102 port 43152:11: disconnected by user
Nov 22 08:00:38 compute-0 sshd-session[229045]: Disconnected from user nova 192.168.122.102 port 43152
Nov 22 08:00:38 compute-0 sshd-session[229036]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:00:38 compute-0 systemd[1]: session-49.scope: Deactivated successfully.
Nov 22 08:00:38 compute-0 systemd-logind[821]: Session 49 logged out. Waiting for processes to exit.
Nov 22 08:00:38 compute-0 systemd-logind[821]: Removed session 49.
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.828 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.828 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:00:38 compute-0 nova_compute[186544]: 2025-11-22 08:00:38.829 186548 INFO nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Using config drive
Nov 22 08:00:38 compute-0 sshd-session[229048]: Accepted publickey for nova from 192.168.122.102 port 43154 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:00:38 compute-0 systemd-logind[821]: New session 50 of user nova.
Nov 22 08:00:38 compute-0 systemd[1]: Started Session 50 of User nova.
Nov 22 08:00:38 compute-0 sshd-session[229048]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:00:39 compute-0 sshd-session[229051]: Received disconnect from 192.168.122.102 port 43154:11: disconnected by user
Nov 22 08:00:39 compute-0 sshd-session[229051]: Disconnected from user nova 192.168.122.102 port 43154
Nov 22 08:00:39 compute-0 sshd-session[229048]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:00:39 compute-0 systemd[1]: session-50.scope: Deactivated successfully.
Nov 22 08:00:39 compute-0 systemd-logind[821]: Session 50 logged out. Waiting for processes to exit.
Nov 22 08:00:39 compute-0 systemd-logind[821]: Removed session 50.
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.426 186548 INFO nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Creating config drive at /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.config
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.431 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8uu6fx85 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.558 186548 DEBUG oslo_concurrency.processutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8uu6fx85" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:39 compute-0 systemd-machined[152872]: New machine qemu-50-instance-0000005e.
Nov 22 08:00:39 compute-0 systemd[1]: Started Virtual Machine qemu-50-instance-0000005e.
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.986 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798439.9856312, 4035e88f-0c79-4db5-a63d-7ae01c056339 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.988 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] VM Resumed (Lifecycle Event)
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.990 186548 DEBUG nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.990 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.994 186548 INFO nova.virt.libvirt.driver [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance spawned successfully.
Nov 22 08:00:39 compute-0 nova_compute[186544]: 2025-11-22 08:00:39.995 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.011 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.016 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.019 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.019 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.020 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.020 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.021 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.021 186548 DEBUG nova.virt.libvirt.driver [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.042 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.042 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798439.9862304, 4035e88f-0c79-4db5-a63d-7ae01c056339 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.043 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] VM Started (Lifecycle Event)
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.064 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.068 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.084 186548 INFO nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Took 1.75 seconds to spawn the instance on the hypervisor.
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.084 186548 DEBUG nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.092 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.158 186548 INFO nova.compute.manager [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Took 2.22 seconds to build instance.
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.175 186548 DEBUG oslo_concurrency.lockutils [None req-5ed3de36-d385-4bac-95f5-e04b234853a3 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "4035e88f-0c79-4db5-a63d-7ae01c056339" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:40 compute-0 nova_compute[186544]: 2025-11-22 08:00:40.621 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:41 compute-0 nova_compute[186544]: 2025-11-22 08:00:41.574 186548 INFO nova.network.neutron [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 22 08:00:41 compute-0 ovn_controller[94843]: 2025-11-22T08:00:41Z|00388|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 08:00:41 compute-0 nova_compute[186544]: 2025-11-22 08:00:41.975 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:42 compute-0 nova_compute[186544]: 2025-11-22 08:00:42.498 186548 INFO nova.compute.manager [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Rebuilding instance
Nov 22 08:00:42 compute-0 nova_compute[186544]: 2025-11-22 08:00:42.867 186548 DEBUG nova.compute.manager [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:42 compute-0 nova_compute[186544]: 2025-11-22 08:00:42.950 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:42 compute-0 nova_compute[186544]: 2025-11-22 08:00:42.963 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:42 compute-0 nova_compute[186544]: 2025-11-22 08:00:42.984 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'resources' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.004 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'migration_context' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.020 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.023 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.715 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.715 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.716 186548 DEBUG nova.network.neutron [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.832 186548 DEBUG nova.compute.manager [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-changed-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.833 186548 DEBUG nova.compute.manager [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Refreshing instance network info cache due to event network-changed-e963f21d-d8c0-4f76-b5bc-4a3f577d4055. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:00:43 compute-0 nova_compute[186544]: 2025-11-22 08:00:43.833 186548 DEBUG oslo_concurrency.lockutils [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:00:44 compute-0 nova_compute[186544]: 2025-11-22 08:00:44.176 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:45 compute-0 nova_compute[186544]: 2025-11-22 08:00:45.257 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798430.2568915, b6c62c3d-be2c-46d2-9bb1-26390c5f9593 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:00:45 compute-0 nova_compute[186544]: 2025-11-22 08:00:45.258 186548 INFO nova.compute.manager [-] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] VM Stopped (Lifecycle Event)
Nov 22 08:00:45 compute-0 nova_compute[186544]: 2025-11-22 08:00:45.291 186548 DEBUG nova.compute.manager [None req-8f0d9932-3795-4562-a807-ec673f22ff5d - - - - - -] [instance: b6c62c3d-be2c-46d2-9bb1-26390c5f9593] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:45 compute-0 nova_compute[186544]: 2025-11-22 08:00:45.623 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.188 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.189 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.271 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.331 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.332 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.393 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.470 186548 DEBUG nova.network.neutron [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.497 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.501 186548 DEBUG oslo_concurrency.lockutils [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.502 186548 DEBUG nova.network.neutron [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Refreshing network info cache for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.551 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.552 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5551MB free_disk=73.25155639648438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.552 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.553 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.623 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Applying migration context for instance 92409a46-2dd7-4b20-ac9d-958bbb30993d as it has an incoming, in-progress migration 8a0018e5-ed9f-45b7-a5a4-16ecd560c356. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.625 186548 INFO nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating resource usage from migration 8a0018e5-ed9f-45b7-a5a4-16ecd560c356
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.664 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 92409a46-2dd7-4b20-ac9d-958bbb30993d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.665 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 4035e88f-0c79-4db5-a63d-7ae01c056339 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.665 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.666 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.672 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.677 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.677 186548 INFO nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Creating image(s)
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.679 186548 DEBUG nova.objects.instance [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.694 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.703 186548 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.723 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.725 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.761 186548 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.762 186548 DEBUG nova.virt.disk.api [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Checking if we can resize image /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.762 186548 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.781 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.809 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.823 186548 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.825 186548 DEBUG nova.virt.disk.api [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Cannot resize image /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.850 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.851 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Ensure instance console log exists: /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.851 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.852 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.852 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.855 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Start _get_guest_xml network_info=[{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-678450698-network", "vif_mac": "fa:16:3e:b9:5f:a6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.860 186548 WARNING nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.866 186548 DEBUG nova.virt.libvirt.host [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.866 186548 DEBUG nova.virt.libvirt.host [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.875 186548 DEBUG nova.virt.libvirt.host [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.875 186548 DEBUG nova.virt.libvirt.host [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.877 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.877 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.878 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.878 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.878 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.879 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.879 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.879 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.880 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.880 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.880 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.881 186548 DEBUG nova.virt.hardware [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.881 186548 DEBUG nova.objects.instance [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.904 186548 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.930 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.950 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.967 186548 DEBUG oslo_concurrency.processutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.968 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.969 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.970 186548 DEBUG oslo_concurrency.lockutils [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.971 186548 DEBUG nova.virt.libvirt.vif [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:00:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-678450698-network", "vif_mac": "fa:16:3e:b9:5f:a6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.972 186548 DEBUG nova.network.os_vif_util [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-678450698-network", "vif_mac": "fa:16:3e:b9:5f:a6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.973 186548 DEBUG nova.network.os_vif_util [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.975 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <uuid>92409a46-2dd7-4b20-ac9d-958bbb30993d</uuid>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <name>instance-00000053</name>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <memory>196608</memory>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestOtherB-server-342710330</nova:name>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:00:46</nova:creationTime>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <nova:flavor name="m1.micro">
Nov 22 08:00:46 compute-0 nova_compute[186544]:         <nova:memory>192</nova:memory>
Nov 22 08:00:46 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:00:46 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:00:46 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:00:46 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:00:46 compute-0 nova_compute[186544]:         <nova:user uuid="d0c5153b41c5499bac372d2df10b9b03">tempest-ServerActionsTestOtherB-270195081-project-member</nova:user>
Nov 22 08:00:46 compute-0 nova_compute[186544]:         <nova:project uuid="62d9a4a13f5d41529bc273c278fae96b">tempest-ServerActionsTestOtherB-270195081</nova:project>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:00:46 compute-0 nova_compute[186544]:         <nova:port uuid="e963f21d-d8c0-4f76-b5bc-4a3f577d4055">
Nov 22 08:00:46 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <system>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <entry name="serial">92409a46-2dd7-4b20-ac9d-958bbb30993d</entry>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <entry name="uuid">92409a46-2dd7-4b20-ac9d-958bbb30993d</entry>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </system>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <os>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   </os>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <features>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   </features>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:b9:5f:a6"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <target dev="tape963f21d-d8"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/console.log" append="off"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <video>
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </video>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:00:46 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:00:46 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:00:46 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:00:46 compute-0 nova_compute[186544]: </domain>
Nov 22 08:00:46 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.983 186548 DEBUG nova.virt.libvirt.vif [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T07:58:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:00:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-678450698-network", "vif_mac": "fa:16:3e:b9:5f:a6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.984 186548 DEBUG nova.network.os_vif_util [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-678450698-network", "vif_mac": "fa:16:3e:b9:5f:a6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.985 186548 DEBUG nova.network.os_vif_util [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.985 186548 DEBUG os_vif [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.986 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.987 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.988 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.990 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.990 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.992 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape963f21d-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.993 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape963f21d-d8, col_values=(('external_ids', {'iface-id': 'e963f21d-d8c0-4f76-b5bc-4a3f577d4055', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:5f:a6', 'vm-uuid': '92409a46-2dd7-4b20-ac9d-958bbb30993d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:46 compute-0 NetworkManager[55036]: <info>  [1763798446.9958] manager: (tape963f21d-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Nov 22 08:00:46 compute-0 nova_compute[186544]: 2025-11-22 08:00:46.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.003 186548 INFO os_vif [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8')
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.004 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798432.0041084, e53bd443-98ed-4e38-bff6-3f43fe77ade8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.004 186548 INFO nova.compute.manager [-] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] VM Stopped (Lifecycle Event)
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.028 186548 DEBUG nova.compute.manager [None req-a60a381e-6790-49e9-9dfc-a1eadf580882 - - - - - -] [instance: e53bd443-98ed-4e38-bff6-3f43fe77ade8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.326 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.327 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.327 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No VIF found with MAC fa:16:3e:b9:5f:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.328 186548 INFO nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Using config drive
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.328 186548 DEBUG nova.compute.manager [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:00:47 compute-0 nova_compute[186544]: 2025-11-22 08:00:47.329 186548 DEBUG nova.virt.libvirt.driver [None req-ad66077b-2ba8-4364-90c6-26e0e8f7da63 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 22 08:00:49 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 08:00:49 compute-0 systemd[228978]: Activating special unit Exit the Session...
Nov 22 08:00:49 compute-0 systemd[228978]: Stopped target Main User Target.
Nov 22 08:00:49 compute-0 systemd[228978]: Stopped target Basic System.
Nov 22 08:00:49 compute-0 systemd[228978]: Stopped target Paths.
Nov 22 08:00:49 compute-0 systemd[228978]: Stopped target Sockets.
Nov 22 08:00:49 compute-0 systemd[228978]: Stopped target Timers.
Nov 22 08:00:49 compute-0 systemd[228978]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 08:00:49 compute-0 systemd[228978]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 08:00:49 compute-0 systemd[228978]: Closed D-Bus User Message Bus Socket.
Nov 22 08:00:49 compute-0 systemd[228978]: Stopped Create User's Volatile Files and Directories.
Nov 22 08:00:49 compute-0 systemd[228978]: Removed slice User Application Slice.
Nov 22 08:00:49 compute-0 systemd[228978]: Reached target Shutdown.
Nov 22 08:00:49 compute-0 systemd[228978]: Finished Exit the Session.
Nov 22 08:00:49 compute-0 systemd[228978]: Reached target Exit the Session.
Nov 22 08:00:49 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 08:00:49 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 08:00:49 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 08:00:49 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 08:00:49 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 08:00:49 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 08:00:49 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 08:00:49 compute-0 nova_compute[186544]: 2025-11-22 08:00:49.535 186548 DEBUG nova.network.neutron [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updated VIF entry in instance network info cache for port e963f21d-d8c0-4f76-b5bc-4a3f577d4055. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:00:49 compute-0 nova_compute[186544]: 2025-11-22 08:00:49.537 186548 DEBUG nova.network.neutron [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:00:49 compute-0 nova_compute[186544]: 2025-11-22 08:00:49.551 186548 DEBUG oslo_concurrency.lockutils [req-8b5e9a6b-d283-4899-b8fd-9ff7c450bb3e req-41fde608-86fc-423d-8cc3-c71dc84c8afa 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:00:49 compute-0 nova_compute[186544]: 2025-11-22 08:00:49.991 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:49 compute-0 nova_compute[186544]: 2025-11-22 08:00:49.991 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:00:49 compute-0 nova_compute[186544]: 2025-11-22 08:00:49.991 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:00:50 compute-0 podman[229099]: 2025-11-22 08:00:50.413561367 +0000 UTC m=+0.061850993 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:00:50 compute-0 podman[229100]: 2025-11-22 08:00:50.443025366 +0000 UTC m=+0.091543858 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:00:50 compute-0 nova_compute[186544]: 2025-11-22 08:00:50.444 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:00:50 compute-0 nova_compute[186544]: 2025-11-22 08:00:50.444 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:00:50 compute-0 nova_compute[186544]: 2025-11-22 08:00:50.445 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:00:50 compute-0 nova_compute[186544]: 2025-11-22 08:00:50.445 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:50 compute-0 nova_compute[186544]: 2025-11-22 08:00:50.624 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:51 compute-0 nova_compute[186544]: 2025-11-22 08:00:51.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:52 compute-0 nova_compute[186544]: 2025-11-22 08:00:52.796 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:00:52 compute-0 nova_compute[186544]: 2025-11-22 08:00:52.817 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:00:52 compute-0 nova_compute[186544]: 2025-11-22 08:00:52.818 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:00:52 compute-0 nova_compute[186544]: 2025-11-22 08:00:52.818 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:52 compute-0 nova_compute[186544]: 2025-11-22 08:00:52.819 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:52 compute-0 nova_compute[186544]: 2025-11-22 08:00:52.819 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:52 compute-0 nova_compute[186544]: 2025-11-22 08:00:52.819 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:00:53 compute-0 nova_compute[186544]: 2025-11-22 08:00:53.064 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 08:00:53 compute-0 nova_compute[186544]: 2025-11-22 08:00:53.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:54 compute-0 nova_compute[186544]: 2025-11-22 08:00:54.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:54 compute-0 podman[229163]: 2025-11-22 08:00:54.408197947 +0000 UTC m=+0.060333865 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:00:55 compute-0 nova_compute[186544]: 2025-11-22 08:00:55.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:00:55 compute-0 nova_compute[186544]: 2025-11-22 08:00:55.626 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:56 compute-0 podman[229187]: 2025-11-22 08:00:56.405372877 +0000 UTC m=+0.053795263 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:00:56 compute-0 nova_compute[186544]: 2025-11-22 08:00:56.998 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:00:57 compute-0 nova_compute[186544]: 2025-11-22 08:00:57.155 186548 DEBUG nova.objects.instance [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'flavor' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:57 compute-0 nova_compute[186544]: 2025-11-22 08:00:57.204 186548 DEBUG nova.objects.instance [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'info_cache' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:00:57 compute-0 nova_compute[186544]: 2025-11-22 08:00:57.231 186548 DEBUG oslo_concurrency.lockutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:00:57 compute-0 nova_compute[186544]: 2025-11-22 08:00:57.232 186548 DEBUG oslo_concurrency.lockutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:00:57 compute-0 nova_compute[186544]: 2025-11-22 08:00:57.232 186548 DEBUG nova.network.neutron [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:01:00 compute-0 nova_compute[186544]: 2025-11-22 08:01:00.628 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:00 compute-0 nova_compute[186544]: 2025-11-22 08:01:00.933 186548 DEBUG nova.network.neutron [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:01:00 compute-0 nova_compute[186544]: 2025-11-22 08:01:00.969 186548 DEBUG oslo_concurrency.lockutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-92409a46-2dd7-4b20-ac9d-958bbb30993d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:01:00 compute-0 nova_compute[186544]: 2025-11-22 08:01:00.994 186548 INFO nova.virt.libvirt.driver [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance destroyed successfully.
Nov 22 08:01:00 compute-0 nova_compute[186544]: 2025-11-22 08:01:00.994 186548 DEBUG nova.objects.instance [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'numa_topology' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.008 186548 DEBUG nova.objects.instance [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'resources' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.048 186548 DEBUG nova.virt.libvirt.vif [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.049 186548 DEBUG nova.network.os_vif_util [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.050 186548 DEBUG nova.network.os_vif_util [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.050 186548 DEBUG os_vif [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.052 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.052 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape963f21d-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.053 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.055 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.057 186548 INFO os_vif [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8')
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.063 186548 DEBUG nova.virt.libvirt.driver [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Start _get_guest_xml network_info=[{"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.066 186548 WARNING nova.virt.libvirt.driver [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.079 186548 DEBUG nova.virt.libvirt.host [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.079 186548 DEBUG nova.virt.libvirt.host [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.083 186548 DEBUG nova.virt.libvirt.host [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.083 186548 DEBUG nova.virt.libvirt.host [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.084 186548 DEBUG nova.virt.libvirt.driver [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.084 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.085 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.085 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.085 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.086 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.086 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.086 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.087 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.087 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.087 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.087 186548 DEBUG nova.virt.hardware [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.088 186548 DEBUG nova.objects.instance [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.114 186548 DEBUG nova.virt.libvirt.vif [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:00:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.115 186548 DEBUG nova.network.os_vif_util [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.116 186548 DEBUG nova.network.os_vif_util [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.117 186548 DEBUG nova.objects.instance [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_devices' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.145 186548 DEBUG nova.virt.libvirt.driver [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <uuid>92409a46-2dd7-4b20-ac9d-958bbb30993d</uuid>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <name>instance-00000053</name>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <memory>196608</memory>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestOtherB-server-342710330</nova:name>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:01:01</nova:creationTime>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <nova:flavor name="m1.micro">
Nov 22 08:01:01 compute-0 nova_compute[186544]:         <nova:memory>192</nova:memory>
Nov 22 08:01:01 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:01:01 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:01:01 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:01:01 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:01:01 compute-0 nova_compute[186544]:         <nova:user uuid="d0c5153b41c5499bac372d2df10b9b03">tempest-ServerActionsTestOtherB-270195081-project-member</nova:user>
Nov 22 08:01:01 compute-0 nova_compute[186544]:         <nova:project uuid="62d9a4a13f5d41529bc273c278fae96b">tempest-ServerActionsTestOtherB-270195081</nova:project>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:01:01 compute-0 nova_compute[186544]:         <nova:port uuid="e963f21d-d8c0-4f76-b5bc-4a3f577d4055">
Nov 22 08:01:01 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <system>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <entry name="serial">92409a46-2dd7-4b20-ac9d-958bbb30993d</entry>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <entry name="uuid">92409a46-2dd7-4b20-ac9d-958bbb30993d</entry>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </system>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <os>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   </os>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <features>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   </features>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk.config"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:b9:5f:a6"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <target dev="tape963f21d-d8"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/console.log" append="off"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <video>
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </video>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:01:01 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:01:01 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:01:01 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:01:01 compute-0 nova_compute[186544]: </domain>
Nov 22 08:01:01 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.147 186548 DEBUG oslo_concurrency.processutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.202 186548 DEBUG oslo_concurrency.processutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.203 186548 DEBUG oslo_concurrency.processutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.266 186548 DEBUG oslo_concurrency.processutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.267 186548 DEBUG nova.objects.instance [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.281 186548 DEBUG oslo_concurrency.processutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.345 186548 DEBUG oslo_concurrency.processutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.346 186548 DEBUG nova.virt.disk.api [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Checking if we can resize image /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.346 186548 DEBUG oslo_concurrency.processutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.403 186548 DEBUG oslo_concurrency.processutils [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.404 186548 DEBUG nova.virt.disk.api [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Cannot resize image /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.405 186548 DEBUG nova.objects.instance [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'migration_context' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.417 186548 DEBUG nova.virt.libvirt.vif [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:00:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.418 186548 DEBUG nova.network.os_vif_util [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.418 186548 DEBUG nova.network.os_vif_util [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.419 186548 DEBUG os_vif [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.420 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.420 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.421 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.423 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.423 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape963f21d-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.424 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape963f21d-d8, col_values=(('external_ids', {'iface-id': 'e963f21d-d8c0-4f76-b5bc-4a3f577d4055', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:5f:a6', 'vm-uuid': '92409a46-2dd7-4b20-ac9d-958bbb30993d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.425 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 NetworkManager[55036]: <info>  [1763798461.4261] manager: (tape963f21d-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.429 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.431 186548 INFO os_vif [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8')
Nov 22 08:01:01 compute-0 CROND[229221]: (root) CMD (run-parts /etc/cron.hourly)
Nov 22 08:01:01 compute-0 run-parts[229224]: (/etc/cron.hourly) starting 0anacron
Nov 22 08:01:01 compute-0 run-parts[229230]: (/etc/cron.hourly) finished 0anacron
Nov 22 08:01:01 compute-0 CROND[229220]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 22 08:01:01 compute-0 kernel: tape963f21d-d8: entered promiscuous mode
Nov 22 08:01:01 compute-0 NetworkManager[55036]: <info>  [1763798461.7648] manager: (tape963f21d-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.764 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 ovn_controller[94843]: 2025-11-22T08:01:01Z|00389|binding|INFO|Claiming lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for this chassis.
Nov 22 08:01:01 compute-0 ovn_controller[94843]: 2025-11-22T08:01:01Z|00390|binding|INFO|e963f21d-d8c0-4f76-b5bc-4a3f577d4055: Claiming fa:16:3e:b9:5f:a6 10.100.0.4
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.776 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.780 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 NetworkManager[55036]: <info>  [1763798461.7922] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Nov 22 08:01:01 compute-0 NetworkManager[55036]: <info>  [1763798461.7929] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.790 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:5f:a6 10.100.0.4'], port_security=['fa:16:3e:b9:5f:a6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '92409a46-2dd7-4b20-ac9d-958bbb30993d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '320b38f4-6497-45cc-9e33-00f741d5a1b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e963f21d-d8c0-4f76-b5bc-4a3f577d4055) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.792 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 bound to our chassis
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.793 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.785 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 systemd-machined[152872]: New machine qemu-51-instance-00000053.
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.815 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[afe649cf-bd0f-4378-8e17-8fb179e51773]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.816 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7727db5-41 in ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.818 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7727db5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.818 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d48b53-1758-400f-8628-f3214db87f5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.819 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1a2de0e1-1f26-4f1e-a653-f504e0204b41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.831 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[89e6254f-a008-422d-ba21-6b93e43a5fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 systemd[1]: Started Virtual Machine qemu-51-instance-00000053.
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.863 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0f862b-acd1-4990-91f8-dee2b985e72c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 systemd-udevd[229266]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:01:01 compute-0 NetworkManager[55036]: <info>  [1763798461.8870] device (tape963f21d-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:01:01 compute-0 podman[229238]: 2025-11-22 08:01:01.887371675 +0000 UTC m=+0.129812816 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:01:01 compute-0 NetworkManager[55036]: <info>  [1763798461.8885] device (tape963f21d-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.906 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd37147-7a2d-46e1-9375-1317476219d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.924 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0e1abc49-5bf0-4231-b04d-fd4416f3700e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 NetworkManager[55036]: <info>  [1763798461.9248] manager: (tapf7727db5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/186)
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.927 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.946 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 ovn_controller[94843]: 2025-11-22T08:01:01Z|00391|binding|INFO|Setting lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 ovn-installed in OVS
Nov 22 08:01:01 compute-0 ovn_controller[94843]: 2025-11-22T08:01:01Z|00392|binding|INFO|Setting lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 up in Southbound
Nov 22 08:01:01 compute-0 nova_compute[186544]: 2025-11-22 08:01:01.956 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.968 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b12abc6e-6312-49df-9492-d4ea51517424]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:01.971 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[93b004b1-783b-4d7c-aed6-1498cd4cbafc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:01 compute-0 NetworkManager[55036]: <info>  [1763798461.9974] device (tapf7727db5-40): carrier: link connected
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.003 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5651c6b0-d552-4770-85e9-78dea25394d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.019 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6349c961-46e4-4b48-8f0d-effa4239aab3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520263, 'reachable_time': 42969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229296, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.033 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[04afc528-2ca6-468b-a9bd-2576f0e2b6d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:3e1a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520263, 'tstamp': 520263}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229297, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.049 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[88b9eec5-7d53-4d58-9f56-08e4e8acc6e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 122], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520263, 'reachable_time': 42969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229298, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.077 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d89767-7fc9-4d18-908b-48f66b5dd884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.129 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bedff78b-579a-4c36-aae7-7e17eb5eeb25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.133 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.133 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.134 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:02 compute-0 NetworkManager[55036]: <info>  [1763798462.1364] manager: (tapf7727db5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Nov 22 08:01:02 compute-0 kernel: tapf7727db5-40: entered promiscuous mode
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.137 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.138 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.139 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:02 compute-0 ovn_controller[94843]: 2025-11-22T08:01:02Z|00393|binding|INFO|Releasing lport 188249cb-6e2b-4c68-9c53-aaa0a3da466f from this chassis (sb_readonly=0)
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.141 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.142 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8902b313-a7f0-4533-aaf8-591b74423f3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.142 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-f7727db5-43a6-48f6-abbf-aa184d8ad087
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID f7727db5-43a6-48f6-abbf-aa184d8ad087
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:01:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:02.143 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'env', 'PROCESS_TAG=haproxy-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7727db5-43a6-48f6-abbf-aa184d8ad087.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.151 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.358 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798462.35824, 92409a46-2dd7-4b20-ac9d-958bbb30993d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.359 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] VM Resumed (Lifecycle Event)
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.361 186548 DEBUG nova.compute.manager [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.365 186548 INFO nova.virt.libvirt.driver [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance rebooted successfully.
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.366 186548 DEBUG nova.compute.manager [None req-0c6af42b-e41d-4741-bba1-5ac39f0bace8 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.377 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.381 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.404 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] During sync_power_state the instance has a pending task (powering-on). Skip.
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.405 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798462.3594172, 92409a46-2dd7-4b20-ac9d-958bbb30993d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.405 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] VM Started (Lifecycle Event)
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.423 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:02 compute-0 nova_compute[186544]: 2025-11-22 08:01:02.428 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:01:02 compute-0 podman[229336]: 2025-11-22 08:01:02.494303263 +0000 UTC m=+0.025054432 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:01:03 compute-0 nova_compute[186544]: 2025-11-22 08:01:03.106 186548 DEBUG nova.compute.manager [req-551622d5-48be-4a45-87e8-2a17f9459c89 req-eca06d13-7bc3-449a-929c-55cde050a10e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:03 compute-0 nova_compute[186544]: 2025-11-22 08:01:03.106 186548 DEBUG oslo_concurrency.lockutils [req-551622d5-48be-4a45-87e8-2a17f9459c89 req-eca06d13-7bc3-449a-929c-55cde050a10e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:03 compute-0 nova_compute[186544]: 2025-11-22 08:01:03.107 186548 DEBUG oslo_concurrency.lockutils [req-551622d5-48be-4a45-87e8-2a17f9459c89 req-eca06d13-7bc3-449a-929c-55cde050a10e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:03 compute-0 nova_compute[186544]: 2025-11-22 08:01:03.107 186548 DEBUG oslo_concurrency.lockutils [req-551622d5-48be-4a45-87e8-2a17f9459c89 req-eca06d13-7bc3-449a-929c-55cde050a10e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:03 compute-0 nova_compute[186544]: 2025-11-22 08:01:03.107 186548 DEBUG nova.compute.manager [req-551622d5-48be-4a45-87e8-2a17f9459c89 req-eca06d13-7bc3-449a-929c-55cde050a10e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] No waiting events found dispatching network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:01:03 compute-0 nova_compute[186544]: 2025-11-22 08:01:03.107 186548 WARNING nova.compute.manager [req-551622d5-48be-4a45-87e8-2a17f9459c89 req-eca06d13-7bc3-449a-929c-55cde050a10e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received unexpected event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for instance with vm_state active and task_state None.
Nov 22 08:01:03 compute-0 podman[229336]: 2025-11-22 08:01:03.314523602 +0000 UTC m=+0.845274771 container create fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 08:01:03 compute-0 systemd[1]: Started libpod-conmon-fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83.scope.
Nov 22 08:01:03 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:01:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/219b37e51c3fa322741afa0f12af8a7c2b5354616aa8fbf391705bc2ee6a42f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:01:03 compute-0 podman[229336]: 2025-11-22 08:01:03.652665474 +0000 UTC m=+1.183416633 container init fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 08:01:03 compute-0 podman[229336]: 2025-11-22 08:01:03.65891837 +0000 UTC m=+1.189669509 container start fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 08:01:03 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[229352]: [NOTICE]   (229356) : New worker (229358) forked
Nov 22 08:01:03 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[229352]: [NOTICE]   (229356) : Loading success.
Nov 22 08:01:04 compute-0 nova_compute[186544]: 2025-11-22 08:01:04.118 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.181 186548 DEBUG nova.compute.manager [req-1a265680-1d57-4deb-aba3-0c1120d7edd7 req-f686a440-c2e2-4f13-9101-dc757f9655d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.182 186548 DEBUG oslo_concurrency.lockutils [req-1a265680-1d57-4deb-aba3-0c1120d7edd7 req-f686a440-c2e2-4f13-9101-dc757f9655d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.182 186548 DEBUG oslo_concurrency.lockutils [req-1a265680-1d57-4deb-aba3-0c1120d7edd7 req-f686a440-c2e2-4f13-9101-dc757f9655d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.182 186548 DEBUG oslo_concurrency.lockutils [req-1a265680-1d57-4deb-aba3-0c1120d7edd7 req-f686a440-c2e2-4f13-9101-dc757f9655d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.182 186548 DEBUG nova.compute.manager [req-1a265680-1d57-4deb-aba3-0c1120d7edd7 req-f686a440-c2e2-4f13-9101-dc757f9655d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] No waiting events found dispatching network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.182 186548 WARNING nova.compute.manager [req-1a265680-1d57-4deb-aba3-0c1120d7edd7 req-f686a440-c2e2-4f13-9101-dc757f9655d2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received unexpected event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for instance with vm_state active and task_state deleting.
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.207 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.208 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.208 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.208 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.208 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.215 186548 INFO nova.compute.manager [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Terminating instance
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.220 186548 DEBUG nova.compute.manager [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:01:05 compute-0 kernel: tape963f21d-d8 (unregistering): left promiscuous mode
Nov 22 08:01:05 compute-0 NetworkManager[55036]: <info>  [1763798465.2841] device (tape963f21d-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:01:05 compute-0 ovn_controller[94843]: 2025-11-22T08:01:05Z|00394|binding|INFO|Releasing lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 from this chassis (sb_readonly=0)
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.289 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:05 compute-0 ovn_controller[94843]: 2025-11-22T08:01:05Z|00395|binding|INFO|Setting lport e963f21d-d8c0-4f76-b5bc-4a3f577d4055 down in Southbound
Nov 22 08:01:05 compute-0 ovn_controller[94843]: 2025-11-22T08:01:05Z|00396|binding|INFO|Removing iface tape963f21d-d8 ovn-installed in OVS
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.292 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:05.299 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:5f:a6 10.100.0.4'], port_security=['fa:16:3e:b9:5f:a6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '92409a46-2dd7-4b20-ac9d-958bbb30993d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '320b38f4-6497-45cc-9e33-00f741d5a1b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.191', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e963f21d-d8c0-4f76-b5bc-4a3f577d4055) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:01:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:05.301 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e963f21d-d8c0-4f76-b5bc-4a3f577d4055 in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 unbound from our chassis
Nov 22 08:01:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:05.302 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7727db5-43a6-48f6-abbf-aa184d8ad087, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:01:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:05.303 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f54ec0af-8383-4f11-ac90-67957448cd1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:05.304 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 namespace which is not needed anymore
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.305 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:05 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000053.scope: Deactivated successfully.
Nov 22 08:01:05 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000053.scope: Consumed 3.425s CPU time.
Nov 22 08:01:05 compute-0 systemd-machined[152872]: Machine qemu-51-instance-00000053 terminated.
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.438 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.442 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.484 186548 INFO nova.virt.libvirt.driver [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Instance destroyed successfully.
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.485 186548 DEBUG nova.objects.instance [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'resources' on Instance uuid 92409a46-2dd7-4b20-ac9d-958bbb30993d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.631 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:05 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[229352]: [NOTICE]   (229356) : haproxy version is 2.8.14-c23fe91
Nov 22 08:01:05 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[229352]: [NOTICE]   (229356) : path to executable is /usr/sbin/haproxy
Nov 22 08:01:05 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[229352]: [WARNING]  (229356) : Exiting Master process...
Nov 22 08:01:05 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[229352]: [WARNING]  (229356) : Exiting Master process...
Nov 22 08:01:05 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[229352]: [ALERT]    (229356) : Current worker (229358) exited with code 143 (Terminated)
Nov 22 08:01:05 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[229352]: [WARNING]  (229356) : All workers exited. Exiting... (0)
Nov 22 08:01:05 compute-0 systemd[1]: libpod-fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83.scope: Deactivated successfully.
Nov 22 08:01:05 compute-0 podman[229392]: 2025-11-22 08:01:05.652062741 +0000 UTC m=+0.263297661 container died fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.669 186548 DEBUG nova.virt.libvirt.vif [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T07:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-342710330',display_name='tempest-ServerActionsTestOtherB-server-342710330',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-342710330',id=83,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYuGzf0LScMOVBFQXWfhNYOQ6jF90sG17sGlA5mAy2SAy9mKbq2fIQlO0z9fMDBdWr+bE7GGEcby2cnMIY+JJFsycIvPuyPkiwi4nyfq2TJfG30oGHbwkLB5ZFVSD3nfg==',key_name='tempest-keypair-729672741',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-ucsu4nl7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=92409a46-2dd7-4b20-ac9d-958bbb30993d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.670 186548 DEBUG nova.network.os_vif_util [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "address": "fa:16:3e:b9:5f:a6", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape963f21d-d8", "ovs_interfaceid": "e963f21d-d8c0-4f76-b5bc-4a3f577d4055", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.671 186548 DEBUG nova.network.os_vif_util [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.671 186548 DEBUG os_vif [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.673 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.674 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape963f21d-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.677 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.679 186548 INFO os_vif [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:5f:a6,bridge_name='br-int',has_traffic_filtering=True,id=e963f21d-d8c0-4f76-b5bc-4a3f577d4055,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape963f21d-d8')
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.680 186548 INFO nova.virt.libvirt.driver [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Deleting instance files /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_del
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.686 186548 INFO nova.virt.libvirt.driver [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Deletion of /var/lib/nova/instances/92409a46-2dd7-4b20-ac9d-958bbb30993d_del complete
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.829 186548 INFO nova.compute.manager [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Took 0.61 seconds to destroy the instance on the hypervisor.
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.829 186548 DEBUG oslo.service.loopingcall [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.829 186548 DEBUG nova.compute.manager [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:01:05 compute-0 nova_compute[186544]: 2025-11-22 08:01:05.830 186548 DEBUG nova.network.neutron [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:01:07 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Nov 22 08:01:07 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000005e.scope: Consumed 14.810s CPU time.
Nov 22 08:01:07 compute-0 systemd-machined[152872]: Machine qemu-50-instance-0000005e terminated.
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.290 186548 DEBUG nova.compute.manager [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-unplugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.291 186548 DEBUG oslo_concurrency.lockutils [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.291 186548 DEBUG oslo_concurrency.lockutils [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.292 186548 DEBUG oslo_concurrency.lockutils [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.292 186548 DEBUG nova.compute.manager [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] No waiting events found dispatching network-vif-unplugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.292 186548 DEBUG nova.compute.manager [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-unplugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.292 186548 DEBUG nova.compute.manager [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.292 186548 DEBUG oslo_concurrency.lockutils [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.293 186548 DEBUG oslo_concurrency.lockutils [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.293 186548 DEBUG oslo_concurrency.lockutils [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.293 186548 DEBUG nova.compute.manager [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] No waiting events found dispatching network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.293 186548 WARNING nova.compute.manager [req-7ce15890-d1f5-4d17-870e-1f7c63c0d77a req-cbfda7a7-7e7b-4160-908c-39d55d7ebef3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received unexpected event network-vif-plugged-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 for instance with vm_state active and task_state deleting.
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.311 186548 INFO nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance shutdown successfully after 24 seconds.
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.316 186548 INFO nova.virt.libvirt.driver [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance destroyed successfully.
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.320 186548 INFO nova.virt.libvirt.driver [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance destroyed successfully.
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.320 186548 INFO nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Deleting instance files /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339_del
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.321 186548 INFO nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Deletion of /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339_del complete
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.524 186548 DEBUG nova.network.neutron [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.560 186548 INFO nova.compute.manager [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Took 1.73 seconds to deallocate network for instance.
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.567 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.568 186548 INFO nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Creating image(s)
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.568 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.568 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.569 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.585 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.608 186548 DEBUG nova.compute.manager [req-0eecc07e-aec7-44c1-8ad1-225c0883401f req-63458699-25f0-4e1f-b376-a09f608dc2e8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Received event network-vif-deleted-e963f21d-d8c0-4f76-b5bc-4a3f577d4055 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.639 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.639 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.643 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.643 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.644 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.657 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.717 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.718 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.780 186548 DEBUG nova.compute.provider_tree [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:01:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83-userdata-shm.mount: Deactivated successfully.
Nov 22 08:01:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-219b37e51c3fa322741afa0f12af8a7c2b5354616aa8fbf391705bc2ee6a42f9-merged.mount: Deactivated successfully.
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.795 186548 DEBUG nova.scheduler.client.report [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.822 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.880 186548 INFO nova.scheduler.client.report [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Deleted allocations for instance 92409a46-2dd7-4b20-ac9d-958bbb30993d
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.992 186548 DEBUG oslo_concurrency.lockutils [None req-5498b422-c020-4133-9c91-bbb7ad641a94 d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "92409a46-2dd7-4b20-ac9d-958bbb30993d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.995 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk 1073741824" returned: 0 in 0.277s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.995 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:07 compute-0 nova_compute[186544]: 2025-11-22 08:01:07.996 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:08 compute-0 podman[229392]: 2025-11-22 08:01:08.033042246 +0000 UTC m=+2.644277166 container cleanup fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:01:08 compute-0 systemd[1]: libpod-conmon-fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83.scope: Deactivated successfully.
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.052 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.053 186548 DEBUG nova.virt.disk.api [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Checking if we can resize image /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.054 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:08 compute-0 podman[229464]: 2025-11-22 08:01:08.057863861 +0000 UTC m=+0.235673217 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.110 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.111 186548 DEBUG nova.virt.disk.api [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Cannot resize image /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.111 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.111 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Ensure instance console log exists: /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.112 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.112 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.112 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.113 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.116 186548 WARNING nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.121 186548 DEBUG nova.virt.libvirt.host [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.121 186548 DEBUG nova.virt.libvirt.host [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.125 186548 DEBUG nova.virt.libvirt.host [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.126 186548 DEBUG nova.virt.libvirt.host [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.127 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.127 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.127 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.128 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.128 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.128 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.128 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.128 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.128 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.129 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.129 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.129 186548 DEBUG nova.virt.hardware [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.129 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.152 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <uuid>4035e88f-0c79-4db5-a63d-7ae01c056339</uuid>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <name>instance-0000005e</name>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerShowV257Test-server-711605332</nova:name>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:01:08</nova:creationTime>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:01:08 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:01:08 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:01:08 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:01:08 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:01:08 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:01:08 compute-0 nova_compute[186544]:         <nova:user uuid="5cf0748f4d23460a90fe6f94e42ce0d3">tempest-ServerShowV257Test-549319000-project-member</nova:user>
Nov 22 08:01:08 compute-0 nova_compute[186544]:         <nova:project uuid="eca3d96a49be4f65a1d4fddc300c0346">tempest-ServerShowV257Test-549319000</nova:project>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <nova:ports/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <system>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <entry name="serial">4035e88f-0c79-4db5-a63d-7ae01c056339</entry>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <entry name="uuid">4035e88f-0c79-4db5-a63d-7ae01c056339</entry>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     </system>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <os>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   </os>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <features>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   </features>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.config"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/console.log" append="off"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <video>
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     </video>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:01:08 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:01:08 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:01:08 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:01:08 compute-0 nova_compute[186544]: </domain>
Nov 22 08:01:08 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.203 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.203 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.203 186548 INFO nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Using config drive
Nov 22 08:01:08 compute-0 podman[229434]: 2025-11-22 08:01:08.209914145 +0000 UTC m=+1.080858734 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.225 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.260 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'keypairs' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.461 186548 INFO nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Creating config drive at /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.config
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.468 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxhxuq11s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.594 186548 DEBUG oslo_concurrency.processutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxhxuq11s" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:08 compute-0 systemd-machined[152872]: New machine qemu-52-instance-0000005e.
Nov 22 08:01:08 compute-0 systemd[1]: Started Virtual Machine qemu-52-instance-0000005e.
Nov 22 08:01:08 compute-0 podman[229486]: 2025-11-22 08:01:08.893991334 +0000 UTC m=+0.840440031 container remove fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.901 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[19a0ab70-6d2b-4f91-9b33-61a559b0a6b3]: (4, ('Sat Nov 22 08:01:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 (fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83)\nfe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83\nSat Nov 22 08:01:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 (fe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83)\nfe17b513da29d941df73e3ca690122582892c4fbe312057c28391a3cc5232e83\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.903 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2052498e-cf2e-47a9-8878-784d74ba209a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.904 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.906 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:08 compute-0 kernel: tapf7727db5-40: left promiscuous mode
Nov 22 08:01:08 compute-0 nova_compute[186544]: 2025-11-22 08:01:08.921 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.924 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c229dc86-004b-4728-9579-7b51f2738e9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.936 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[122d3176-ca57-46c5-beef-4deda656d9da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.938 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[08a808ae-c7c9-4cc9-a253-0a6804473234]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.952 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[92a8b53f-bee9-4297-a2af-997690bc0770]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520253, 'reachable_time': 25646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229538, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.954 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:01:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:08.954 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[8c43450a-9702-486b-9c3a-42e1e287fbc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:08 compute-0 systemd[1]: run-netns-ovnmeta\x2df7727db5\x2d43a6\x2d48f6\x2dabbf\x2daa184d8ad087.mount: Deactivated successfully.
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.472 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 4035e88f-0c79-4db5-a63d-7ae01c056339 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.472 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798469.471554, 4035e88f-0c79-4db5-a63d-7ae01c056339 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.472 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] VM Resumed (Lifecycle Event)
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.476 186548 DEBUG nova.compute.manager [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.476 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.482 186548 INFO nova.virt.libvirt.driver [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance spawned successfully.
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.483 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.490 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.492 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.501 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.501 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.502 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.502 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.503 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.503 186548 DEBUG nova.virt.libvirt.driver [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.509 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.509 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798469.472301, 4035e88f-0c79-4db5-a63d-7ae01c056339 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.510 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] VM Started (Lifecycle Event)
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.530 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.533 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.561 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.592 186548 DEBUG nova.compute.manager [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.688 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.689 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.689 186548 DEBUG nova.objects.instance [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:01:09 compute-0 nova_compute[186544]: 2025-11-22 08:01:09.770 186548 DEBUG oslo_concurrency.lockutils [None req-8bbf4e63-e1d4-476a-88ad-7991be13543d 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:10 compute-0 nova_compute[186544]: 2025-11-22 08:01:10.632 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:10 compute-0 nova_compute[186544]: 2025-11-22 08:01:10.675 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.207 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "4035e88f-0c79-4db5-a63d-7ae01c056339" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.208 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "4035e88f-0c79-4db5-a63d-7ae01c056339" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.208 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "4035e88f-0c79-4db5-a63d-7ae01c056339-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.208 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "4035e88f-0c79-4db5-a63d-7ae01c056339-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.208 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "4035e88f-0c79-4db5-a63d-7ae01c056339-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.216 186548 INFO nova.compute.manager [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Terminating instance
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.221 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "refresh_cache-4035e88f-0c79-4db5-a63d-7ae01c056339" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.222 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquired lock "refresh_cache-4035e88f-0c79-4db5-a63d-7ae01c056339" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.222 186548 DEBUG nova.network.neutron [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.470 186548 DEBUG nova.network.neutron [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.802 186548 DEBUG nova.network.neutron [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.814 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Releasing lock "refresh_cache-4035e88f-0c79-4db5-a63d-7ae01c056339" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:01:11 compute-0 nova_compute[186544]: 2025-11-22 08:01:11.815 186548 DEBUG nova.compute.manager [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:01:11 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Nov 22 08:01:11 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000005e.scope: Consumed 3.176s CPU time.
Nov 22 08:01:11 compute-0 systemd-machined[152872]: Machine qemu-52-instance-0000005e terminated.
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.056 186548 INFO nova.virt.libvirt.driver [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance destroyed successfully.
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.057 186548 DEBUG nova.objects.instance [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lazy-loading 'resources' on Instance uuid 4035e88f-0c79-4db5-a63d-7ae01c056339 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.076 186548 INFO nova.virt.libvirt.driver [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Deleting instance files /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339_del
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.076 186548 INFO nova.virt.libvirt.driver [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Deletion of /var/lib/nova/instances/4035e88f-0c79-4db5-a63d-7ae01c056339_del complete
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.182 186548 INFO nova.compute.manager [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.182 186548 DEBUG oslo.service.loopingcall [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.183 186548 DEBUG nova.compute.manager [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.183 186548 DEBUG nova.network.neutron [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.572 186548 DEBUG nova.network.neutron [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.585 186548 DEBUG nova.network.neutron [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.596 186548 INFO nova.compute.manager [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Took 0.41 seconds to deallocate network for instance.
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.672 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.672 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.725 186548 DEBUG nova.compute.provider_tree [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.756 186548 DEBUG nova.scheduler.client.report [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.785 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.853 186548 INFO nova.scheduler.client.report [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Deleted allocations for instance 4035e88f-0c79-4db5-a63d-7ae01c056339
Nov 22 08:01:12 compute-0 nova_compute[186544]: 2025-11-22 08:01:12.944 186548 DEBUG oslo_concurrency.lockutils [None req-c34edcd2-109e-4666-a574-aed0f6214d34 5cf0748f4d23460a90fe6f94e42ce0d3 eca3d96a49be4f65a1d4fddc300c0346 - - default default] Lock "4035e88f-0c79-4db5-a63d-7ae01c056339" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:15.388 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:01:15 compute-0 nova_compute[186544]: 2025-11-22 08:01:15.388 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:15.390 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:01:15 compute-0 nova_compute[186544]: 2025-11-22 08:01:15.634 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:15 compute-0 nova_compute[186544]: 2025-11-22 08:01:15.677 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:20 compute-0 nova_compute[186544]: 2025-11-22 08:01:20.483 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798465.4821973, 92409a46-2dd7-4b20-ac9d-958bbb30993d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:20 compute-0 nova_compute[186544]: 2025-11-22 08:01:20.484 186548 INFO nova.compute.manager [-] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] VM Stopped (Lifecycle Event)
Nov 22 08:01:20 compute-0 nova_compute[186544]: 2025-11-22 08:01:20.515 186548 DEBUG nova.compute.manager [None req-f6506062-da0f-4074-955b-0b47962f1a4b - - - - - -] [instance: 92409a46-2dd7-4b20-ac9d-958bbb30993d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:20 compute-0 nova_compute[186544]: 2025-11-22 08:01:20.637 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:20 compute-0 nova_compute[186544]: 2025-11-22 08:01:20.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:21 compute-0 podman[229558]: 2025-11-22 08:01:21.432100338 +0000 UTC m=+0.068853066 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 08:01:21 compute-0 podman[229559]: 2025-11-22 08:01:21.479118062 +0000 UTC m=+0.116724011 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:01:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:23.392 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:25 compute-0 nova_compute[186544]: 2025-11-22 08:01:25.219 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:25 compute-0 podman[229605]: 2025-11-22 08:01:25.407098041 +0000 UTC m=+0.054150362 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:01:25 compute-0 nova_compute[186544]: 2025-11-22 08:01:25.638 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:25 compute-0 nova_compute[186544]: 2025-11-22 08:01:25.681 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:27 compute-0 nova_compute[186544]: 2025-11-22 08:01:27.056 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798472.0531328, 4035e88f-0c79-4db5-a63d-7ae01c056339 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:27 compute-0 nova_compute[186544]: 2025-11-22 08:01:27.056 186548 INFO nova.compute.manager [-] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] VM Stopped (Lifecycle Event)
Nov 22 08:01:27 compute-0 nova_compute[186544]: 2025-11-22 08:01:27.155 186548 DEBUG nova.compute.manager [None req-26158fb1-1a85-4b24-9be3-8e4c1a893e2a - - - - - -] [instance: 4035e88f-0c79-4db5-a63d-7ae01c056339] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:27 compute-0 podman[229629]: 2025-11-22 08:01:27.159091922 +0000 UTC m=+0.075959302 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:01:30 compute-0 nova_compute[186544]: 2025-11-22 08:01:30.641 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:30 compute-0 nova_compute[186544]: 2025-11-22 08:01:30.682 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:32 compute-0 nova_compute[186544]: 2025-11-22 08:01:32.072 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:32 compute-0 podman[229648]: 2025-11-22 08:01:32.435179212 +0000 UTC m=+0.084515794 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:01:35 compute-0 nova_compute[186544]: 2025-11-22 08:01:35.642 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:35 compute-0 nova_compute[186544]: 2025-11-22 08:01:35.683 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:36 compute-0 nova_compute[186544]: 2025-11-22 08:01:36.869 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:36 compute-0 nova_compute[186544]: 2025-11-22 08:01:36.870 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:36 compute-0 nova_compute[186544]: 2025-11-22 08:01:36.885 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:01:36 compute-0 nova_compute[186544]: 2025-11-22 08:01:36.984 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:36 compute-0 nova_compute[186544]: 2025-11-22 08:01:36.984 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:36 compute-0 nova_compute[186544]: 2025-11-22 08:01:36.992 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:01:36 compute-0 nova_compute[186544]: 2025-11-22 08:01:36.992 186548 INFO nova.compute.claims [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.099 186548 DEBUG nova.compute.provider_tree [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.110 186548 DEBUG nova.scheduler.client.report [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.134 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.135 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.207 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.207 186548 DEBUG nova.network.neutron [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.222 186548 INFO nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.242 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:01:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:37.330 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:37.331 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:37.331 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.358 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.359 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.360 186548 INFO nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Creating image(s)
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.361 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.361 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.362 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.375 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.436 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.438 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.438 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.451 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.508 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.509 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.654 186548 DEBUG nova.policy [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.859 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk 1073741824" returned: 0 in 0.350s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.860 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.861 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.926 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.927 186548 DEBUG nova.virt.disk.api [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Checking if we can resize image /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.927 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.996 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.997 186548 DEBUG nova.virt.disk.api [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Cannot resize image /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:01:37 compute-0 nova_compute[186544]: 2025-11-22 08:01:37.998 186548 DEBUG nova.objects.instance [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:38 compute-0 nova_compute[186544]: 2025-11-22 08:01:38.144 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:01:38 compute-0 nova_compute[186544]: 2025-11-22 08:01:38.145 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Ensure instance console log exists: /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:01:38 compute-0 nova_compute[186544]: 2025-11-22 08:01:38.146 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:38 compute-0 nova_compute[186544]: 2025-11-22 08:01:38.146 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:38 compute-0 nova_compute[186544]: 2025-11-22 08:01:38.146 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:38 compute-0 podman[229684]: 2025-11-22 08:01:38.439179675 +0000 UTC m=+0.085843986 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 08:01:38 compute-0 nova_compute[186544]: 2025-11-22 08:01:38.619 186548 DEBUG nova.network.neutron [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Successfully created port: fe9c07c3-e08f-4810-b699-6d6aa3f50cda _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:01:39 compute-0 podman[229707]: 2025-11-22 08:01:39.397232718 +0000 UTC m=+0.045232801 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:01:39 compute-0 nova_compute[186544]: 2025-11-22 08:01:39.704 186548 DEBUG nova.network.neutron [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Successfully updated port: fe9c07c3-e08f-4810-b699-6d6aa3f50cda _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:01:39 compute-0 nova_compute[186544]: 2025-11-22 08:01:39.715 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:01:39 compute-0 nova_compute[186544]: 2025-11-22 08:01:39.715 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquired lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:01:39 compute-0 nova_compute[186544]: 2025-11-22 08:01:39.715 186548 DEBUG nova.network.neutron [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:01:39 compute-0 nova_compute[186544]: 2025-11-22 08:01:39.878 186548 DEBUG nova.compute.manager [req-75c76ac1-049d-44aa-b49f-295ae464412d req-a6950817-859f-4e03-9220-6239b8641835 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-changed-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:39 compute-0 nova_compute[186544]: 2025-11-22 08:01:39.878 186548 DEBUG nova.compute.manager [req-75c76ac1-049d-44aa-b49f-295ae464412d req-a6950817-859f-4e03-9220-6239b8641835 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Refreshing instance network info cache due to event network-changed-fe9c07c3-e08f-4810-b699-6d6aa3f50cda. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:01:39 compute-0 nova_compute[186544]: 2025-11-22 08:01:39.878 186548 DEBUG oslo_concurrency.lockutils [req-75c76ac1-049d-44aa-b49f-295ae464412d req-a6950817-859f-4e03-9220-6239b8641835 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.029 186548 DEBUG nova.network.neutron [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.652 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.684 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.902 186548 DEBUG nova.network.neutron [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.938 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Releasing lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.938 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance network_info: |[{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.938 186548 DEBUG oslo_concurrency.lockutils [req-75c76ac1-049d-44aa-b49f-295ae464412d req-a6950817-859f-4e03-9220-6239b8641835 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.939 186548 DEBUG nova.network.neutron [req-75c76ac1-049d-44aa-b49f-295ae464412d req-a6950817-859f-4e03-9220-6239b8641835 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Refreshing network info cache for port fe9c07c3-e08f-4810-b699-6d6aa3f50cda _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.941 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Start _get_guest_xml network_info=[{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.946 186548 WARNING nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.951 186548 DEBUG nova.virt.libvirt.host [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.952 186548 DEBUG nova.virt.libvirt.host [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.968 186548 DEBUG nova.virt.libvirt.host [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.968 186548 DEBUG nova.virt.libvirt.host [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.970 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.970 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.970 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.971 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.971 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.971 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.971 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.971 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.971 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.972 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.972 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.972 186548 DEBUG nova.virt.hardware [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.975 186548 DEBUG nova.virt.libvirt.vif [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:01:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-899820238',display_name='tempest-ServersNegativeTestJSON-server-899820238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-899820238',id=97,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-he9blv4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTe
stJSON-1872924472-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:37Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=235ecf63-07fa-4f60-97e9-466450a50add,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.976 186548 DEBUG nova.network.os_vif_util [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.976 186548 DEBUG nova.network.os_vif_util [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.977 186548 DEBUG nova.objects.instance [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.988 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <uuid>235ecf63-07fa-4f60-97e9-466450a50add</uuid>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <name>instance-00000061</name>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersNegativeTestJSON-server-899820238</nova:name>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:01:40</nova:creationTime>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:01:40 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:01:40 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:01:40 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:01:40 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:01:40 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:01:40 compute-0 nova_compute[186544]:         <nova:user uuid="989540cd5ede4a5184a08b8eb3de013d">tempest-ServersNegativeTestJSON-1872924472-project-member</nova:user>
Nov 22 08:01:40 compute-0 nova_compute[186544]:         <nova:project uuid="d967f0cef958482c9711764882a146f3">tempest-ServersNegativeTestJSON-1872924472</nova:project>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:01:40 compute-0 nova_compute[186544]:         <nova:port uuid="fe9c07c3-e08f-4810-b699-6d6aa3f50cda">
Nov 22 08:01:40 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <system>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <entry name="serial">235ecf63-07fa-4f60-97e9-466450a50add</entry>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <entry name="uuid">235ecf63-07fa-4f60-97e9-466450a50add</entry>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </system>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <os>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   </os>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <features>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   </features>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.config"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:1c:d6:3e"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <target dev="tapfe9c07c3-e0"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/console.log" append="off"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <video>
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </video>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:01:40 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:01:40 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:01:40 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:01:40 compute-0 nova_compute[186544]: </domain>
Nov 22 08:01:40 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.989 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Preparing to wait for external event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.989 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.989 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.989 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.990 186548 DEBUG nova.virt.libvirt.vif [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:01:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-899820238',display_name='tempest-ServersNegativeTestJSON-server-899820238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-899820238',id=97,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-he9blv4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-Servers
NegativeTestJSON-1872924472-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:37Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=235ecf63-07fa-4f60-97e9-466450a50add,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.990 186548 DEBUG nova.network.os_vif_util [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.990 186548 DEBUG nova.network.os_vif_util [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.991 186548 DEBUG os_vif [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.991 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.991 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.992 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.994 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe9c07c3-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.995 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe9c07c3-e0, col_values=(('external_ids', {'iface-id': 'fe9c07c3-e08f-4810-b699-6d6aa3f50cda', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:d6:3e', 'vm-uuid': '235ecf63-07fa-4f60-97e9-466450a50add'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.996 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:40 compute-0 NetworkManager[55036]: <info>  [1763798500.9978] manager: (tapfe9c07c3-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Nov 22 08:01:40 compute-0 nova_compute[186544]: 2025-11-22 08:01:40.999 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:01:41 compute-0 nova_compute[186544]: 2025-11-22 08:01:41.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:41 compute-0 nova_compute[186544]: 2025-11-22 08:01:41.006 186548 INFO os_vif [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0')
Nov 22 08:01:41 compute-0 nova_compute[186544]: 2025-11-22 08:01:41.090 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:01:41 compute-0 nova_compute[186544]: 2025-11-22 08:01:41.090 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:01:41 compute-0 nova_compute[186544]: 2025-11-22 08:01:41.090 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No VIF found with MAC fa:16:3e:1c:d6:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:01:41 compute-0 nova_compute[186544]: 2025-11-22 08:01:41.091 186548 INFO nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Using config drive
Nov 22 08:01:43 compute-0 nova_compute[186544]: 2025-11-22 08:01:43.605 186548 INFO nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Creating config drive at /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.config
Nov 22 08:01:43 compute-0 nova_compute[186544]: 2025-11-22 08:01:43.609 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt9sb2y3n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:43 compute-0 nova_compute[186544]: 2025-11-22 08:01:43.736 186548 DEBUG oslo_concurrency.processutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt9sb2y3n" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:43 compute-0 kernel: tapfe9c07c3-e0: entered promiscuous mode
Nov 22 08:01:43 compute-0 NetworkManager[55036]: <info>  [1763798503.8032] manager: (tapfe9c07c3-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Nov 22 08:01:43 compute-0 nova_compute[186544]: 2025-11-22 08:01:43.803 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:43 compute-0 ovn_controller[94843]: 2025-11-22T08:01:43Z|00397|binding|INFO|Claiming lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda for this chassis.
Nov 22 08:01:43 compute-0 ovn_controller[94843]: 2025-11-22T08:01:43Z|00398|binding|INFO|fe9c07c3-e08f-4810-b699-6d6aa3f50cda: Claiming fa:16:3e:1c:d6:3e 10.100.0.8
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.813 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:d6:3e 10.100.0.8'], port_security=['fa:16:3e:1c:d6:3e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157128f-75e8-4afb-ab55-34580af9585f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd967f0cef958482c9711764882a146f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'db55d655-ec9a-40ef-9e54-3247c3ea4f75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c031f4-6b41-4ee7-af4f-a9218d9b390c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=fe9c07c3-e08f-4810-b699-6d6aa3f50cda) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.814 103805 INFO neutron.agent.ovn.metadata.agent [-] Port fe9c07c3-e08f-4810-b699-6d6aa3f50cda in datapath c157128f-75e8-4afb-ab55-34580af9585f bound to our chassis
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.816 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:01:43 compute-0 nova_compute[186544]: 2025-11-22 08:01:43.819 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:43 compute-0 ovn_controller[94843]: 2025-11-22T08:01:43Z|00399|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda ovn-installed in OVS
Nov 22 08:01:43 compute-0 ovn_controller[94843]: 2025-11-22T08:01:43Z|00400|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda up in Southbound
Nov 22 08:01:43 compute-0 nova_compute[186544]: 2025-11-22 08:01:43.822 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.827 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d85396ba-9bde-445b-a884-2037992197d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 nova_compute[186544]: 2025-11-22 08:01:43.829 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.828 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc157128f-71 in ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.830 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc157128f-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.830 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[741c0d93-d1db-418b-9b25-1f2e650f9efc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.832 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[54abe5c2-5de9-4753-9643-fbae5e8e12a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 systemd-udevd[229754]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.844 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc9d059-9418-4a72-8362-4dd1a6a493fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 systemd-machined[152872]: New machine qemu-53-instance-00000061.
Nov 22 08:01:43 compute-0 systemd[1]: Started Virtual Machine qemu-53-instance-00000061.
Nov 22 08:01:43 compute-0 NetworkManager[55036]: <info>  [1763798503.8622] device (tapfe9c07c3-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:01:43 compute-0 NetworkManager[55036]: <info>  [1763798503.8634] device (tapfe9c07c3-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.875 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4ebba828-1b6e-4a87-a3f0-41ace3d740d5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.906 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c2dee2-7bfe-44d6-bdc0-0740dfbfbbef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.912 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[420a14d4-bce7-4ded-81a4-dcc5c3b24780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 NetworkManager[55036]: <info>  [1763798503.9137] manager: (tapc157128f-70): new Veth device (/org/freedesktop/NetworkManager/Devices/190)
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.941 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[091443d0-b9b7-4c01-82a4-360090be4e98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.945 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[786eac89-3d4b-4218-aacb-f1f7c965c467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 NetworkManager[55036]: <info>  [1763798503.9671] device (tapc157128f-70): carrier: link connected
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.971 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5c38e585-daa5-4e9b-b9f3-e7e323d18f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:43.989 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[38343d12-9126-455a-ad04-0ef894d64f67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157128f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:41:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524460, 'reachable_time': 44678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229785, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.003 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4bcfe8f6-8775-4cac-a7a0-6627c98d811f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:410b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524460, 'tstamp': 524460}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229786, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.021 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2df2f4d1-87e4-4541-b5a2-91ccc1a9e581]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157128f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:41:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524460, 'reachable_time': 44678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229787, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.060 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2c777a8e-de48-4a7b-aae8-5e2a33cf84b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.130 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bffb8fa6-811e-4bc4-9810-057bd9c78bf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.131 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157128f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.132 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.132 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc157128f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.134 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:44 compute-0 NetworkManager[55036]: <info>  [1763798504.1350] manager: (tapc157128f-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Nov 22 08:01:44 compute-0 kernel: tapc157128f-70: entered promiscuous mode
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.138 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc157128f-70, col_values=(('external_ids', {'iface-id': 'a61c8ae7-262d-45c7-859e-6a4502225b00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:44 compute-0 ovn_controller[94843]: 2025-11-22T08:01:44Z|00401|binding|INFO|Releasing lport a61c8ae7-262d-45c7-859e-6a4502225b00 from this chassis (sb_readonly=0)
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.154 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.155 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c274ef-7921-48e6-94e3-91d003dbdd84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.156 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:01:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:44.157 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'env', 'PROCESS_TAG=haproxy-c157128f-75e8-4afb-ab55-34580af9585f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c157128f-75e8-4afb-ab55-34580af9585f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.495 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798504.4945843, 235ecf63-07fa-4f60-97e9-466450a50add => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.495 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Started (Lifecycle Event)
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.514 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.516 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798504.4947312, 235ecf63-07fa-4f60-97e9-466450a50add => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.517 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Paused (Lifecycle Event)
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.535 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.537 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:01:44 compute-0 nova_compute[186544]: 2025-11-22 08:01:44.559 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:01:44 compute-0 podman[229825]: 2025-11-22 08:01:44.484652646 +0000 UTC m=+0.018997771 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:01:44 compute-0 podman[229825]: 2025-11-22 08:01:44.898887673 +0000 UTC m=+0.433232798 container create d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 08:01:44 compute-0 systemd[1]: Started libpod-conmon-d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f.scope.
Nov 22 08:01:44 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:01:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e83fe16df0084714b4a4f2e3c0097ef0c45e3cf2db1eb0c45443f19f6f0a8e22/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:01:45 compute-0 podman[229825]: 2025-11-22 08:01:45.046682333 +0000 UTC m=+0.581027478 container init d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 08:01:45 compute-0 podman[229825]: 2025-11-22 08:01:45.052940677 +0000 UTC m=+0.587285812 container start d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:01:45 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[229841]: [NOTICE]   (229845) : New worker (229847) forked
Nov 22 08:01:45 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[229841]: [NOTICE]   (229845) : Loading success.
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.347 186548 DEBUG nova.compute.manager [req-b643b5a6-cd26-448e-977f-939b2ffaa271 req-a5d37599-a454-4e75-85c0-561c0cc61347 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.348 186548 DEBUG oslo_concurrency.lockutils [req-b643b5a6-cd26-448e-977f-939b2ffaa271 req-a5d37599-a454-4e75-85c0-561c0cc61347 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.348 186548 DEBUG oslo_concurrency.lockutils [req-b643b5a6-cd26-448e-977f-939b2ffaa271 req-a5d37599-a454-4e75-85c0-561c0cc61347 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.349 186548 DEBUG oslo_concurrency.lockutils [req-b643b5a6-cd26-448e-977f-939b2ffaa271 req-a5d37599-a454-4e75-85c0-561c0cc61347 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.349 186548 DEBUG nova.compute.manager [req-b643b5a6-cd26-448e-977f-939b2ffaa271 req-a5d37599-a454-4e75-85c0-561c0cc61347 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Processing event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.349 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.354 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798505.3541627, 235ecf63-07fa-4f60-97e9-466450a50add => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.354 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Resumed (Lifecycle Event)
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.359 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.365 186548 INFO nova.virt.libvirt.driver [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance spawned successfully.
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.365 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.380 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.383 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.394 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.395 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.395 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.396 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.396 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.397 186548 DEBUG nova.virt.libvirt.driver [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.436 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.454 186548 DEBUG nova.network.neutron [req-75c76ac1-049d-44aa-b49f-295ae464412d req-a6950817-859f-4e03-9220-6239b8641835 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updated VIF entry in instance network info cache for port fe9c07c3-e08f-4810-b699-6d6aa3f50cda. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.455 186548 DEBUG nova.network.neutron [req-75c76ac1-049d-44aa-b49f-295ae464412d req-a6950817-859f-4e03-9220-6239b8641835 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.474 186548 DEBUG oslo_concurrency.lockutils [req-75c76ac1-049d-44aa-b49f-295ae464412d req-a6950817-859f-4e03-9220-6239b8641835 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.488 186548 INFO nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Took 8.13 seconds to spawn the instance on the hypervisor.
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.489 186548 DEBUG nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.578 186548 INFO nova.compute.manager [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Took 8.64 seconds to build instance.
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.623 186548 DEBUG oslo_concurrency.lockutils [None req-cf6ff423-3da6-4db1-83ce-f3b1d6de3491 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.655 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:45 compute-0 nova_compute[186544]: 2025-11-22 08:01:45.999 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.216 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.217 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.218 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.218 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.282 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.360 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.361 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.415 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.592 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.593 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5647MB free_disk=73.27936553955078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.594 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.594 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.657 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 235ecf63-07fa-4f60-97e9-466450a50add actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.658 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.658 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.705 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.720 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.748 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:01:46 compute-0 nova_compute[186544]: 2025-11-22 08:01:46.749 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:47 compute-0 nova_compute[186544]: 2025-11-22 08:01:47.429 186548 DEBUG nova.compute.manager [req-7101a059-a617-49d1-bf40-6b61e3ead365 req-906e9bcc-a81e-4b3a-941e-814a90e2ebbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:47 compute-0 nova_compute[186544]: 2025-11-22 08:01:47.430 186548 DEBUG oslo_concurrency.lockutils [req-7101a059-a617-49d1-bf40-6b61e3ead365 req-906e9bcc-a81e-4b3a-941e-814a90e2ebbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:47 compute-0 nova_compute[186544]: 2025-11-22 08:01:47.430 186548 DEBUG oslo_concurrency.lockutils [req-7101a059-a617-49d1-bf40-6b61e3ead365 req-906e9bcc-a81e-4b3a-941e-814a90e2ebbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:47 compute-0 nova_compute[186544]: 2025-11-22 08:01:47.430 186548 DEBUG oslo_concurrency.lockutils [req-7101a059-a617-49d1-bf40-6b61e3ead365 req-906e9bcc-a81e-4b3a-941e-814a90e2ebbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:47 compute-0 nova_compute[186544]: 2025-11-22 08:01:47.431 186548 DEBUG nova.compute.manager [req-7101a059-a617-49d1-bf40-6b61e3ead365 req-906e9bcc-a81e-4b3a-941e-814a90e2ebbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:01:47 compute-0 nova_compute[186544]: 2025-11-22 08:01:47.431 186548 WARNING nova.compute.manager [req-7101a059-a617-49d1-bf40-6b61e3ead365 req-906e9bcc-a81e-4b3a-941e-814a90e2ebbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state active and task_state None.
Nov 22 08:01:49 compute-0 nova_compute[186544]: 2025-11-22 08:01:49.906 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "6efb9379-6030-46a2-bd5f-60441b08d2ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:49 compute-0 nova_compute[186544]: 2025-11-22 08:01:49.907 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:49 compute-0 nova_compute[186544]: 2025-11-22 08:01:49.928 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.004 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.004 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.011 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.011 186548 INFO nova.compute.claims [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.150 186548 DEBUG nova.compute.provider_tree [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.161 186548 DEBUG nova.scheduler.client.report [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.184 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.184 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.233 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.233 186548 DEBUG nova.network.neutron [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.252 186548 INFO nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.281 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.403 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.404 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.405 186548 INFO nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Creating image(s)
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.405 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "/var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.406 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "/var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.406 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "/var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.418 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.477 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.478 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.478 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.492 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.557 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.558 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.658 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.750 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.750 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.780 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:01:50 compute-0 nova_compute[186544]: 2025-11-22 08:01:50.818 186548 DEBUG nova.policy [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.473 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk 1073741824" returned: 0 in 0.915s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.474 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.474 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.530 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.531 186548 DEBUG nova.virt.disk.api [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Checking if we can resize image /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.531 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.587 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.588 186548 DEBUG nova.virt.disk.api [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Cannot resize image /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.588 186548 DEBUG nova.objects.instance [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 6efb9379-6030-46a2-bd5f-60441b08d2ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.603 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.604 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Ensure instance console log exists: /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.604 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.605 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:51 compute-0 nova_compute[186544]: 2025-11-22 08:01:51.605 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:52 compute-0 podman[229878]: 2025-11-22 08:01:52.408095346 +0000 UTC m=+0.060638842 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 08:01:52 compute-0 podman[229879]: 2025-11-22 08:01:52.448158168 +0000 UTC m=+0.097509255 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:01:52 compute-0 nova_compute[186544]: 2025-11-22 08:01:52.739 186548 DEBUG nova.network.neutron [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Successfully created port: 71b69b21-29dd-40cb-a36d-d06613ade5cb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.789 186548 DEBUG nova.network.neutron [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Successfully updated port: 71b69b21-29dd-40cb-a36d-d06613ade5cb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.804 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "refresh_cache-6efb9379-6030-46a2-bd5f-60441b08d2ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.804 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquired lock "refresh_cache-6efb9379-6030-46a2-bd5f-60441b08d2ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.804 186548 DEBUG nova.network.neutron [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.896 186548 DEBUG nova.compute.manager [req-22af3276-895e-4a47-a292-878ad1e1582f req-2cda7838-0b4f-449c-97f5-0afa0c059829 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received event network-changed-71b69b21-29dd-40cb-a36d-d06613ade5cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.896 186548 DEBUG nova.compute.manager [req-22af3276-895e-4a47-a292-878ad1e1582f req-2cda7838-0b4f-449c-97f5-0afa0c059829 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Refreshing instance network info cache due to event network-changed-71b69b21-29dd-40cb-a36d-d06613ade5cb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.897 186548 DEBUG oslo_concurrency.lockutils [req-22af3276-895e-4a47-a292-878ad1e1582f req-2cda7838-0b4f-449c-97f5-0afa0c059829 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-6efb9379-6030-46a2-bd5f-60441b08d2ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:01:53 compute-0 nova_compute[186544]: 2025-11-22 08:01:53.988 186548 DEBUG nova.network.neutron [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:01:54 compute-0 nova_compute[186544]: 2025-11-22 08:01:54.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.280 186548 DEBUG nova.network.neutron [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Updating instance_info_cache with network_info: [{"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.299 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Releasing lock "refresh_cache-6efb9379-6030-46a2-bd5f-60441b08d2ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.300 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Instance network_info: |[{"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.300 186548 DEBUG oslo_concurrency.lockutils [req-22af3276-895e-4a47-a292-878ad1e1582f req-2cda7838-0b4f-449c-97f5-0afa0c059829 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-6efb9379-6030-46a2-bd5f-60441b08d2ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.300 186548 DEBUG nova.network.neutron [req-22af3276-895e-4a47-a292-878ad1e1582f req-2cda7838-0b4f-449c-97f5-0afa0c059829 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Refreshing network info cache for port 71b69b21-29dd-40cb-a36d-d06613ade5cb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.303 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Start _get_guest_xml network_info=[{"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.307 186548 WARNING nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.313 186548 DEBUG nova.virt.libvirt.host [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.314 186548 DEBUG nova.virt.libvirt.host [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.318 186548 DEBUG nova.virt.libvirt.host [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.319 186548 DEBUG nova.virt.libvirt.host [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.320 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.320 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.321 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.321 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.321 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.321 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.322 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.322 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.322 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.323 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.323 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.323 186548 DEBUG nova.virt.hardware [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.327 186548 DEBUG nova.virt.libvirt.vif [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:01:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-2049160290',display_name='tempest-ServersNegativeTestJSON-server-2049160290',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-2049160290',id=98,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-tj74lunn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTestJSON-1872924472-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:50Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=6efb9379-6030-46a2-bd5f-60441b08d2ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.328 186548 DEBUG nova.network.os_vif_util [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.329 186548 DEBUG nova.network.os_vif_util [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:2c:86,bridge_name='br-int',has_traffic_filtering=True,id=71b69b21-29dd-40cb-a36d-d06613ade5cb,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b69b21-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.329 186548 DEBUG nova.objects.instance [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6efb9379-6030-46a2-bd5f-60441b08d2ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.352 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <uuid>6efb9379-6030-46a2-bd5f-60441b08d2ff</uuid>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <name>instance-00000062</name>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersNegativeTestJSON-server-2049160290</nova:name>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:01:55</nova:creationTime>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:01:55 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:01:55 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:01:55 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:01:55 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:01:55 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:01:55 compute-0 nova_compute[186544]:         <nova:user uuid="989540cd5ede4a5184a08b8eb3de013d">tempest-ServersNegativeTestJSON-1872924472-project-member</nova:user>
Nov 22 08:01:55 compute-0 nova_compute[186544]:         <nova:project uuid="d967f0cef958482c9711764882a146f3">tempest-ServersNegativeTestJSON-1872924472</nova:project>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:01:55 compute-0 nova_compute[186544]:         <nova:port uuid="71b69b21-29dd-40cb-a36d-d06613ade5cb">
Nov 22 08:01:55 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <system>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <entry name="serial">6efb9379-6030-46a2-bd5f-60441b08d2ff</entry>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <entry name="uuid">6efb9379-6030-46a2-bd5f-60441b08d2ff</entry>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </system>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <os>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   </os>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <features>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   </features>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk.config"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:62:2c:86"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <target dev="tap71b69b21-29"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/console.log" append="off"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <video>
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </video>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:01:55 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:01:55 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:01:55 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:01:55 compute-0 nova_compute[186544]: </domain>
Nov 22 08:01:55 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.353 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Preparing to wait for external event network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.353 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.353 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.354 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.354 186548 DEBUG nova.virt.libvirt.vif [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:01:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-2049160290',display_name='tempest-ServersNegativeTestJSON-server-2049160290',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-2049160290',id=98,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-tj74lunn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTestJSON-1872924472-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:01:50Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=6efb9379-6030-46a2-bd5f-60441b08d2ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.355 186548 DEBUG nova.network.os_vif_util [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.355 186548 DEBUG nova.network.os_vif_util [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:2c:86,bridge_name='br-int',has_traffic_filtering=True,id=71b69b21-29dd-40cb-a36d-d06613ade5cb,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b69b21-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.356 186548 DEBUG os_vif [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:2c:86,bridge_name='br-int',has_traffic_filtering=True,id=71b69b21-29dd-40cb-a36d-d06613ade5cb,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b69b21-29') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.356 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.357 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.357 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.361 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.361 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71b69b21-29, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.361 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71b69b21-29, col_values=(('external_ids', {'iface-id': '71b69b21-29dd-40cb-a36d-d06613ade5cb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:2c:86', 'vm-uuid': '6efb9379-6030-46a2-bd5f-60441b08d2ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.363 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:55 compute-0 NetworkManager[55036]: <info>  [1763798515.3644] manager: (tap71b69b21-29): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.366 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.371 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.372 186548 INFO os_vif [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:2c:86,bridge_name='br-int',has_traffic_filtering=True,id=71b69b21-29dd-40cb-a36d-d06613ade5cb,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b69b21-29')
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.661 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.854 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.855 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.855 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No VIF found with MAC fa:16:3e:62:2c:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:01:55 compute-0 nova_compute[186544]: 2025-11-22 08:01:55.855 186548 INFO nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Using config drive
Nov 22 08:01:56 compute-0 nova_compute[186544]: 2025-11-22 08:01:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:01:56 compute-0 podman[229926]: 2025-11-22 08:01:56.408004307 +0000 UTC m=+0.055401214 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:01:56 compute-0 nova_compute[186544]: 2025-11-22 08:01:56.667 186548 INFO nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Creating config drive at /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk.config
Nov 22 08:01:56 compute-0 nova_compute[186544]: 2025-11-22 08:01:56.672 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp83cwk2q4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:01:56 compute-0 nova_compute[186544]: 2025-11-22 08:01:56.798 186548 DEBUG oslo_concurrency.processutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp83cwk2q4" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:01:56 compute-0 kernel: tap71b69b21-29: entered promiscuous mode
Nov 22 08:01:56 compute-0 NetworkManager[55036]: <info>  [1763798516.8862] manager: (tap71b69b21-29): new Tun device (/org/freedesktop/NetworkManager/Devices/193)
Nov 22 08:01:56 compute-0 ovn_controller[94843]: 2025-11-22T08:01:56Z|00402|binding|INFO|Claiming lport 71b69b21-29dd-40cb-a36d-d06613ade5cb for this chassis.
Nov 22 08:01:56 compute-0 ovn_controller[94843]: 2025-11-22T08:01:56Z|00403|binding|INFO|71b69b21-29dd-40cb-a36d-d06613ade5cb: Claiming fa:16:3e:62:2c:86 10.100.0.12
Nov 22 08:01:56 compute-0 nova_compute[186544]: 2025-11-22 08:01:56.887 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:56 compute-0 ovn_controller[94843]: 2025-11-22T08:01:56Z|00404|binding|INFO|Setting lport 71b69b21-29dd-40cb-a36d-d06613ade5cb ovn-installed in OVS
Nov 22 08:01:56 compute-0 nova_compute[186544]: 2025-11-22 08:01:56.903 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:56 compute-0 nova_compute[186544]: 2025-11-22 08:01:56.905 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:56 compute-0 systemd-udevd[229966]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:01:56 compute-0 systemd-machined[152872]: New machine qemu-54-instance-00000062.
Nov 22 08:01:56 compute-0 NetworkManager[55036]: <info>  [1763798516.9349] device (tap71b69b21-29): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:01:56 compute-0 NetworkManager[55036]: <info>  [1763798516.9359] device (tap71b69b21-29): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:01:56 compute-0 systemd[1]: Started Virtual Machine qemu-54-instance-00000062.
Nov 22 08:01:56 compute-0 ovn_controller[94843]: 2025-11-22T08:01:56Z|00405|binding|INFO|Setting lport 71b69b21-29dd-40cb-a36d-d06613ade5cb up in Southbound
Nov 22 08:01:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:56.948 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:2c:86 10.100.0.12'], port_security=['fa:16:3e:62:2c:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6efb9379-6030-46a2-bd5f-60441b08d2ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157128f-75e8-4afb-ab55-34580af9585f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd967f0cef958482c9711764882a146f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'db55d655-ec9a-40ef-9e54-3247c3ea4f75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c031f4-6b41-4ee7-af4f-a9218d9b390c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=71b69b21-29dd-40cb-a36d-d06613ade5cb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:01:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:56.949 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 71b69b21-29dd-40cb-a36d-d06613ade5cb in datapath c157128f-75e8-4afb-ab55-34580af9585f bound to our chassis
Nov 22 08:01:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:56.951 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:01:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:56.967 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[261617e4-12f0-45bb-8272-f1c55624e940]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.011 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9f4263ce-b66f-4a5a-92e9-d233c85ac401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.015 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e8443fda-aaf3-4ca7-8fe1-786e6bff8b21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.044 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4a4a0e-dafa-4c94-a378-9b838d7c06a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.060 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cd71ecb4-3f56-45e9-b041-898392e331b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157128f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:41:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524460, 'reachable_time': 44678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229981, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.076 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6641098a-c9c5-417e-90bc-a367cfc1a8a0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc157128f-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524472, 'tstamp': 524472}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229982, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc157128f-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524476, 'tstamp': 524476}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229982, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.078 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157128f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.079 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.081 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc157128f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.081 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.082 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc157128f-70, col_values=(('external_ids', {'iface-id': 'a61c8ae7-262d-45c7-859e-6a4502225b00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.082 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.216 186548 DEBUG nova.compute.manager [req-d1516820-04d9-4d1c-ad70-caa3a7a7cb25 req-78c43c42-7d5d-45b6-aa00-6b251bdc66fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received event network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.217 186548 DEBUG oslo_concurrency.lockutils [req-d1516820-04d9-4d1c-ad70-caa3a7a7cb25 req-78c43c42-7d5d-45b6-aa00-6b251bdc66fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.217 186548 DEBUG oslo_concurrency.lockutils [req-d1516820-04d9-4d1c-ad70-caa3a7a7cb25 req-78c43c42-7d5d-45b6-aa00-6b251bdc66fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.217 186548 DEBUG oslo_concurrency.lockutils [req-d1516820-04d9-4d1c-ad70-caa3a7a7cb25 req-78c43c42-7d5d-45b6-aa00-6b251bdc66fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.218 186548 DEBUG nova.compute.manager [req-d1516820-04d9-4d1c-ad70-caa3a7a7cb25 req-78c43c42-7d5d-45b6-aa00-6b251bdc66fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Processing event network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:01:57 compute-0 podman[229983]: 2025-11-22 08:01:57.395326183 +0000 UTC m=+0.047023805 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.765 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:01:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:01:57.766 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.768 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.850 186548 DEBUG nova.network.neutron [req-22af3276-895e-4a47-a292-878ad1e1582f req-2cda7838-0b4f-449c-97f5-0afa0c059829 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Updated VIF entry in instance network info cache for port 71b69b21-29dd-40cb-a36d-d06613ade5cb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.850 186548 DEBUG nova.network.neutron [req-22af3276-895e-4a47-a292-878ad1e1582f req-2cda7838-0b4f-449c-97f5-0afa0c059829 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Updating instance_info_cache with network_info: [{"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:01:57 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.864 186548 DEBUG oslo_concurrency.lockutils [req-22af3276-895e-4a47-a292-878ad1e1582f req-2cda7838-0b4f-449c-97f5-0afa0c059829 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-6efb9379-6030-46a2-bd5f-60441b08d2ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:57.999 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798517.998799, 6efb9379-6030-46a2-bd5f-60441b08d2ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.001 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] VM Started (Lifecycle Event)
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.003 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.012 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.018 186548 INFO nova.virt.libvirt.driver [-] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Instance spawned successfully.
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.020 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.023 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.027 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.049 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.050 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798517.9991364, 6efb9379-6030-46a2-bd5f-60441b08d2ff => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.050 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] VM Paused (Lifecycle Event)
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.053 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.053 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.054 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.054 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.055 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.055 186548 DEBUG nova.virt.libvirt.driver [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.076 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.080 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798518.012576, 6efb9379-6030-46a2-bd5f-60441b08d2ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.081 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] VM Resumed (Lifecycle Event)
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.238 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.241 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.263 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.274 186548 INFO nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Took 7.87 seconds to spawn the instance on the hypervisor.
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.275 186548 DEBUG nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.338 186548 INFO nova.compute.manager [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Took 8.36 seconds to build instance.
Nov 22 08:01:58 compute-0 nova_compute[186544]: 2025-11-22 08:01:58.361 186548 DEBUG oslo_concurrency.lockutils [None req-2f0a7d24-2c19-4cbf-99f2-56418aacff84 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:59 compute-0 nova_compute[186544]: 2025-11-22 08:01:59.322 186548 DEBUG nova.compute.manager [req-e752ec5f-115f-4681-8f8d-4085bd8ad24a req-cfde2393-0833-48b6-a3a2-bad242956cba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received event network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:01:59 compute-0 nova_compute[186544]: 2025-11-22 08:01:59.323 186548 DEBUG oslo_concurrency.lockutils [req-e752ec5f-115f-4681-8f8d-4085bd8ad24a req-cfde2393-0833-48b6-a3a2-bad242956cba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:01:59 compute-0 nova_compute[186544]: 2025-11-22 08:01:59.323 186548 DEBUG oslo_concurrency.lockutils [req-e752ec5f-115f-4681-8f8d-4085bd8ad24a req-cfde2393-0833-48b6-a3a2-bad242956cba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:01:59 compute-0 nova_compute[186544]: 2025-11-22 08:01:59.324 186548 DEBUG oslo_concurrency.lockutils [req-e752ec5f-115f-4681-8f8d-4085bd8ad24a req-cfde2393-0833-48b6-a3a2-bad242956cba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:01:59 compute-0 nova_compute[186544]: 2025-11-22 08:01:59.324 186548 DEBUG nova.compute.manager [req-e752ec5f-115f-4681-8f8d-4085bd8ad24a req-cfde2393-0833-48b6-a3a2-bad242956cba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] No waiting events found dispatching network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:01:59 compute-0 nova_compute[186544]: 2025-11-22 08:01:59.324 186548 WARNING nova.compute.manager [req-e752ec5f-115f-4681-8f8d-4085bd8ad24a req-cfde2393-0833-48b6-a3a2-bad242956cba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received unexpected event network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb for instance with vm_state active and task_state None.
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.364 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.633 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "6efb9379-6030-46a2-bd5f-60441b08d2ff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.634 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.635 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.635 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.635 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.643 186548 INFO nova.compute.manager [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Terminating instance
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.650 186548 DEBUG nova.compute.manager [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.661 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 kernel: tap71b69b21-29 (unregistering): left promiscuous mode
Nov 22 08:02:00 compute-0 NetworkManager[55036]: <info>  [1763798520.6825] device (tap71b69b21-29): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:02:00 compute-0 ovn_controller[94843]: 2025-11-22T08:02:00Z|00406|binding|INFO|Releasing lport 71b69b21-29dd-40cb-a36d-d06613ade5cb from this chassis (sb_readonly=0)
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.693 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 ovn_controller[94843]: 2025-11-22T08:02:00Z|00407|binding|INFO|Setting lport 71b69b21-29dd-40cb-a36d-d06613ade5cb down in Southbound
Nov 22 08:02:00 compute-0 ovn_controller[94843]: 2025-11-22T08:02:00Z|00408|binding|INFO|Removing iface tap71b69b21-29 ovn-installed in OVS
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.697 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.706 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:2c:86 10.100.0.12'], port_security=['fa:16:3e:62:2c:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6efb9379-6030-46a2-bd5f-60441b08d2ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157128f-75e8-4afb-ab55-34580af9585f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd967f0cef958482c9711764882a146f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'db55d655-ec9a-40ef-9e54-3247c3ea4f75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c031f4-6b41-4ee7-af4f-a9218d9b390c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=71b69b21-29dd-40cb-a36d-d06613ade5cb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.707 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.709 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 71b69b21-29dd-40cb-a36d-d06613ade5cb in datapath c157128f-75e8-4afb-ab55-34580af9585f unbound from our chassis
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.711 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:02:00 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000062.scope: Deactivated successfully.
Nov 22 08:02:00 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000062.scope: Consumed 3.602s CPU time.
Nov 22 08:02:00 compute-0 systemd-machined[152872]: Machine qemu-54-instance-00000062 terminated.
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.728 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8dcca3c3-968b-4e47-badd-84627d7c20b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.754 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[604106b6-9802-4481-a0c4-d43777172cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.757 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[651ec1c3-2cc4-4201-b847-9953e23c794e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.785 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0f01b5-3e30-4426-8598-22fd8db161cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.802 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[146499eb-fdb4-4a2b-84ed-9e9092f0fbf4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157128f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:41:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524460, 'reachable_time': 44678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230043, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.818 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f27fd97b-2382-4169-a205-eb22357add29]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc157128f-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524472, 'tstamp': 524472}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230044, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc157128f-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524476, 'tstamp': 524476}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230044, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.819 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157128f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.821 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.826 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.827 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc157128f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.827 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.828 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc157128f-70, col_values=(('external_ids', {'iface-id': 'a61c8ae7-262d-45c7-859e-6a4502225b00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:00.828 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.915 186548 INFO nova.virt.libvirt.driver [-] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Instance destroyed successfully.
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.915 186548 DEBUG nova.objects.instance [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'resources' on Instance uuid 6efb9379-6030-46a2-bd5f-60441b08d2ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.929 186548 DEBUG nova.virt.libvirt.vif [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-2049160290',display_name='tempest-ServersNegativeTestJSON-server-2049160290',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-2049160290',id=98,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-tj74lunn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTestJSON-1872924472-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:01:58Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=6efb9379-6030-46a2-bd5f-60441b08d2ff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.930 186548 DEBUG nova.network.os_vif_util [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "address": "fa:16:3e:62:2c:86", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71b69b21-29", "ovs_interfaceid": "71b69b21-29dd-40cb-a36d-d06613ade5cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.931 186548 DEBUG nova.network.os_vif_util [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:2c:86,bridge_name='br-int',has_traffic_filtering=True,id=71b69b21-29dd-40cb-a36d-d06613ade5cb,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b69b21-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.931 186548 DEBUG os_vif [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:2c:86,bridge_name='br-int',has_traffic_filtering=True,id=71b69b21-29dd-40cb-a36d-d06613ade5cb,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b69b21-29') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.933 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.934 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71b69b21-29, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.935 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.936 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.938 186548 INFO os_vif [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:2c:86,bridge_name='br-int',has_traffic_filtering=True,id=71b69b21-29dd-40cb-a36d-d06613ade5cb,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71b69b21-29')
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.938 186548 INFO nova.virt.libvirt.driver [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Deleting instance files /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff_del
Nov 22 08:02:00 compute-0 nova_compute[186544]: 2025-11-22 08:02:00.939 186548 INFO nova.virt.libvirt.driver [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Deletion of /var/lib/nova/instances/6efb9379-6030-46a2-bd5f-60441b08d2ff_del complete
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.087 186548 INFO nova.compute.manager [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Took 0.44 seconds to destroy the instance on the hypervisor.
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.088 186548 DEBUG oslo.service.loopingcall [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.088 186548 DEBUG nova.compute.manager [-] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.089 186548 DEBUG nova.network.neutron [-] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.406 186548 DEBUG nova.compute.manager [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received event network-vif-unplugged-71b69b21-29dd-40cb-a36d-d06613ade5cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.406 186548 DEBUG oslo_concurrency.lockutils [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.406 186548 DEBUG oslo_concurrency.lockutils [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.407 186548 DEBUG oslo_concurrency.lockutils [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.407 186548 DEBUG nova.compute.manager [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] No waiting events found dispatching network-vif-unplugged-71b69b21-29dd-40cb-a36d-d06613ade5cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.407 186548 DEBUG nova.compute.manager [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received event network-vif-unplugged-71b69b21-29dd-40cb-a36d-d06613ade5cb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.407 186548 DEBUG nova.compute.manager [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received event network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.407 186548 DEBUG oslo_concurrency.lockutils [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.408 186548 DEBUG oslo_concurrency.lockutils [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.408 186548 DEBUG oslo_concurrency.lockutils [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.408 186548 DEBUG nova.compute.manager [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] No waiting events found dispatching network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:02:01 compute-0 nova_compute[186544]: 2025-11-22 08:02:01.408 186548 WARNING nova.compute.manager [req-e6f8cc94-9693-41e5-980e-7aea2f2a99f3 req-07f34af2-5d4e-4c32-acc1-3b414116250d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received unexpected event network-vif-plugged-71b69b21-29dd-40cb-a36d-d06613ade5cb for instance with vm_state active and task_state deleting.
Nov 22 08:02:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:01.767 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:01 compute-0 ovn_controller[94843]: 2025-11-22T08:02:01Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:d6:3e 10.100.0.8
Nov 22 08:02:01 compute-0 ovn_controller[94843]: 2025-11-22T08:02:01Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:d6:3e 10.100.0.8
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.085 186548 DEBUG nova.network.neutron [-] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.100 186548 INFO nova.compute.manager [-] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Took 1.01 seconds to deallocate network for instance.
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.133 186548 DEBUG nova.compute.manager [req-f4642990-3d11-4cb6-b85b-413e981abb05 req-5e303377-3038-4850-915d-6a8cd294c1c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Received event network-vif-deleted-71b69b21-29dd-40cb-a36d-d06613ade5cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.160 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.161 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.230 186548 DEBUG nova.compute.provider_tree [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.244 186548 DEBUG nova.scheduler.client.report [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.273 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.296 186548 INFO nova.scheduler.client.report [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Deleted allocations for instance 6efb9379-6030-46a2-bd5f-60441b08d2ff
Nov 22 08:02:02 compute-0 nova_compute[186544]: 2025-11-22 08:02:02.365 186548 DEBUG oslo_concurrency.lockutils [None req-cf637ac6-1f4d-47df-b6d7-0e09863db6ca 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "6efb9379-6030-46a2-bd5f-60441b08d2ff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:03 compute-0 podman[230063]: 2025-11-22 08:02:03.426333574 +0000 UTC m=+0.065631786 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.554 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.555 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.574 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.664 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.664 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.672 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.672 186548 INFO nova.compute.claims [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.779 186548 DEBUG nova.compute.provider_tree [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.794 186548 DEBUG nova.scheduler.client.report [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.817 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.817 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.864 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.865 186548 DEBUG nova.network.neutron [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.880 186548 INFO nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:02:03 compute-0 nova_compute[186544]: 2025-11-22 08:02:03.896 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.007 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.009 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.009 186548 INFO nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Creating image(s)
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.010 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "/var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.010 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.011 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "/var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.025 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.069 186548 DEBUG nova.policy [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd0c5153b41c5499bac372d2df10b9b03', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '62d9a4a13f5d41529bc273c278fae96b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.092 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.093 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.094 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.105 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.160 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.161 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.451 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk 1073741824" returned: 0 in 0.291s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.452 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.453 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.514 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.516 186548 DEBUG nova.virt.disk.api [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Checking if we can resize image /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.516 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.577 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.577 186548 DEBUG nova.virt.disk.api [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Cannot resize image /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.578 186548 DEBUG nova.objects.instance [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'migration_context' on Instance uuid 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.591 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.591 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Ensure instance console log exists: /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.592 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.592 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.592 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:04 compute-0 nova_compute[186544]: 2025-11-22 08:02:04.855 186548 DEBUG nova.network.neutron [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Successfully created port: 95fc5a99-476d-4f95-b80a-2a134fa639ed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:02:05 compute-0 nova_compute[186544]: 2025-11-22 08:02:05.665 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:05 compute-0 nova_compute[186544]: 2025-11-22 08:02:05.935 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:09 compute-0 podman[230099]: 2025-11-22 08:02:09.412807071 +0000 UTC m=+0.052816679 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 22 08:02:09 compute-0 podman[230120]: 2025-11-22 08:02:09.488645559 +0000 UTC m=+0.048763999 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:02:10 compute-0 nova_compute[186544]: 2025-11-22 08:02:10.160 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:10 compute-0 nova_compute[186544]: 2025-11-22 08:02:10.667 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:10 compute-0 nova_compute[186544]: 2025-11-22 08:02:10.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:11 compute-0 nova_compute[186544]: 2025-11-22 08:02:11.214 186548 DEBUG nova.network.neutron [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Successfully updated port: 95fc5a99-476d-4f95-b80a-2a134fa639ed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:02:11 compute-0 nova_compute[186544]: 2025-11-22 08:02:11.235 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:02:11 compute-0 nova_compute[186544]: 2025-11-22 08:02:11.235 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:02:11 compute-0 nova_compute[186544]: 2025-11-22 08:02:11.236 186548 DEBUG nova.network.neutron [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:02:11 compute-0 nova_compute[186544]: 2025-11-22 08:02:11.443 186548 DEBUG nova.network.neutron [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:02:11 compute-0 nova_compute[186544]: 2025-11-22 08:02:11.732 186548 DEBUG nova.compute.manager [req-56f6cb53-10ac-42ef-ab05-f020364f8079 req-e1965afa-402c-4e18-ab19-3720dc801e46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Received event network-changed-95fc5a99-476d-4f95-b80a-2a134fa639ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:11 compute-0 nova_compute[186544]: 2025-11-22 08:02:11.732 186548 DEBUG nova.compute.manager [req-56f6cb53-10ac-42ef-ab05-f020364f8079 req-e1965afa-402c-4e18-ab19-3720dc801e46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Refreshing instance network info cache due to event network-changed-95fc5a99-476d-4f95-b80a-2a134fa639ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:02:11 compute-0 nova_compute[186544]: 2025-11-22 08:02:11.733 186548 DEBUG oslo_concurrency.lockutils [req-56f6cb53-10ac-42ef-ab05-f020364f8079 req-e1965afa-402c-4e18-ab19-3720dc801e46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.184 186548 DEBUG nova.network.neutron [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Updating instance_info_cache with network_info: [{"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.213 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.214 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Instance network_info: |[{"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.214 186548 DEBUG oslo_concurrency.lockutils [req-56f6cb53-10ac-42ef-ab05-f020364f8079 req-e1965afa-402c-4e18-ab19-3720dc801e46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.215 186548 DEBUG nova.network.neutron [req-56f6cb53-10ac-42ef-ab05-f020364f8079 req-e1965afa-402c-4e18-ab19-3720dc801e46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Refreshing network info cache for port 95fc5a99-476d-4f95-b80a-2a134fa639ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.218 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Start _get_guest_xml network_info=[{"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.223 186548 WARNING nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.227 186548 DEBUG nova.virt.libvirt.host [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.228 186548 DEBUG nova.virt.libvirt.host [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.231 186548 DEBUG nova.virt.libvirt.host [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.232 186548 DEBUG nova.virt.libvirt.host [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.233 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.233 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.233 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.234 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.234 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.234 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.234 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.234 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.235 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.235 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.235 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.235 186548 DEBUG nova.virt.hardware [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.239 186548 DEBUG nova.virt.libvirt.vif [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:02:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-279343321',display_name='tempest-ServerActionsTestOtherB-server-279343321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-279343321',id=99,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-lp61ywby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestO
therB-270195081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:03Z,user_data=None,user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=1a3633d4-fe6f-4956-974f-e00f2bf9d8d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.239 186548 DEBUG nova.network.os_vif_util [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.240 186548 DEBUG nova.network.os_vif_util [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:25:07,bridge_name='br-int',has_traffic_filtering=True,id=95fc5a99-476d-4f95-b80a-2a134fa639ed,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95fc5a99-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.241 186548 DEBUG nova.objects.instance [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'pci_devices' on Instance uuid 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.251 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <uuid>1a3633d4-fe6f-4956-974f-e00f2bf9d8d0</uuid>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <name>instance-00000063</name>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestOtherB-server-279343321</nova:name>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:02:13</nova:creationTime>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:02:13 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:02:13 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:02:13 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:02:13 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:02:13 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:02:13 compute-0 nova_compute[186544]:         <nova:user uuid="d0c5153b41c5499bac372d2df10b9b03">tempest-ServerActionsTestOtherB-270195081-project-member</nova:user>
Nov 22 08:02:13 compute-0 nova_compute[186544]:         <nova:project uuid="62d9a4a13f5d41529bc273c278fae96b">tempest-ServerActionsTestOtherB-270195081</nova:project>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:02:13 compute-0 nova_compute[186544]:         <nova:port uuid="95fc5a99-476d-4f95-b80a-2a134fa639ed">
Nov 22 08:02:13 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <system>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <entry name="serial">1a3633d4-fe6f-4956-974f-e00f2bf9d8d0</entry>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <entry name="uuid">1a3633d4-fe6f-4956-974f-e00f2bf9d8d0</entry>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </system>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <os>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   </os>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <features>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   </features>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk.config"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:19:25:07"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <target dev="tap95fc5a99-47"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/console.log" append="off"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <video>
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </video>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:02:13 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:02:13 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:02:13 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:02:13 compute-0 nova_compute[186544]: </domain>
Nov 22 08:02:13 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.252 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Preparing to wait for external event network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.252 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.253 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.253 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.254 186548 DEBUG nova.virt.libvirt.vif [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:02:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-279343321',display_name='tempest-ServerActionsTestOtherB-server-279343321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-279343321',id=99,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-lp61ywby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerAc
tionsTestOtherB-270195081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:03Z,user_data=None,user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=1a3633d4-fe6f-4956-974f-e00f2bf9d8d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.254 186548 DEBUG nova.network.os_vif_util [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.255 186548 DEBUG nova.network.os_vif_util [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:25:07,bridge_name='br-int',has_traffic_filtering=True,id=95fc5a99-476d-4f95-b80a-2a134fa639ed,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95fc5a99-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.255 186548 DEBUG os_vif [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:25:07,bridge_name='br-int',has_traffic_filtering=True,id=95fc5a99-476d-4f95-b80a-2a134fa639ed,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95fc5a99-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.256 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.256 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.256 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.259 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.260 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap95fc5a99-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.260 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap95fc5a99-47, col_values=(('external_ids', {'iface-id': '95fc5a99-476d-4f95-b80a-2a134fa639ed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:25:07', 'vm-uuid': '1a3633d4-fe6f-4956-974f-e00f2bf9d8d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.262 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:13 compute-0 NetworkManager[55036]: <info>  [1763798533.2633] manager: (tap95fc5a99-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.264 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.267 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.268 186548 INFO os_vif [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:25:07,bridge_name='br-int',has_traffic_filtering=True,id=95fc5a99-476d-4f95-b80a-2a134fa639ed,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95fc5a99-47')
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.406 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.407 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.407 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] No VIF found with MAC fa:16:3e:19:25:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.407 186548 INFO nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Using config drive
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.727 186548 INFO nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Creating config drive at /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk.config
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.732 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg862qqvc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.858 186548 DEBUG oslo_concurrency.processutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg862qqvc" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:13 compute-0 kernel: tap95fc5a99-47: entered promiscuous mode
Nov 22 08:02:13 compute-0 NetworkManager[55036]: <info>  [1763798533.9190] manager: (tap95fc5a99-47): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.920 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:13 compute-0 ovn_controller[94843]: 2025-11-22T08:02:13Z|00409|binding|INFO|Claiming lport 95fc5a99-476d-4f95-b80a-2a134fa639ed for this chassis.
Nov 22 08:02:13 compute-0 ovn_controller[94843]: 2025-11-22T08:02:13Z|00410|binding|INFO|95fc5a99-476d-4f95-b80a-2a134fa639ed: Claiming fa:16:3e:19:25:07 10.100.0.9
Nov 22 08:02:13 compute-0 ovn_controller[94843]: 2025-11-22T08:02:13Z|00411|binding|INFO|Setting lport 95fc5a99-476d-4f95-b80a-2a134fa639ed ovn-installed in OVS
Nov 22 08:02:13 compute-0 nova_compute[186544]: 2025-11-22 08:02:13.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:13 compute-0 ovn_controller[94843]: 2025-11-22T08:02:13Z|00412|binding|INFO|Setting lport 95fc5a99-476d-4f95-b80a-2a134fa639ed up in Southbound
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.937 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:25:07 10.100.0.9'], port_security=['fa:16:3e:19:25:07 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1a3633d4-fe6f-4956-974f-e00f2bf9d8d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80908aff-0365-41dd-a88b-8ec1981e86fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=95fc5a99-476d-4f95-b80a-2a134fa639ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.938 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 95fc5a99-476d-4f95-b80a-2a134fa639ed in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 bound to our chassis
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.939 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f7727db5-43a6-48f6-abbf-aa184d8ad087
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.951 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f09ed1d2-2f57-403d-b3f6-d5f0199eb92a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.952 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf7727db5-41 in ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.954 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf7727db5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.954 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[132064b6-c031-4b14-ac90-c6e1aed020c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:13 compute-0 systemd-udevd[230166]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.955 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e46da837-9827-4ac1-9cb1-eea57b2331d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:13 compute-0 systemd-machined[152872]: New machine qemu-55-instance-00000063.
Nov 22 08:02:13 compute-0 NetworkManager[55036]: <info>  [1763798533.9674] device (tap95fc5a99-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:02:13 compute-0 NetworkManager[55036]: <info>  [1763798533.9683] device (tap95fc5a99-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.968 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f8c483-0c03-411b-9719-385ba4eb7808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:13 compute-0 systemd[1]: Started Virtual Machine qemu-55-instance-00000063.
Nov 22 08:02:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:13.985 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[515ca8c1-648d-4892-af52-9f02882a739f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.016 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b5573d87-43a0-4af0-8ae9-7030b732bdad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.020 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b4adaaf6-fc25-4fda-8f07-90fd571baa80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 NetworkManager[55036]: <info>  [1763798534.0219] manager: (tapf7727db5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/196)
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.049 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9ece6caf-4307-4cc1-b80d-3d3283fb8c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.052 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[911693a4-fd36-4ea0-82e7-75742290f218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 NetworkManager[55036]: <info>  [1763798534.0800] device (tapf7727db5-40): carrier: link connected
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.085 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[3abfb2b2-4736-4aca-a434-337d360c3bd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.102 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc0b8c9-6ddc-40dc-8a3f-445105389b38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527471, 'reachable_time': 36531, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230199, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.117 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ad620023-66bb-4e7e-b23e-705316899519]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:3e1a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527471, 'tstamp': 527471}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230200, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.132 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[62e23afd-baa9-4ab3-a645-3135fdaa18c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf7727db5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:3e:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527471, 'reachable_time': 36531, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230201, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.161 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7fc66f39-087a-4e7b-a5e9-025ea30ecc27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.218 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f859e2a4-c8fe-4378-ab9a-3376e6237197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.219 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.219 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.220 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7727db5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.221 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:14 compute-0 kernel: tapf7727db5-40: entered promiscuous mode
Nov 22 08:02:14 compute-0 NetworkManager[55036]: <info>  [1763798534.2225] manager: (tapf7727db5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.224 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf7727db5-40, col_values=(('external_ids', {'iface-id': '188249cb-6e2b-4c68-9c53-aaa0a3da466f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.225 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:14 compute-0 ovn_controller[94843]: 2025-11-22T08:02:14Z|00413|binding|INFO|Releasing lport 188249cb-6e2b-4c68-9c53-aaa0a3da466f from this chassis (sb_readonly=0)
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.236 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.239 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.240 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e6958823-7f7a-47c0-b252-03b8c634bdf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.241 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-f7727db5-43a6-48f6-abbf-aa184d8ad087
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/f7727db5-43a6-48f6-abbf-aa184d8ad087.pid.haproxy
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID f7727db5-43a6-48f6-abbf-aa184d8ad087
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:02:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:14.242 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'env', 'PROCESS_TAG=haproxy-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f7727db5-43a6-48f6-abbf-aa184d8ad087.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.461 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798534.4606533, 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.462 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] VM Started (Lifecycle Event)
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.478 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.482 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798534.4609807, 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.482 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] VM Paused (Lifecycle Event)
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.496 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.499 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.516 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:02:14 compute-0 podman[230241]: 2025-11-22 08:02:14.576516617 +0000 UTC m=+0.022439796 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.812 186548 DEBUG nova.compute.manager [req-8363fe6f-c863-4fa8-b7e8-6e351f49ccc0 req-e2205195-918e-4a49-ab2e-71480c21fb60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Received event network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.812 186548 DEBUG oslo_concurrency.lockutils [req-8363fe6f-c863-4fa8-b7e8-6e351f49ccc0 req-e2205195-918e-4a49-ab2e-71480c21fb60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.813 186548 DEBUG oslo_concurrency.lockutils [req-8363fe6f-c863-4fa8-b7e8-6e351f49ccc0 req-e2205195-918e-4a49-ab2e-71480c21fb60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.813 186548 DEBUG oslo_concurrency.lockutils [req-8363fe6f-c863-4fa8-b7e8-6e351f49ccc0 req-e2205195-918e-4a49-ab2e-71480c21fb60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.813 186548 DEBUG nova.compute.manager [req-8363fe6f-c863-4fa8-b7e8-6e351f49ccc0 req-e2205195-918e-4a49-ab2e-71480c21fb60 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Processing event network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.814 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:02:14 compute-0 podman[230241]: 2025-11-22 08:02:14.815663909 +0000 UTC m=+0.261587068 container create 6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.819 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798534.8196156, 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.820 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] VM Resumed (Lifecycle Event)
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.823 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.826 186548 INFO nova.virt.libvirt.driver [-] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Instance spawned successfully.
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.826 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.848 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.853 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.854 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.854 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.855 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.855 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.855 186548 DEBUG nova.virt.libvirt.driver [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.860 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:02:14 compute-0 systemd[1]: Started libpod-conmon-6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de.scope.
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.896 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:02:14 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:02:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce1715cb1a4dd580c4507b1aa78dc1b5bd48fd73c4f96a747668aea94f1f4894/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.935 186548 INFO nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Took 10.93 seconds to spawn the instance on the hypervisor.
Nov 22 08:02:14 compute-0 nova_compute[186544]: 2025-11-22 08:02:14.936 186548 DEBUG nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.013 186548 INFO nova.compute.manager [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Took 11.37 seconds to build instance.
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.032 186548 DEBUG oslo_concurrency.lockutils [None req-2c4338e8-07c6-4981-82a2-14770e18b33e d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:15 compute-0 podman[230241]: 2025-11-22 08:02:15.061791383 +0000 UTC m=+0.507714562 container init 6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 08:02:15 compute-0 podman[230241]: 2025-11-22 08:02:15.06728432 +0000 UTC m=+0.513207469 container start 6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:02:15 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[230256]: [NOTICE]   (230260) : New worker (230262) forked
Nov 22 08:02:15 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[230256]: [NOTICE]   (230260) : Loading success.
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.422 186548 DEBUG nova.network.neutron [req-56f6cb53-10ac-42ef-ab05-f020364f8079 req-e1965afa-402c-4e18-ab19-3720dc801e46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Updated VIF entry in instance network info cache for port 95fc5a99-476d-4f95-b80a-2a134fa639ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.423 186548 DEBUG nova.network.neutron [req-56f6cb53-10ac-42ef-ab05-f020364f8079 req-e1965afa-402c-4e18-ab19-3720dc801e46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Updating instance_info_cache with network_info: [{"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.437 186548 DEBUG oslo_concurrency.lockutils [req-56f6cb53-10ac-42ef-ab05-f020364f8079 req-e1965afa-402c-4e18-ab19-3720dc801e46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.670 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.915 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798520.913508, 6efb9379-6030-46a2-bd5f-60441b08d2ff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.915 186548 INFO nova.compute.manager [-] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] VM Stopped (Lifecycle Event)
Nov 22 08:02:15 compute-0 nova_compute[186544]: 2025-11-22 08:02:15.954 186548 DEBUG nova.compute.manager [None req-daaa8734-a2d4-4185-9157-41125548de6f - - - - - -] [instance: 6efb9379-6030-46a2-bd5f-60441b08d2ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:16 compute-0 nova_compute[186544]: 2025-11-22 08:02:16.908 186548 DEBUG nova.compute.manager [req-2d4c6601-2a9a-4f66-b1fe-298825115c96 req-73803785-bf90-4af1-b8f1-d7afb195be73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Received event network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:16 compute-0 nova_compute[186544]: 2025-11-22 08:02:16.909 186548 DEBUG oslo_concurrency.lockutils [req-2d4c6601-2a9a-4f66-b1fe-298825115c96 req-73803785-bf90-4af1-b8f1-d7afb195be73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:16 compute-0 nova_compute[186544]: 2025-11-22 08:02:16.909 186548 DEBUG oslo_concurrency.lockutils [req-2d4c6601-2a9a-4f66-b1fe-298825115c96 req-73803785-bf90-4af1-b8f1-d7afb195be73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:16 compute-0 nova_compute[186544]: 2025-11-22 08:02:16.909 186548 DEBUG oslo_concurrency.lockutils [req-2d4c6601-2a9a-4f66-b1fe-298825115c96 req-73803785-bf90-4af1-b8f1-d7afb195be73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:16 compute-0 nova_compute[186544]: 2025-11-22 08:02:16.910 186548 DEBUG nova.compute.manager [req-2d4c6601-2a9a-4f66-b1fe-298825115c96 req-73803785-bf90-4af1-b8f1-d7afb195be73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] No waiting events found dispatching network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:02:16 compute-0 nova_compute[186544]: 2025-11-22 08:02:16.910 186548 WARNING nova.compute.manager [req-2d4c6601-2a9a-4f66-b1fe-298825115c96 req-73803785-bf90-4af1-b8f1-d7afb195be73 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Received unexpected event network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed for instance with vm_state active and task_state None.
Nov 22 08:02:17 compute-0 nova_compute[186544]: 2025-11-22 08:02:17.045 186548 INFO nova.compute.manager [None req-3d21b2e9-e17a-4f37-ac4f-c9cfb103d53f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Pausing
Nov 22 08:02:17 compute-0 nova_compute[186544]: 2025-11-22 08:02:17.046 186548 DEBUG nova.objects.instance [None req-3d21b2e9-e17a-4f37-ac4f-c9cfb103d53f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'flavor' on Instance uuid 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:17 compute-0 nova_compute[186544]: 2025-11-22 08:02:17.087 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798537.087393, 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:17 compute-0 nova_compute[186544]: 2025-11-22 08:02:17.088 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] VM Paused (Lifecycle Event)
Nov 22 08:02:17 compute-0 nova_compute[186544]: 2025-11-22 08:02:17.090 186548 DEBUG nova.compute.manager [None req-3d21b2e9-e17a-4f37-ac4f-c9cfb103d53f d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:17 compute-0 nova_compute[186544]: 2025-11-22 08:02:17.109 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:17 compute-0 nova_compute[186544]: 2025-11-22 08:02:17.113 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:02:17 compute-0 nova_compute[186544]: 2025-11-22 08:02:17.141 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] During sync_power_state the instance has a pending task (pausing). Skip.
Nov 22 08:02:18 compute-0 nova_compute[186544]: 2025-11-22 08:02:18.264 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.398 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.399 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.399 186548 INFO nova.compute.manager [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Shelving
Nov 22 08:02:20 compute-0 kernel: tap95fc5a99-47 (unregistering): left promiscuous mode
Nov 22 08:02:20 compute-0 NetworkManager[55036]: <info>  [1763798540.4535] device (tap95fc5a99-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:02:20 compute-0 ovn_controller[94843]: 2025-11-22T08:02:20Z|00414|binding|INFO|Releasing lport 95fc5a99-476d-4f95-b80a-2a134fa639ed from this chassis (sb_readonly=0)
Nov 22 08:02:20 compute-0 ovn_controller[94843]: 2025-11-22T08:02:20Z|00415|binding|INFO|Setting lport 95fc5a99-476d-4f95-b80a-2a134fa639ed down in Southbound
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:20 compute-0 ovn_controller[94843]: 2025-11-22T08:02:20Z|00416|binding|INFO|Removing iface tap95fc5a99-47 ovn-installed in OVS
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.465 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.475 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:20.479 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:25:07 10.100.0.9'], port_security=['fa:16:3e:19:25:07 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1a3633d4-fe6f-4956-974f-e00f2bf9d8d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62d9a4a13f5d41529bc273c278fae96b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80908aff-0365-41dd-a88b-8ec1981e86fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2f56b43-4627-4c45-bd62-967c8ee835ae, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=95fc5a99-476d-4f95-b80a-2a134fa639ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:02:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:20.481 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 95fc5a99-476d-4f95-b80a-2a134fa639ed in datapath f7727db5-43a6-48f6-abbf-aa184d8ad087 unbound from our chassis
Nov 22 08:02:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:20.482 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7727db5-43a6-48f6-abbf-aa184d8ad087, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:02:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:20.483 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[31d8715f-f202-48e4-b226-c5cea52b6930]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:20.483 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 namespace which is not needed anymore
Nov 22 08:02:20 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000063.scope: Deactivated successfully.
Nov 22 08:02:20 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000063.scope: Consumed 2.734s CPU time.
Nov 22 08:02:20 compute-0 systemd-machined[152872]: Machine qemu-55-instance-00000063 terminated.
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.671 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.691 186548 INFO nova.virt.libvirt.driver [-] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Instance destroyed successfully.
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.692 186548 DEBUG nova.objects.instance [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'numa_topology' on Instance uuid 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.709 186548 DEBUG nova.compute.manager [req-fa68cb98-6d4f-4d9a-af20-dee95119d4a6 req-6db66298-3a75-4c50-b451-e0d07f122351 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Received event network-vif-unplugged-95fc5a99-476d-4f95-b80a-2a134fa639ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.710 186548 DEBUG oslo_concurrency.lockutils [req-fa68cb98-6d4f-4d9a-af20-dee95119d4a6 req-6db66298-3a75-4c50-b451-e0d07f122351 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.710 186548 DEBUG oslo_concurrency.lockutils [req-fa68cb98-6d4f-4d9a-af20-dee95119d4a6 req-6db66298-3a75-4c50-b451-e0d07f122351 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.710 186548 DEBUG oslo_concurrency.lockutils [req-fa68cb98-6d4f-4d9a-af20-dee95119d4a6 req-6db66298-3a75-4c50-b451-e0d07f122351 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.710 186548 DEBUG nova.compute.manager [req-fa68cb98-6d4f-4d9a-af20-dee95119d4a6 req-6db66298-3a75-4c50-b451-e0d07f122351 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] No waiting events found dispatching network-vif-unplugged-95fc5a99-476d-4f95-b80a-2a134fa639ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:02:20 compute-0 nova_compute[186544]: 2025-11-22 08:02:20.711 186548 WARNING nova.compute.manager [req-fa68cb98-6d4f-4d9a-af20-dee95119d4a6 req-6db66298-3a75-4c50-b451-e0d07f122351 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Received unexpected event network-vif-unplugged-95fc5a99-476d-4f95-b80a-2a134fa639ed for instance with vm_state paused and task_state shelving.
Nov 22 08:02:20 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[230256]: [NOTICE]   (230260) : haproxy version is 2.8.14-c23fe91
Nov 22 08:02:20 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[230256]: [NOTICE]   (230260) : path to executable is /usr/sbin/haproxy
Nov 22 08:02:20 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[230256]: [WARNING]  (230260) : Exiting Master process...
Nov 22 08:02:20 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[230256]: [WARNING]  (230260) : Exiting Master process...
Nov 22 08:02:20 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[230256]: [ALERT]    (230260) : Current worker (230262) exited with code 143 (Terminated)
Nov 22 08:02:20 compute-0 neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087[230256]: [WARNING]  (230260) : All workers exited. Exiting... (0)
Nov 22 08:02:20 compute-0 systemd[1]: libpod-6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de.scope: Deactivated successfully.
Nov 22 08:02:20 compute-0 podman[230295]: 2025-11-22 08:02:20.792463899 +0000 UTC m=+0.214929663 container died 6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:02:21 compute-0 nova_compute[186544]: 2025-11-22 08:02:21.026 186548 INFO nova.virt.libvirt.driver [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Beginning cold snapshot process
Nov 22 08:02:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de-userdata-shm.mount: Deactivated successfully.
Nov 22 08:02:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-ce1715cb1a4dd580c4507b1aa78dc1b5bd48fd73c4f96a747668aea94f1f4894-merged.mount: Deactivated successfully.
Nov 22 08:02:21 compute-0 nova_compute[186544]: 2025-11-22 08:02:21.299 186548 DEBUG nova.privsep.utils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 08:02:21 compute-0 nova_compute[186544]: 2025-11-22 08:02:21.299 186548 DEBUG oslo_concurrency.processutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk /var/lib/nova/instances/snapshots/tmpuzf11spw/77b7001580a547bc896b06f638d0eff6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:21 compute-0 podman[230295]: 2025-11-22 08:02:21.442844443 +0000 UTC m=+0.865310207 container cleanup 6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:02:21 compute-0 systemd[1]: libpod-conmon-6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de.scope: Deactivated successfully.
Nov 22 08:02:21 compute-0 podman[230347]: 2025-11-22 08:02:21.951732994 +0000 UTC m=+0.484914299 container remove 6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:02:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:21.957 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b7cece4a-d31c-4021-a835-9e09eea0696a]: (4, ('Sat Nov 22 08:02:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 (6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de)\n6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de\nSat Nov 22 08:02:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 (6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de)\n6ab4196e05e9fd58c0e7d8c58a304361405a4fcd4b6eff0f9bbb562c502c19de\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:21.959 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ac985a5e-f82b-4a6f-8cdd-c07984de4414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:21.960 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7727db5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:21 compute-0 nova_compute[186544]: 2025-11-22 08:02:21.962 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:21 compute-0 kernel: tapf7727db5-40: left promiscuous mode
Nov 22 08:02:21 compute-0 nova_compute[186544]: 2025-11-22 08:02:21.979 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:21.983 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3272c69e-4c89-459b-bd92-94a846598e24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:22.001 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[47cc29d0-0638-4376-ab98-c6a51d6f4f46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:22.002 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[94de8710-a5c1-425d-b404-baa9e287b725]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:22.016 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbafaac-732a-4cfe-b710-46b79da74c58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527464, 'reachable_time': 25378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230370, 'error': None, 'target': 'ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:22.019 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f7727db5-43a6-48f6-abbf-aa184d8ad087 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:02:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:22.020 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[9590217c-01be-47a8-a335-316d83fbf6c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:22 compute-0 systemd[1]: run-netns-ovnmeta\x2df7727db5\x2d43a6\x2d48f6\x2dabbf\x2daa184d8ad087.mount: Deactivated successfully.
Nov 22 08:02:22 compute-0 nova_compute[186544]: 2025-11-22 08:02:22.614 186548 DEBUG oslo_concurrency.processutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0/disk /var/lib/nova/instances/snapshots/tmpuzf11spw/77b7001580a547bc896b06f638d0eff6" returned: 0 in 1.315s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:22 compute-0 nova_compute[186544]: 2025-11-22 08:02:22.615 186548 INFO nova.virt.libvirt.driver [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Snapshot extracted, beginning image upload
Nov 22 08:02:23 compute-0 nova_compute[186544]: 2025-11-22 08:02:23.059 186548 DEBUG nova.compute.manager [req-9f8d710c-c76b-4348-baed-0edbfb5fefe2 req-ac40eb6d-d4d7-4193-bb5c-b6bfc1a2848d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Received event network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:23 compute-0 nova_compute[186544]: 2025-11-22 08:02:23.060 186548 DEBUG oslo_concurrency.lockutils [req-9f8d710c-c76b-4348-baed-0edbfb5fefe2 req-ac40eb6d-d4d7-4193-bb5c-b6bfc1a2848d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:23 compute-0 nova_compute[186544]: 2025-11-22 08:02:23.060 186548 DEBUG oslo_concurrency.lockutils [req-9f8d710c-c76b-4348-baed-0edbfb5fefe2 req-ac40eb6d-d4d7-4193-bb5c-b6bfc1a2848d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:23 compute-0 nova_compute[186544]: 2025-11-22 08:02:23.060 186548 DEBUG oslo_concurrency.lockutils [req-9f8d710c-c76b-4348-baed-0edbfb5fefe2 req-ac40eb6d-d4d7-4193-bb5c-b6bfc1a2848d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:23 compute-0 nova_compute[186544]: 2025-11-22 08:02:23.061 186548 DEBUG nova.compute.manager [req-9f8d710c-c76b-4348-baed-0edbfb5fefe2 req-ac40eb6d-d4d7-4193-bb5c-b6bfc1a2848d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] No waiting events found dispatching network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:02:23 compute-0 nova_compute[186544]: 2025-11-22 08:02:23.061 186548 WARNING nova.compute.manager [req-9f8d710c-c76b-4348-baed-0edbfb5fefe2 req-ac40eb6d-d4d7-4193-bb5c-b6bfc1a2848d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Received unexpected event network-vif-plugged-95fc5a99-476d-4f95-b80a-2a134fa639ed for instance with vm_state paused and task_state shelving_image_uploading.
Nov 22 08:02:23 compute-0 nova_compute[186544]: 2025-11-22 08:02:23.267 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:23 compute-0 podman[230371]: 2025-11-22 08:02:23.411041557 +0000 UTC m=+0.059893974 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:02:23 compute-0 podman[230372]: 2025-11-22 08:02:23.450707189 +0000 UTC m=+0.097096695 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.673 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.819 186548 INFO nova.virt.libvirt.driver [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Snapshot image upload complete
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.821 186548 DEBUG nova.compute.manager [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.892 186548 INFO nova.compute.manager [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Shelve offloading
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.905 186548 INFO nova.virt.libvirt.driver [-] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Instance destroyed successfully.
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.906 186548 DEBUG nova.compute.manager [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.908 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.909 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquired lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:02:25 compute-0 nova_compute[186544]: 2025-11-22 08:02:25.909 186548 DEBUG nova.network.neutron [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:02:27 compute-0 podman[230416]: 2025-11-22 08:02:27.419424687 +0000 UTC m=+0.073217814 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:02:27 compute-0 podman[230439]: 2025-11-22 08:02:27.512524172 +0000 UTC m=+0.059825563 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:02:28 compute-0 nova_compute[186544]: 2025-11-22 08:02:28.269 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:29 compute-0 nova_compute[186544]: 2025-11-22 08:02:29.741 186548 DEBUG nova.network.neutron [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Updating instance_info_cache with network_info: [{"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:02:29 compute-0 nova_compute[186544]: 2025-11-22 08:02:29.755 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Releasing lock "refresh_cache-1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:02:30 compute-0 nova_compute[186544]: 2025-11-22 08:02:30.676 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.908 186548 INFO nova.virt.libvirt.driver [-] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Instance destroyed successfully.
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.908 186548 DEBUG nova.objects.instance [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lazy-loading 'resources' on Instance uuid 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.925 186548 DEBUG nova.virt.libvirt.vif [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:02:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-279343321',display_name='tempest-ServerActionsTestOtherB-server-279343321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-279343321',id=99,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:02:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='62d9a4a13f5d41529bc273c278fae96b',ramdisk_id='',reservation_id='r-lp61ywby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-270195081',owner_user_name='tempest-ServerActionsTestOtherB-270195081-project-member',shelved_at='2025-11-22T08:02:25.821003',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='ec1b0364-d0ab-4328-844b-9eecc7118a15'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:02:22Z,user_data=None,user_id='d0c5153b41c5499bac372d2df10b9b03',uuid=1a3633d4-fe6f-4956-974f-e00f2bf9d8d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.926 186548 DEBUG nova.network.os_vif_util [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converting VIF {"id": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "address": "fa:16:3e:19:25:07", "network": {"id": "f7727db5-43a6-48f6-abbf-aa184d8ad087", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-678450698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62d9a4a13f5d41529bc273c278fae96b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95fc5a99-47", "ovs_interfaceid": "95fc5a99-476d-4f95-b80a-2a134fa639ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.926 186548 DEBUG nova.network.os_vif_util [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:25:07,bridge_name='br-int',has_traffic_filtering=True,id=95fc5a99-476d-4f95-b80a-2a134fa639ed,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95fc5a99-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.927 186548 DEBUG os_vif [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:25:07,bridge_name='br-int',has_traffic_filtering=True,id=95fc5a99-476d-4f95-b80a-2a134fa639ed,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95fc5a99-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.929 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95fc5a99-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.930 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.934 186548 INFO os_vif [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:25:07,bridge_name='br-int',has_traffic_filtering=True,id=95fc5a99-476d-4f95-b80a-2a134fa639ed,network=Network(f7727db5-43a6-48f6-abbf-aa184d8ad087),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95fc5a99-47')
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.934 186548 INFO nova.virt.libvirt.driver [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Deleting instance files /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0_del
Nov 22 08:02:31 compute-0 nova_compute[186544]: 2025-11-22 08:02:31.935 186548 INFO nova.virt.libvirt.driver [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Deletion of /var/lib/nova/instances/1a3633d4-fe6f-4956-974f-e00f2bf9d8d0_del complete
Nov 22 08:02:32 compute-0 nova_compute[186544]: 2025-11-22 08:02:32.179 186548 INFO nova.scheduler.client.report [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Deleted allocations for instance 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0
Nov 22 08:02:32 compute-0 nova_compute[186544]: 2025-11-22 08:02:32.319 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:32 compute-0 nova_compute[186544]: 2025-11-22 08:02:32.319 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:32 compute-0 nova_compute[186544]: 2025-11-22 08:02:32.364 186548 DEBUG nova.compute.provider_tree [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:02:32 compute-0 nova_compute[186544]: 2025-11-22 08:02:32.374 186548 DEBUG nova.scheduler.client.report [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:02:32 compute-0 nova_compute[186544]: 2025-11-22 08:02:32.401 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:32 compute-0 nova_compute[186544]: 2025-11-22 08:02:32.542 186548 DEBUG oslo_concurrency.lockutils [None req-ea85be1c-1f68-4bdf-a1d1-69178f45deea d0c5153b41c5499bac372d2df10b9b03 62d9a4a13f5d41529bc273c278fae96b - - default default] Lock "1a3633d4-fe6f-4956-974f-e00f2bf9d8d0" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 12.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:34 compute-0 podman[230459]: 2025-11-22 08:02:34.417248837 +0000 UTC m=+0.064249362 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:02:35 compute-0 nova_compute[186544]: 2025-11-22 08:02:35.677 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:35 compute-0 nova_compute[186544]: 2025-11-22 08:02:35.690 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798540.6890514, 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:35 compute-0 nova_compute[186544]: 2025-11-22 08:02:35.691 186548 INFO nova.compute.manager [-] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] VM Stopped (Lifecycle Event)
Nov 22 08:02:35 compute-0 nova_compute[186544]: 2025-11-22 08:02:35.707 186548 DEBUG nova.compute.manager [None req-68da9e8f-fd67-4b7c-bcba-3b51b94ba13a - - - - - -] [instance: 1a3633d4-fe6f-4956-974f-e00f2bf9d8d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.597 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '235ecf63-07fa-4f60-97e9-466450a50add', 'name': 'tempest-ServersNegativeTestJSON-server-899820238', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000061', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd967f0cef958482c9711764882a146f3', 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'hostId': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.597 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.601 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 235ecf63-07fa-4f60-97e9-466450a50add / tapfe9c07c3-e0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.601 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22061261-27c1-4e3d-963d-cbfa31a74d1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.598102', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bb895b0-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': 'babc9a932710dca9b56835ec1f92eea5c97c26278f1b868ade10e078840480e3'}]}, 'timestamp': '2025-11-22 08:02:36.602345', '_unique_id': 'd9857f4cdc174091a44a4c1fd9746bc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.603 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.604 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.604 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.bytes volume: 1520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce0e970a-59a7-4ad6-9846-cb4618ddd989', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1520, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.604852', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bb906e4-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': '725e74b041a918573fdd0e6da3c9b85fce9e808f9777842896b9c83b5b5a28fa'}]}, 'timestamp': '2025-11-22 08:02:36.605133', '_unique_id': '188b5534513248efb37bc2564ae8ce99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.606 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.606 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4634f4db-7f21-4d07-80d1-f3c915f7e957', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.606414', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bb94320-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': 'bb450155f17a91245d1bd3a13d785fddccf70d144a1ea6fa0c7b84b810042110'}]}, 'timestamp': '2025-11-22 08:02:36.606710', '_unique_id': '2e0cd0cf5e8f4ccc821a79e6a2825be3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.607 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.640 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.bytes volume: 30358016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.640 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3ba3630-8f10-4720-93ec-88f819151a8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30358016, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.607937', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bbe78e0-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': 'b5f7b0987806924af834e2bde5463f9a1a8d828569d14f5f49c67fa8b103dd3b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.607937', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bbe8786-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '39dd91846c23a4181567fa0712fc7bbd3655034be0b9558c63f2bfbc80667382'}]}, 'timestamp': '2025-11-22 08:02:36.641211', '_unique_id': '03d025ea61d74968849293843aa8fe4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.643 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.643 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dedbf63-976f-497d-b28c-ff8b365c2f32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.643379', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bbeeb36-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': '671da6185332f676a20ade94f0755a46c159ab13cfe42a97cf30ffbd5187cf66'}]}, 'timestamp': '2025-11-22 08:02:36.643779', '_unique_id': '15963833905d4de79b08e6de152b969d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.645 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2448c67-679b-49f8-81b9-4402cb09aa1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.645452', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bbf3910-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': '55f7203c591c962dc7578475272beae790e4c6bd83867bb2fe996ae7464192f1'}]}, 'timestamp': '2025-11-22 08:02:36.645759', '_unique_id': '54d3276bc1a44dff98a3cf83101ce466'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.646 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.647 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.658 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.659 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae717628-22e9-4ae3-bbf9-20ec392cdfd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.647148', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc14534-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.338958214, 'message_signature': '36a72861394ff2077452b11e1c44a834e183cb0abaa46cee4c8f62278ee1b844'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.647148', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc1533a-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.338958214, 'message_signature': '43ad423872804f4bfe9cca4241355802b4d132eec72684c96139ded368069930'}]}, 'timestamp': '2025-11-22 08:02:36.659569', '_unique_id': '93153df9824f45c1868e11da0527a3b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.660 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.661 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.661 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.bytes volume: 72867840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.662 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '068d2296-9272-4edd-ad71-c585b5de86d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72867840, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.661730', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc1b58c-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '75973fe66c0238f9e40d43718357e7c7d3f819858d550f888923ac7d36fac1c0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.661730', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc1c0fe-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '6cfa8a70027338089ec4dba973f8e31de6c6490a0e92765170e08b8a74b4819a'}]}, 'timestamp': '2025-11-22 08:02:36.662353', '_unique_id': 'e586762b29bd46569185440e966d0125'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.663 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f704a6a-4730-49a4-975a-afeb645b312e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.663977', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bc20d2a-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': 'e4468c328c50963dd2f4a973490721d5097adb338f124247c2a03435ff82f1fe'}]}, 'timestamp': '2025-11-22 08:02:36.664332', '_unique_id': 'bcae5fb7297e4ff6a063a0cd162e05d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.664 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.665 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.665 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-899820238>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-899820238>]
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.666 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.666 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.666 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b10a7fb-7bdb-4c5f-ade2-d74fbd329b85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.666196', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc2646e-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.338958214, 'message_signature': '72998ae8fbfacb1a786f5f70259b4d5095593c720f32ecb3ceb4a641b70d172e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 
'235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.666196', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc26f54-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.338958214, 'message_signature': 'a21c0ae096b9f6d0ed8d5c4871dce7c9d1a73135de79b33f3e299d4a3beccdf6'}]}, 'timestamp': '2025-11-22 08:02:36.666790', '_unique_id': '706b87bf72a6456db72d6555bbce15b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.668 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.668 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.requests volume: 1083 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.668 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c8492ef-4074-4dc8-b3e1-54e361de7e75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1083, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.668308', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc2b5f4-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '45bdabbf894790de2deeccdf241d59c73bc9c5ddff918999890d844c8e634b29'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': 
None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.668308', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc2c094-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': 'e0235a38dacbb13b7c25e014cbd38fe0453a4958536446514348660b03e622ce'}]}, 'timestamp': '2025-11-22 08:02:36.668871', '_unique_id': 'b75877c0e8224448b7d2c337e775a5c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.670 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.670 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd51f14f-d4ae-4cd8-99b2-aa5c311b067f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.670485', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bc30b76-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': '4ebc16112f1c3ce53dd0e8ca6fd2a7bc08ffa87cad6ca1bcee8c86f9b51e7f96'}]}, 'timestamp': '2025-11-22 08:02:36.670812', '_unique_id': '91ac75ff02344831997a2989c2adcee2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.672 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.latency volume: 2411642180 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.672 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.latency volume: 321786638 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '423348af-48df-4041-8080-dcc105023f0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2411642180, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.672357', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc35400-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': 'd287b1a859896621d62050669de5ffe1fc571b2395dc5c08ce799d54693b66bc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 321786638, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.672357', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc35f04-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '0c5a8536fd73a9b667a3153539d8c88a2eb731b74be95d58db018e32de02bdbc'}]}, 'timestamp': '2025-11-22 08:02:36.672925', '_unique_id': '20cfdab6f6434f51a04fabd8a2b339d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.674 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '433acf14-4108-44fc-8f8e-84e96bb1fc99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.674430', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bc3a518-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': '752bb4b7d5305ad5e10d710e5e895dad0149fbf651ea1d5980c1f3fbe16f278e'}]}, 'timestamp': '2025-11-22 08:02:36.674736', '_unique_id': '757632bf36844fc9b814f0e8d231835c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.676 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.676 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.676 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-899820238>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-899820238>]
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.676 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.676 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.676 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-899820238>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-899820238>]
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.676 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a8c8932-79dc-445c-a7c6-670e21a4eb93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.676915', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bc40652-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': '7901d3a02538209f41ee0523e212f6ef4e5224f73ce9233485e4bd24aec6cca8'}]}, 'timestamp': '2025-11-22 08:02:36.677234', '_unique_id': '5ca8292b4d944c249c2d1af8c3717776'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.678 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.679 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-899820238>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-899820238>]
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.679 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.679 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f601bd97-3f00-4e4f-9f42-8ffc9496341e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:02:36.679290', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': '9bc46318-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.289881569, 'message_signature': '01a2aa8081cafa17afb3a8b570d1072c94775d4752706fa0a6ef01c5f65d00f2'}]}, 'timestamp': '2025-11-22 08:02:36.679621', '_unique_id': 'a22e2fa3a24f401993f2b769607ba979'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.681 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.681 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a14be9ef-5abf-4f1c-81b6-6882ae8b7a32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.680985', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc4a49a-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '0b6294c480c950d105b5fa8be3cde3ddf84df25597769cdb8ff3f6ebd9bee90a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.680985', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc4b016-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '32ec4f3fb542b83737f050baa9bfd6f363190b93fa97a28f15e15329a15f69ec'}]}, 'timestamp': '2025-11-22 08:02:36.681558', '_unique_id': 'dba0ca9e38854676b466130f3b237092'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.682 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.700 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/cpu volume: 14670000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00ccc79a-1893-4b0f-8cc0-a9fa7f54be68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14670000000, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'timestamp': '2025-11-22T08:02:36.682941', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9bc7dd9a-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.392505931, 'message_signature': '78e780c3c1e49be1a5d35a26d5e83fecceab838c1041a5b7144bcdc6d4ee8927'}]}, 'timestamp': '2025-11-22 08:02:36.702667', '_unique_id': '9055fdf5bd54469c86f06bc4947a4361'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.705 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.705 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.705 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10f9299f-5cdb-41f4-8c02-4f3540980c99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.705172', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc85720-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.338958214, 'message_signature': '2e5cb1adb9cf08e2c17612064e204ea161860e20d7ab53749fdf327100e022b1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.705172', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc8618e-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.338958214, 'message_signature': '49fb8be104ee1a8afe2a7e4b98783eafcd6b8031eadbd02603bb30b6ecadb2c9'}]}, 'timestamp': '2025-11-22 08:02:36.705755', '_unique_id': 'b302fbcb01264cfd822c08d3c250b7a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.707 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.707 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.latency volume: 49486424498 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.707 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80a79786-6ade-471a-8969-89c2739128ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49486424498, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:02:36.707214', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9bc8a694-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '3d5d1d56766540ee63c8750105264a1d6ba8c5801cd871ab533e756473d41fc1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:02:36.707214', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9bc8b2ce-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.299741743, 'message_signature': '8cdcd4e3df22d4971fccd3c561c2c3371e670d7e51cfca4ee8f07c3c938f7091'}]}, 'timestamp': '2025-11-22 08:02:36.707835', '_unique_id': '45093f8307934a5d8306a865a0e4642a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.709 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.709 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/memory.usage volume: 42.90234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5c1243f-267c-4af7-9163-c39ad90bf0f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.90234375, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'timestamp': '2025-11-22T08:02:36.709314', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '9bc8f69e-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5297.392505931, 'message_signature': 'b113b8c3c64e665cffde29a2ad21996f31309e45bc3c94cbacc29286ad0293ca'}]}, 'timestamp': '2025-11-22 08:02:36.709551', '_unique_id': '2d41241d3c84434586e03ba4ed03b4c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:02:36.710 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:02:36 compute-0 nova_compute[186544]: 2025-11-22 08:02:36.930 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:37.330 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:37.331 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:37.332 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:38 compute-0 nova_compute[186544]: 2025-11-22 08:02:38.887 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:38 compute-0 nova_compute[186544]: 2025-11-22 08:02:38.887 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:38 compute-0 nova_compute[186544]: 2025-11-22 08:02:38.964 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.051 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.051 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.057 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.057 186548 INFO nova.compute.claims [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.259 186548 DEBUG nova.compute.provider_tree [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.311 186548 DEBUG nova.scheduler.client.report [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.415 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.415 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.468 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.468 186548 DEBUG nova.network.neutron [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.512 186548 INFO nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.538 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.653 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.654 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.654 186548 INFO nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Creating image(s)
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.655 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.655 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.656 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.675 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.723 186548 DEBUG nova.policy [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.746 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.747 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.748 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.765 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.830 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:39 compute-0 nova_compute[186544]: 2025-11-22 08:02:39.831 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.138 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk 1073741824" returned: 0 in 0.307s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.139 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.140 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.203 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.204 186548 DEBUG nova.virt.disk.api [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.204 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.267 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.268 186548 DEBUG nova.virt.disk.api [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.269 186548 DEBUG nova.objects.instance [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.284 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.284 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Ensure instance console log exists: /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.285 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.285 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.286 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:40 compute-0 podman[230496]: 2025-11-22 08:02:40.428979461 +0000 UTC m=+0.064251722 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7)
Nov 22 08:02:40 compute-0 podman[230495]: 2025-11-22 08:02:40.442329332 +0000 UTC m=+0.081631212 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:02:40 compute-0 nova_compute[186544]: 2025-11-22 08:02:40.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:41 compute-0 nova_compute[186544]: 2025-11-22 08:02:41.933 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:42 compute-0 nova_compute[186544]: 2025-11-22 08:02:42.046 186548 DEBUG nova.network.neutron [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Successfully created port: cebf92b8-e08f-4803-92db-8cb2866aa038 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:02:45 compute-0 nova_compute[186544]: 2025-11-22 08:02:45.438 186548 DEBUG nova.network.neutron [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Successfully updated port: cebf92b8-e08f-4803-92db-8cb2866aa038 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:02:45 compute-0 nova_compute[186544]: 2025-11-22 08:02:45.458 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-886c510f-5bb2-455c-a7db-d83b3fc86ef2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:02:45 compute-0 nova_compute[186544]: 2025-11-22 08:02:45.458 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-886c510f-5bb2-455c-a7db-d83b3fc86ef2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:02:45 compute-0 nova_compute[186544]: 2025-11-22 08:02:45.458 186548 DEBUG nova.network.neutron [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:02:45 compute-0 nova_compute[186544]: 2025-11-22 08:02:45.681 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:46 compute-0 nova_compute[186544]: 2025-11-22 08:02:46.159 186548 DEBUG nova.compute.manager [req-69a1f0bc-259a-4f5b-97cd-c903be92fe87 req-398429d6-a256-4537-817e-419baa0453ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-changed-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:46 compute-0 nova_compute[186544]: 2025-11-22 08:02:46.159 186548 DEBUG nova.compute.manager [req-69a1f0bc-259a-4f5b-97cd-c903be92fe87 req-398429d6-a256-4537-817e-419baa0453ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Refreshing instance network info cache due to event network-changed-cebf92b8-e08f-4803-92db-8cb2866aa038. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:02:46 compute-0 nova_compute[186544]: 2025-11-22 08:02:46.160 186548 DEBUG oslo_concurrency.lockutils [req-69a1f0bc-259a-4f5b-97cd-c903be92fe87 req-398429d6-a256-4537-817e-419baa0453ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-886c510f-5bb2-455c-a7db-d83b3fc86ef2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:02:46 compute-0 nova_compute[186544]: 2025-11-22 08:02:46.177 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:46 compute-0 nova_compute[186544]: 2025-11-22 08:02:46.701 186548 DEBUG nova.network.neutron [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:02:46 compute-0 nova_compute[186544]: 2025-11-22 08:02:46.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.183 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.183 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.184 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.263 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.322 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.323 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.385 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.587 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.588 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5561MB free_disk=73.25111389160156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.588 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.589 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.665 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 235ecf63-07fa-4f60-97e9-466450a50add actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.665 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 886c510f-5bb2-455c-a7db-d83b3fc86ef2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.666 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.666 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.738 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.750 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.770 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.770 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.928 186548 INFO nova.compute.manager [None req-8e94e421-606b-4814-8e9f-c98eaf019c72 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Pausing
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.930 186548 DEBUG nova.objects.instance [None req-8e94e421-606b-4814-8e9f-c98eaf019c72 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'flavor' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.971 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798568.97101, 235ecf63-07fa-4f60-97e9-466450a50add => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.971 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Paused (Lifecycle Event)
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.991 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:48 compute-0 nova_compute[186544]: 2025-11-22 08:02:48.995 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.014 186548 DEBUG nova.compute.manager [None req-8e94e421-606b-4814-8e9f-c98eaf019c72 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.020 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] During sync_power_state the instance has a pending task (pausing). Skip.
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.533 186548 DEBUG nova.network.neutron [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Updating instance_info_cache with network_info: [{"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.548 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-886c510f-5bb2-455c-a7db-d83b3fc86ef2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.548 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance network_info: |[{"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.549 186548 DEBUG oslo_concurrency.lockutils [req-69a1f0bc-259a-4f5b-97cd-c903be92fe87 req-398429d6-a256-4537-817e-419baa0453ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-886c510f-5bb2-455c-a7db-d83b3fc86ef2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.550 186548 DEBUG nova.network.neutron [req-69a1f0bc-259a-4f5b-97cd-c903be92fe87 req-398429d6-a256-4537-817e-419baa0453ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Refreshing network info cache for port cebf92b8-e08f-4803-92db-8cb2866aa038 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.554 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Start _get_guest_xml network_info=[{"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.558 186548 WARNING nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.569 186548 DEBUG nova.virt.libvirt.host [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.571 186548 DEBUG nova.virt.libvirt.host [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.575 186548 DEBUG nova.virt.libvirt.host [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.576 186548 DEBUG nova.virt.libvirt.host [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.578 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.578 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.579 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.579 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.579 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.580 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.580 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.580 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.580 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.581 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.582 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.583 186548 DEBUG nova.virt.hardware [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.588 186548 DEBUG nova.virt.libvirt.vif [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:02:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1370271558',display_name='tempest-tempest.common.compute-instance-1370271558',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1370271558',id=100,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-4vx40fvs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:39Z,user_data=None,user_id='b6cc24df1e344e369f2aff864f278268',uuid=886c510f-5bb2-455c-a7db-d83b3fc86ef2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.589 186548 DEBUG nova.network.os_vif_util [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.590 186548 DEBUG nova.network.os_vif_util [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.592 186548 DEBUG nova.objects.instance [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.609 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <uuid>886c510f-5bb2-455c-a7db-d83b3fc86ef2</uuid>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <name>instance-00000064</name>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <nova:name>tempest-tempest.common.compute-instance-1370271558</nova:name>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:02:49</nova:creationTime>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:02:49 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:02:49 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:02:49 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:02:49 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:02:49 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:02:49 compute-0 nova_compute[186544]:         <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 08:02:49 compute-0 nova_compute[186544]:         <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:02:49 compute-0 nova_compute[186544]:         <nova:port uuid="cebf92b8-e08f-4803-92db-8cb2866aa038">
Nov 22 08:02:49 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <system>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <entry name="serial">886c510f-5bb2-455c-a7db-d83b3fc86ef2</entry>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <entry name="uuid">886c510f-5bb2-455c-a7db-d83b3fc86ef2</entry>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </system>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <os>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   </os>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <features>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   </features>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.config"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:ac:64:2a"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <target dev="tapcebf92b8-e0"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/console.log" append="off"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <video>
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </video>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:02:49 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:02:49 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:02:49 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:02:49 compute-0 nova_compute[186544]: </domain>
Nov 22 08:02:49 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.611 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Preparing to wait for external event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.612 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.612 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.612 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.613 186548 DEBUG nova.virt.libvirt.vif [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:02:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1370271558',display_name='tempest-tempest.common.compute-instance-1370271558',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1370271558',id=100,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-4vx40fvs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-Se
rverActionsTestJSON-1104477664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:39Z,user_data=None,user_id='b6cc24df1e344e369f2aff864f278268',uuid=886c510f-5bb2-455c-a7db-d83b3fc86ef2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.614 186548 DEBUG nova.network.os_vif_util [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.615 186548 DEBUG nova.network.os_vif_util [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.616 186548 DEBUG os_vif [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.616 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.617 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.617 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.621 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.622 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcebf92b8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.623 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcebf92b8-e0, col_values=(('external_ids', {'iface-id': 'cebf92b8-e08f-4803-92db-8cb2866aa038', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:64:2a', 'vm-uuid': '886c510f-5bb2-455c-a7db-d83b3fc86ef2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:49 compute-0 NetworkManager[55036]: <info>  [1763798569.6270] manager: (tapcebf92b8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.629 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.634 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.636 186548 INFO os_vif [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0')
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.794 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.794 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.795 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No VIF found with MAC fa:16:3e:ac:64:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:02:49 compute-0 nova_compute[186544]: 2025-11-22 08:02:49.796 186548 INFO nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Using config drive
Nov 22 08:02:50 compute-0 nova_compute[186544]: 2025-11-22 08:02:50.682 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.116 186548 INFO nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Creating config drive at /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.config
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.122 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5mrl_ylc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.249 186548 DEBUG oslo_concurrency.processutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5mrl_ylc" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:02:51 compute-0 kernel: tapcebf92b8-e0: entered promiscuous mode
Nov 22 08:02:51 compute-0 NetworkManager[55036]: <info>  [1763798571.3321] manager: (tapcebf92b8-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/199)
Nov 22 08:02:51 compute-0 ovn_controller[94843]: 2025-11-22T08:02:51Z|00417|binding|INFO|Claiming lport cebf92b8-e08f-4803-92db-8cb2866aa038 for this chassis.
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 ovn_controller[94843]: 2025-11-22T08:02:51Z|00418|binding|INFO|cebf92b8-e08f-4803-92db-8cb2866aa038: Claiming fa:16:3e:ac:64:2a 10.100.0.10
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.345 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:64:2a 10.100.0.10'], port_security=['fa:16:3e:ac:64:2a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '886c510f-5bb2-455c-a7db-d83b3fc86ef2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6f667341-2539-478f-aedb-18e68df5c8e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cebf92b8-e08f-4803-92db-8cb2866aa038) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:02:51 compute-0 ovn_controller[94843]: 2025-11-22T08:02:51Z|00419|binding|INFO|Setting lport cebf92b8-e08f-4803-92db-8cb2866aa038 ovn-installed in OVS
Nov 22 08:02:51 compute-0 ovn_controller[94843]: 2025-11-22T08:02:51Z|00420|binding|INFO|Setting lport cebf92b8-e08f-4803-92db-8cb2866aa038 up in Southbound
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.346 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cebf92b8-e08f-4803-92db-8cb2866aa038 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.346 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.347 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.349 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 systemd-udevd[230569]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.358 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f690eb3f-37be-4375-bb2f-a2a51f1a99ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.362 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.363 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.364 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5b286f7e-ecef-42e9-80e1-aa987443e18b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.365 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1045e53a-d42b-4c63-9cf0-e0b6e2309a55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 systemd-machined[152872]: New machine qemu-56-instance-00000064.
Nov 22 08:02:51 compute-0 NetworkManager[55036]: <info>  [1763798571.3718] device (tapcebf92b8-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:02:51 compute-0 NetworkManager[55036]: <info>  [1763798571.3729] device (tapcebf92b8-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.376 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[39a1412c-50a0-46d7-b204-6636a13e2631]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 systemd[1]: Started Virtual Machine qemu-56-instance-00000064.
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.400 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[68dfcb49-4e8a-418a-ba4e-23f3ddaa4b28]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.427 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b93a9153-e586-4260-b1eb-5fb84fa3539e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.432 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7d2f96d1-850c-4bb7-b23c-d75438f89d59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 systemd-udevd[230573]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:02:51 compute-0 NetworkManager[55036]: <info>  [1763798571.4341] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/200)
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.462 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5439b05e-e6e9-4f55-bfa8-af8005c989b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.465 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[40bc0d22-29e3-48be-baa1-c90c972b8bf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 NetworkManager[55036]: <info>  [1763798571.4867] device (tap165f7f23-d0): carrier: link connected
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.492 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4b5dea-f0e2-40f2-8896-5766ebe2a115]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.509 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8e193614-6947-42e2-b710-1d623179932e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531212, 'reachable_time': 37248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230602, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.527 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1985344d-9fd7-4ce7-9c8f-faf2cb23fd95]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531212, 'tstamp': 531212}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230603, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.544 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5244f7a2-fc01-4694-b549-80b8abfa3377]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531212, 'reachable_time': 37248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230604, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.579 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bf098069-9be0-4a67-8f87-c142d82008aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.643 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[839b90b9-7b41-4315-9386-31ad7e85dd88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.648 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.648 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.648 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:51 compute-0 NetworkManager[55036]: <info>  [1763798571.6513] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.650 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.654 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.655 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.656 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 ovn_controller[94843]: 2025-11-22T08:02:51Z|00421|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.657 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.658 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.659 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[292c7760-f8c9-4104-87da-8b896e9e7464]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.660 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:02:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:02:51.661 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.661 186548 DEBUG nova.compute.manager [req-5c828527-7143-4c8c-b4ce-225d58d0babd req-0305bc40-425e-4db9-b0a6-ac44ff138f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.662 186548 DEBUG oslo_concurrency.lockutils [req-5c828527-7143-4c8c-b4ce-225d58d0babd req-0305bc40-425e-4db9-b0a6-ac44ff138f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.662 186548 DEBUG oslo_concurrency.lockutils [req-5c828527-7143-4c8c-b4ce-225d58d0babd req-0305bc40-425e-4db9-b0a6-ac44ff138f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.662 186548 DEBUG oslo_concurrency.lockutils [req-5c828527-7143-4c8c-b4ce-225d58d0babd req-0305bc40-425e-4db9-b0a6-ac44ff138f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.663 186548 DEBUG nova.compute.manager [req-5c828527-7143-4c8c-b4ce-225d58d0babd req-0305bc40-425e-4db9-b0a6-ac44ff138f3e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Processing event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.671 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.704 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798571.7039633, 886c510f-5bb2-455c-a7db-d83b3fc86ef2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.705 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] VM Started (Lifecycle Event)
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.707 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.712 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.716 186548 INFO nova.virt.libvirt.driver [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance spawned successfully.
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.717 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.732 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.743 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.746 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.747 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.747 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.747 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.748 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.748 186548 DEBUG nova.virt.libvirt.driver [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.769 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.770 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.770 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.773 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.774 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798571.7040796, 886c510f-5bb2-455c-a7db-d83b3fc86ef2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.774 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] VM Paused (Lifecycle Event)
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.776 186548 INFO nova.compute.manager [None req-950d768c-5fcc-459f-9951-473a3a17db2d 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Unpausing
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.777 186548 DEBUG nova.objects.instance [None req-950d768c-5fcc-459f-9951-473a3a17db2d 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'flavor' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.801 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.805 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.806 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.806 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.807 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.807 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.811 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798571.7113876, 886c510f-5bb2-455c-a7db-d83b3fc86ef2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.811 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] VM Resumed (Lifecycle Event)
Nov 22 08:02:51 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.826 186548 DEBUG nova.virt.libvirt.guest [None req-950d768c-5fcc-459f-9951-473a3a17db2d 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.827 186548 DEBUG nova.compute.manager [None req-950d768c-5fcc-459f-9951-473a3a17db2d 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.832 186548 INFO nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Took 12.18 seconds to spawn the instance on the hypervisor.
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.833 186548 DEBUG nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.843 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.852 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.902 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.903 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798571.8216174, 235ecf63-07fa-4f60-97e9-466450a50add => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.903 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Resumed (Lifecycle Event)
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.953 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.971 186548 INFO nova.compute.manager [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Took 12.95 seconds to build instance.
Nov 22 08:02:51 compute-0 nova_compute[186544]: 2025-11-22 08:02:51.973 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:02:52 compute-0 nova_compute[186544]: 2025-11-22 08:02:52.005 186548 DEBUG oslo_concurrency.lockutils [None req-ebb0bd70-d76e-4c11-8f18-bde96eca6fb8 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:52 compute-0 nova_compute[186544]: 2025-11-22 08:02:52.118 186548 DEBUG nova.network.neutron [req-69a1f0bc-259a-4f5b-97cd-c903be92fe87 req-398429d6-a256-4537-817e-419baa0453ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Updated VIF entry in instance network info cache for port cebf92b8-e08f-4803-92db-8cb2866aa038. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:02:52 compute-0 nova_compute[186544]: 2025-11-22 08:02:52.119 186548 DEBUG nova.network.neutron [req-69a1f0bc-259a-4f5b-97cd-c903be92fe87 req-398429d6-a256-4537-817e-419baa0453ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Updating instance_info_cache with network_info: [{"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:02:52 compute-0 nova_compute[186544]: 2025-11-22 08:02:52.131 186548 DEBUG oslo_concurrency.lockutils [req-69a1f0bc-259a-4f5b-97cd-c903be92fe87 req-398429d6-a256-4537-817e-419baa0453ff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-886c510f-5bb2-455c-a7db-d83b3fc86ef2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:02:52 compute-0 podman[230643]: 2025-11-22 08:02:52.047916864 +0000 UTC m=+0.023776440 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:02:52 compute-0 podman[230643]: 2025-11-22 08:02:52.805109353 +0000 UTC m=+0.780968909 container create b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 22 08:02:53 compute-0 systemd[1]: Started libpod-conmon-b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a.scope.
Nov 22 08:02:53 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:02:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6af0af91aa10e45f1a3b9ced8fb34bb505d0d127b898126a2ffff1cb9bf5c77d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:02:53 compute-0 podman[230643]: 2025-11-22 08:02:53.252387997 +0000 UTC m=+1.228247583 container init b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 08:02:53 compute-0 podman[230643]: 2025-11-22 08:02:53.260076427 +0000 UTC m=+1.235935983 container start b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:02:53 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230658]: [NOTICE]   (230662) : New worker (230664) forked
Nov 22 08:02:53 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230658]: [NOTICE]   (230662) : Loading success.
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.681 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.696 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.697 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.698 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.698 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.698 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.699 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.927 186548 DEBUG nova.compute.manager [req-e44cc5de-a567-4f76-b420-5442562fa3c7 req-09952680-1610-45cf-a2f4-ef2dc7519c45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.928 186548 DEBUG oslo_concurrency.lockutils [req-e44cc5de-a567-4f76-b420-5442562fa3c7 req-09952680-1610-45cf-a2f4-ef2dc7519c45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.928 186548 DEBUG oslo_concurrency.lockutils [req-e44cc5de-a567-4f76-b420-5442562fa3c7 req-09952680-1610-45cf-a2f4-ef2dc7519c45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.929 186548 DEBUG oslo_concurrency.lockutils [req-e44cc5de-a567-4f76-b420-5442562fa3c7 req-09952680-1610-45cf-a2f4-ef2dc7519c45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.929 186548 DEBUG nova.compute.manager [req-e44cc5de-a567-4f76-b420-5442562fa3c7 req-09952680-1610-45cf-a2f4-ef2dc7519c45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] No waiting events found dispatching network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:02:53 compute-0 nova_compute[186544]: 2025-11-22 08:02:53.929 186548 WARNING nova.compute.manager [req-e44cc5de-a567-4f76-b420-5442562fa3c7 req-09952680-1610-45cf-a2f4-ef2dc7519c45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received unexpected event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 for instance with vm_state active and task_state None.
Nov 22 08:02:54 compute-0 podman[230673]: 2025-11-22 08:02:54.422935361 +0000 UTC m=+0.068978519 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:02:54 compute-0 podman[230674]: 2025-11-22 08:02:54.466004128 +0000 UTC m=+0.107745479 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:02:54 compute-0 nova_compute[186544]: 2025-11-22 08:02:54.629 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.060 186548 INFO nova.compute.manager [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Rebuilding instance
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.263 186548 DEBUG nova.compute.manager [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.410 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_requests' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.420 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.437 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.476 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.487 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.490 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:02:55 compute-0 nova_compute[186544]: 2025-11-22 08:02:55.685 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:02:56 compute-0 nova_compute[186544]: 2025-11-22 08:02:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:58 compute-0 nova_compute[186544]: 2025-11-22 08:02:58.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:02:58 compute-0 podman[230716]: 2025-11-22 08:02:58.251663684 +0000 UTC m=+0.057777573 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:02:58 compute-0 podman[230715]: 2025-11-22 08:02:58.252503504 +0000 UTC m=+0.058981131 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 08:02:59 compute-0 nova_compute[186544]: 2025-11-22 08:02:59.632 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:00 compute-0 nova_compute[186544]: 2025-11-22 08:03:00.687 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:01 compute-0 nova_compute[186544]: 2025-11-22 08:03:01.014 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:01.014 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:01.015 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:03:04 compute-0 nova_compute[186544]: 2025-11-22 08:03:04.638 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:05 compute-0 podman[230757]: 2025-11-22 08:03:05.424283472 +0000 UTC m=+0.060274983 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:03:05 compute-0 nova_compute[186544]: 2025-11-22 08:03:05.536 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 08:03:05 compute-0 nova_compute[186544]: 2025-11-22 08:03:05.689 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:06.018 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:09 compute-0 nova_compute[186544]: 2025-11-22 08:03:09.640 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:10 compute-0 ovn_controller[94843]: 2025-11-22T08:03:10Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:64:2a 10.100.0.10
Nov 22 08:03:10 compute-0 ovn_controller[94843]: 2025-11-22T08:03:10Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:64:2a 10.100.0.10
Nov 22 08:03:10 compute-0 nova_compute[186544]: 2025-11-22 08:03:10.691 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:11 compute-0 podman[230803]: 2025-11-22 08:03:11.414974696 +0000 UTC m=+0.058776147 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:03:11 compute-0 podman[230804]: 2025-11-22 08:03:11.418090843 +0000 UTC m=+0.055757732 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm)
Nov 22 08:03:14 compute-0 nova_compute[186544]: 2025-11-22 08:03:14.642 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:15 compute-0 nova_compute[186544]: 2025-11-22 08:03:15.694 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:16 compute-0 nova_compute[186544]: 2025-11-22 08:03:16.586 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 08:03:16 compute-0 nova_compute[186544]: 2025-11-22 08:03:16.775 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:16 compute-0 nova_compute[186544]: 2025-11-22 08:03:16.777 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:16 compute-0 nova_compute[186544]: 2025-11-22 08:03:16.778 186548 INFO nova.compute.manager [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Shelving
Nov 22 08:03:16 compute-0 nova_compute[186544]: 2025-11-22 08:03:16.818 186548 DEBUG nova.virt.libvirt.driver [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:03:19 compute-0 kernel: tapfe9c07c3-e0 (unregistering): left promiscuous mode
Nov 22 08:03:19 compute-0 NetworkManager[55036]: <info>  [1763798599.3192] device (tapfe9c07c3-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:03:19 compute-0 ovn_controller[94843]: 2025-11-22T08:03:19Z|00422|binding|INFO|Releasing lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda from this chassis (sb_readonly=0)
Nov 22 08:03:19 compute-0 ovn_controller[94843]: 2025-11-22T08:03:19Z|00423|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda down in Southbound
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.328 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 ovn_controller[94843]: 2025-11-22T08:03:19Z|00424|binding|INFO|Removing iface tapfe9c07c3-e0 ovn-installed in OVS
Nov 22 08:03:19 compute-0 kernel: tapcebf92b8-e0 (unregistering): left promiscuous mode
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.329 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 NetworkManager[55036]: <info>  [1763798599.3342] device (tapcebf92b8-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:03:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:19.347 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:d6:3e 10.100.0.8'], port_security=['fa:16:3e:1c:d6:3e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157128f-75e8-4afb-ab55-34580af9585f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd967f0cef958482c9711764882a146f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'db55d655-ec9a-40ef-9e54-3247c3ea4f75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c031f4-6b41-4ee7-af4f-a9218d9b390c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=fe9c07c3-e08f-4810-b699-6d6aa3f50cda) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:19.348 103805 INFO neutron.agent.ovn.metadata.agent [-] Port fe9c07c3-e08f-4810-b699-6d6aa3f50cda in datapath c157128f-75e8-4afb-ab55-34580af9585f unbound from our chassis
Nov 22 08:03:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:19.350 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c157128f-75e8-4afb-ab55-34580af9585f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.351 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 ovn_controller[94843]: 2025-11-22T08:03:19Z|00425|binding|INFO|Releasing lport cebf92b8-e08f-4803-92db-8cb2866aa038 from this chassis (sb_readonly=0)
Nov 22 08:03:19 compute-0 ovn_controller[94843]: 2025-11-22T08:03:19Z|00426|binding|INFO|Setting lport cebf92b8-e08f-4803-92db-8cb2866aa038 down in Southbound
Nov 22 08:03:19 compute-0 ovn_controller[94843]: 2025-11-22T08:03:19Z|00427|binding|INFO|Removing iface tapcebf92b8-e0 ovn-installed in OVS
Nov 22 08:03:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:19.351 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[876056c3-d124-484e-8b27-b542d4c36c7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:19.354 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f namespace which is not needed anymore
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.355 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:19.366 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:64:2a 10.100.0.10'], port_security=['fa:16:3e:ac:64:2a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '886c510f-5bb2-455c-a7db-d83b3fc86ef2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6f667341-2539-478f-aedb-18e68df5c8e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cebf92b8-e08f-4803-92db-8cb2866aa038) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.374 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000061.scope: Deactivated successfully.
Nov 22 08:03:19 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000061.scope: Consumed 19.698s CPU time.
Nov 22 08:03:19 compute-0 systemd-machined[152872]: Machine qemu-53-instance-00000061 terminated.
Nov 22 08:03:19 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000064.scope: Deactivated successfully.
Nov 22 08:03:19 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000064.scope: Consumed 16.394s CPU time.
Nov 22 08:03:19 compute-0 systemd-machined[152872]: Machine qemu-56-instance-00000064 terminated.
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.617 186548 DEBUG nova.compute.manager [req-10fc0036-3221-487a-a72d-be7e367e232f req-84a47341-9523-4b4f-b84e-86247f665ad5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-unplugged-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.617 186548 DEBUG oslo_concurrency.lockutils [req-10fc0036-3221-487a-a72d-be7e367e232f req-84a47341-9523-4b4f-b84e-86247f665ad5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.617 186548 DEBUG oslo_concurrency.lockutils [req-10fc0036-3221-487a-a72d-be7e367e232f req-84a47341-9523-4b4f-b84e-86247f665ad5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.618 186548 DEBUG oslo_concurrency.lockutils [req-10fc0036-3221-487a-a72d-be7e367e232f req-84a47341-9523-4b4f-b84e-86247f665ad5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.618 186548 DEBUG nova.compute.manager [req-10fc0036-3221-487a-a72d-be7e367e232f req-84a47341-9523-4b4f-b84e-86247f665ad5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] No waiting events found dispatching network-vif-unplugged-cebf92b8-e08f-4803-92db-8cb2866aa038 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.618 186548 WARNING nova.compute.manager [req-10fc0036-3221-487a-a72d-be7e367e232f req-84a47341-9523-4b4f-b84e-86247f665ad5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received unexpected event network-vif-unplugged-cebf92b8-e08f-4803-92db-8cb2866aa038 for instance with vm_state active and task_state rebuilding.
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.644 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.648 186548 INFO nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance shutdown successfully after 24 seconds.
Nov 22 08:03:19 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[229841]: [NOTICE]   (229845) : haproxy version is 2.8.14-c23fe91
Nov 22 08:03:19 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[229841]: [NOTICE]   (229845) : path to executable is /usr/sbin/haproxy
Nov 22 08:03:19 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[229841]: [WARNING]  (229845) : Exiting Master process...
Nov 22 08:03:19 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[229841]: [ALERT]    (229845) : Current worker (229847) exited with code 143 (Terminated)
Nov 22 08:03:19 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[229841]: [WARNING]  (229845) : All workers exited. Exiting... (0)
Nov 22 08:03:19 compute-0 systemd[1]: libpod-d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f.scope: Deactivated successfully.
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.658 186548 INFO nova.virt.libvirt.driver [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance destroyed successfully.
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.663 186548 INFO nova.virt.libvirt.driver [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance destroyed successfully.
Nov 22 08:03:19 compute-0 podman[230875]: 2025-11-22 08:03:19.66408247 +0000 UTC m=+0.225335441 container died d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.663 186548 DEBUG nova.virt.libvirt.vif [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:02:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1370271558',display_name='tempest-ServerActionsTestJSON-server-1290637600',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1370271558',id=100,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:02:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-4vx40fvs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:02:54Z,user_data=None,user_id='b6cc24df1e344e369f2aff864f278268',uuid=886c510f-5bb2-455c-a7db-d83b3fc86ef2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.664 186548 DEBUG nova.network.os_vif_util [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.664 186548 DEBUG nova.network.os_vif_util [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.665 186548 DEBUG os_vif [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.667 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.667 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcebf92b8-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.668 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.671 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.672 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.674 186548 INFO os_vif [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0')
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.675 186548 INFO nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Deleting instance files /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2_del
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.675 186548 INFO nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Deletion of /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2_del complete
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.834 186548 INFO nova.virt.libvirt.driver [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance shutdown successfully after 3 seconds.
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.839 186548 INFO nova.virt.libvirt.driver [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance destroyed successfully.
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.840 186548 DEBUG nova.objects.instance [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.862 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.863 186548 INFO nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Creating image(s)
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.865 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.865 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.866 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.881 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.945 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.947 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.947 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:19 compute-0 nova_compute[186544]: 2025-11-22 08:03:19.963 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f-userdata-shm.mount: Deactivated successfully.
Nov 22 08:03:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-e83fe16df0084714b4a4f2e3c0097ef0c45e3cf2db1eb0c45443f19f6f0a8e22-merged.mount: Deactivated successfully.
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.016 186548 DEBUG nova.compute.manager [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.016 186548 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.017 186548 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.017 186548 DEBUG oslo_concurrency.lockutils [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.017 186548 DEBUG nova.compute.manager [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.018 186548 WARNING nova.compute.manager [req-fbddb32e-bc41-4c0d-a9a1-fae655467380 req-06af2741-b4ae-4708-ba2d-917855647850 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state active and task_state shelving.
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.026 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.026 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.159 186548 INFO nova.virt.libvirt.driver [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Beginning cold snapshot process
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.457 186548 DEBUG nova.privsep.utils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.458 186548 DEBUG oslo_concurrency.processutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk /var/lib/nova/instances/snapshots/tmp7qgumw2h/30bd163777724d7980ecd81699c2836d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.697 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.939 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk 1073741824" returned: 0 in 0.913s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.940 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.940 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.993 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.994 186548 DEBUG nova.virt.disk.api [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:03:20 compute-0 nova_compute[186544]: 2025-11-22 08:03:20.995 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.049 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.050 186548 DEBUG nova.virt.disk.api [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.051 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.051 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Ensure instance console log exists: /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.051 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.052 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.052 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.054 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Start _get_guest_xml network_info=[{"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.060 186548 WARNING nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.065 186548 DEBUG nova.virt.libvirt.host [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.065 186548 DEBUG nova.virt.libvirt.host [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.069 186548 DEBUG nova.virt.libvirt.host [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.069 186548 DEBUG nova.virt.libvirt.host [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.071 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.071 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.071 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.071 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.072 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.072 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.072 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.072 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.073 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.073 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.073 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.073 186548 DEBUG nova.virt.hardware [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.074 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.101 186548 DEBUG nova.virt.libvirt.vif [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:02:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1370271558',display_name='tempest-ServerActionsTestJSON-server-1290637600',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1370271558',id=100,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:02:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-4vx40fvs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:03:19Z,user_data=None,user_id='b6cc24df1e344e369f2aff864f278268',uuid=886c510f-5bb2-455c-a7db-d83b3fc86ef2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.102 186548 DEBUG nova.network.os_vif_util [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.102 186548 DEBUG nova.network.os_vif_util [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.104 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <uuid>886c510f-5bb2-455c-a7db-d83b3fc86ef2</uuid>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <name>instance-00000064</name>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestJSON-server-1290637600</nova:name>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:03:21</nova:creationTime>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:03:21 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:03:21 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:03:21 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:03:21 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:03:21 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:03:21 compute-0 nova_compute[186544]:         <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 08:03:21 compute-0 nova_compute[186544]:         <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:03:21 compute-0 nova_compute[186544]:         <nova:port uuid="cebf92b8-e08f-4803-92db-8cb2866aa038">
Nov 22 08:03:21 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <system>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <entry name="serial">886c510f-5bb2-455c-a7db-d83b3fc86ef2</entry>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <entry name="uuid">886c510f-5bb2-455c-a7db-d83b3fc86ef2</entry>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </system>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <os>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   </os>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <features>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   </features>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.config"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:ac:64:2a"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <target dev="tapcebf92b8-e0"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/console.log" append="off"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <video>
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </video>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:03:21 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:03:21 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:03:21 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:03:21 compute-0 nova_compute[186544]: </domain>
Nov 22 08:03:21 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.105 186548 DEBUG nova.compute.manager [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Preparing to wait for external event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.106 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.106 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.106 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.107 186548 DEBUG nova.virt.libvirt.vif [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:02:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1370271558',display_name='tempest-ServerActionsTestJSON-server-1290637600',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1370271558',id=100,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:02:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-4vx40fvs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:03:19Z,user_data=None,user_id='b6cc24df1e344e369f2aff864f278268',uuid=886c510f-5bb2-455c-a7db-d83b3fc86ef2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.107 186548 DEBUG nova.network.os_vif_util [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.108 186548 DEBUG nova.network.os_vif_util [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.108 186548 DEBUG os_vif [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.109 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.109 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.110 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.112 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcebf92b8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.113 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcebf92b8-e0, col_values=(('external_ids', {'iface-id': 'cebf92b8-e08f-4803-92db-8cb2866aa038', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:64:2a', 'vm-uuid': '886c510f-5bb2-455c-a7db-d83b3fc86ef2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.116 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:21 compute-0 NetworkManager[55036]: <info>  [1763798601.1180] manager: (tapcebf92b8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.120 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.123 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.124 186548 INFO os_vif [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0')
Nov 22 08:03:21 compute-0 podman[230875]: 2025-11-22 08:03:21.253794882 +0000 UTC m=+1.815047863 container cleanup d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:03:21 compute-0 systemd[1]: libpod-conmon-d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f.scope: Deactivated successfully.
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.541 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.541 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.541 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No VIF found with MAC fa:16:3e:ac:64:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.542 186548 INFO nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Using config drive
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.559 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:21 compute-0 nova_compute[186544]: 2025-11-22 08:03:21.595 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'keypairs' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.008 186548 DEBUG nova.compute.manager [req-c4eec9f8-512f-4bc6-bd66-8675b5b9627b req-03f53459-3c20-4e8a-824c-cc8d01bb4d61 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.008 186548 DEBUG oslo_concurrency.lockutils [req-c4eec9f8-512f-4bc6-bd66-8675b5b9627b req-03f53459-3c20-4e8a-824c-cc8d01bb4d61 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.009 186548 DEBUG oslo_concurrency.lockutils [req-c4eec9f8-512f-4bc6-bd66-8675b5b9627b req-03f53459-3c20-4e8a-824c-cc8d01bb4d61 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.009 186548 DEBUG oslo_concurrency.lockutils [req-c4eec9f8-512f-4bc6-bd66-8675b5b9627b req-03f53459-3c20-4e8a-824c-cc8d01bb4d61 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.010 186548 DEBUG nova.compute.manager [req-c4eec9f8-512f-4bc6-bd66-8675b5b9627b req-03f53459-3c20-4e8a-824c-cc8d01bb4d61 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Processing event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.088 186548 INFO nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Creating config drive at /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.config
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.092 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiirlkoaj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.140 186548 DEBUG nova.compute.manager [req-52286b83-32ad-4b73-8a45-8d948106ed4c req-38b82272-edd1-4783-bb8b-d41e9850a945 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.141 186548 DEBUG oslo_concurrency.lockutils [req-52286b83-32ad-4b73-8a45-8d948106ed4c req-38b82272-edd1-4783-bb8b-d41e9850a945 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.141 186548 DEBUG oslo_concurrency.lockutils [req-52286b83-32ad-4b73-8a45-8d948106ed4c req-38b82272-edd1-4783-bb8b-d41e9850a945 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.142 186548 DEBUG oslo_concurrency.lockutils [req-52286b83-32ad-4b73-8a45-8d948106ed4c req-38b82272-edd1-4783-bb8b-d41e9850a945 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.142 186548 DEBUG nova.compute.manager [req-52286b83-32ad-4b73-8a45-8d948106ed4c req-38b82272-edd1-4783-bb8b-d41e9850a945 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.143 186548 WARNING nova.compute.manager [req-52286b83-32ad-4b73-8a45-8d948106ed4c req-38b82272-edd1-4783-bb8b-d41e9850a945 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state active and task_state shelving_image_pending_upload.
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.214 186548 DEBUG oslo_concurrency.processutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiirlkoaj" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:22 compute-0 kernel: tapcebf92b8-e0: entered promiscuous mode
Nov 22 08:03:22 compute-0 NetworkManager[55036]: <info>  [1763798602.2669] manager: (tapcebf92b8-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/203)
Nov 22 08:03:22 compute-0 systemd-udevd[230851]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.270 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:22 compute-0 ovn_controller[94843]: 2025-11-22T08:03:22Z|00428|binding|INFO|Claiming lport cebf92b8-e08f-4803-92db-8cb2866aa038 for this chassis.
Nov 22 08:03:22 compute-0 ovn_controller[94843]: 2025-11-22T08:03:22Z|00429|binding|INFO|cebf92b8-e08f-4803-92db-8cb2866aa038: Claiming fa:16:3e:ac:64:2a 10.100.0.10
Nov 22 08:03:22 compute-0 NetworkManager[55036]: <info>  [1763798602.2798] device (tapcebf92b8-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:03:22 compute-0 NetworkManager[55036]: <info>  [1763798602.2805] device (tapcebf92b8-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:03:22 compute-0 ovn_controller[94843]: 2025-11-22T08:03:22Z|00430|binding|INFO|Setting lport cebf92b8-e08f-4803-92db-8cb2866aa038 ovn-installed in OVS
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.288 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:22 compute-0 ovn_controller[94843]: 2025-11-22T08:03:22Z|00431|binding|INFO|Setting lport cebf92b8-e08f-4803-92db-8cb2866aa038 up in Southbound
Nov 22 08:03:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:22.292 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:64:2a 10.100.0.10'], port_security=['fa:16:3e:ac:64:2a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '886c510f-5bb2-455c-a7db-d83b3fc86ef2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6f667341-2539-478f-aedb-18e68df5c8e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cebf92b8-e08f-4803-92db-8cb2866aa038) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:22 compute-0 nova_compute[186544]: 2025-11-22 08:03:22.295 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:22 compute-0 systemd-machined[152872]: New machine qemu-57-instance-00000064.
Nov 22 08:03:22 compute-0 systemd[1]: Started Virtual Machine qemu-57-instance-00000064.
Nov 22 08:03:23 compute-0 podman[230964]: 2025-11-22 08:03:23.122527963 +0000 UTC m=+1.845039035 container remove d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.128 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[25686862-d33a-445f-9d02-5c68e15a3491]: (4, ('Sat Nov 22 08:03:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f (d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f)\nd21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f\nSat Nov 22 08:03:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f (d21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f)\nd21462b54a296e6f191876d6337be85ef87971099aabad1143e15b28d494d33f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.130 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f48078-165f-47a9-925e-d97933655a07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.131 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157128f-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.133 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:23 compute-0 kernel: tapc157128f-70: left promiscuous mode
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.157 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b1a239-591e-4ff0-902f-f4938e459d45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.170 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[61a4d023-3c25-4799-8c05-75da27232d42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.172 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4f1943db-9e30-45c3-931b-981cd1919f23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.188 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b1967eeb-4d1b-4b83-b925-bb6d3e85abd1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524453, 'reachable_time': 27945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231023, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.191 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.191 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[889dfd77-62bb-44e3-88a3-5cc9f06e615b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.192 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cebf92b8-e08f-4803-92db-8cb2866aa038 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:03:23 compute-0 systemd[1]: run-netns-ovnmeta\x2dc157128f\x2d75e8\x2d4afb\x2dab55\x2d34580af9585f.mount: Deactivated successfully.
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.194 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.213 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[67ee37ec-7379-4ac6-a7a2-002a529f5fea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.240 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf26222-ffab-4436-8d7e-c2bd67a5a89a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.243 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b5593e7c-0e08-4a05-9361-11522cc69661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.248 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 886c510f-5bb2-455c-a7db-d83b3fc86ef2 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.248 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798603.2476003, 886c510f-5bb2-455c-a7db-d83b3fc86ef2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.248 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] VM Started (Lifecycle Event)
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.250 186548 DEBUG nova.compute.manager [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.254 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.257 186548 INFO nova.virt.libvirt.driver [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance spawned successfully.
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.257 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.270 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[91ab8a67-85c1-4074-b0bf-82d61d50e310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.271 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.274 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.283 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.283 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.284 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.284 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.285 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.285 186548 DEBUG nova.virt.libvirt.driver [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.286 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[897b3adf-bd3a-4d81-939f-24f8dcec5849]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531212, 'reachable_time': 37248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231031, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.293 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.293 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798603.250526, 886c510f-5bb2-455c-a7db-d83b3fc86ef2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.293 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] VM Paused (Lifecycle Event)
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.304 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[721e9410-1c7a-444f-97c3-d290c69d1e5d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap165f7f23-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531224, 'tstamp': 531224}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231032, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap165f7f23-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531227, 'tstamp': 531227}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231032, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.307 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.314 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.314 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.314 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.315 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.316 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.317 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.318 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798603.2534888, 886c510f-5bb2-455c-a7db-d83b3fc86ef2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.318 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cebf92b8-e08f-4803-92db-8cb2866aa038 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.318 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] VM Resumed (Lifecycle Event)
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.320 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.333 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f79bf898-6fb2-4998-8790-4d7b4043c829]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.349 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.354 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.360 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[eb402215-8d08-4791-aac5-c06777ae2d4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.364 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0de4bfdb-9218-4a0f-8243-6df94f35c337]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.376 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.389 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[fd612727-863e-4444-8d7e-377d3b8ffe86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.399 186548 DEBUG nova.compute.manager [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.405 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c15f4c8c-6ba9-4f5b-a138-11a6ffdaf667]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531212, 'reachable_time': 37248, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231039, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.419 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2a791662-9f7a-4118-a75e-d8ab98dd16a7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap165f7f23-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531224, 'tstamp': 531224}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231040, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap165f7f23-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531227, 'tstamp': 531227}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231040, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.421 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.422 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.426 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.427 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.427 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.428 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:23.428 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.492 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.493 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.493 186548 DEBUG nova.objects.instance [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:03:23 compute-0 nova_compute[186544]: 2025-11-22 08:03:23.554 186548 DEBUG oslo_concurrency.lockutils [None req-557be63d-2936-485d-8bf6-8429b8fa5882 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.084 186548 DEBUG nova.compute.manager [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.085 186548 DEBUG oslo_concurrency.lockutils [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.085 186548 DEBUG oslo_concurrency.lockutils [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.085 186548 DEBUG oslo_concurrency.lockutils [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.085 186548 DEBUG nova.compute.manager [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] No waiting events found dispatching network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.085 186548 WARNING nova.compute.manager [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received unexpected event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 for instance with vm_state active and task_state None.
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.086 186548 DEBUG nova.compute.manager [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.086 186548 DEBUG oslo_concurrency.lockutils [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.086 186548 DEBUG oslo_concurrency.lockutils [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.086 186548 DEBUG oslo_concurrency.lockutils [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.086 186548 DEBUG nova.compute.manager [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] No waiting events found dispatching network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:24 compute-0 nova_compute[186544]: 2025-11-22 08:03:24.087 186548 WARNING nova.compute.manager [req-d3132634-3af0-4f24-9717-46f08ff65c65 req-c3276601-b7dd-481f-9c63-69bf7deb9160 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received unexpected event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 for instance with vm_state active and task_state None.
Nov 22 08:03:25 compute-0 podman[231043]: 2025-11-22 08:03:25.425053614 +0000 UTC m=+0.063506413 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:03:25 compute-0 podman[231044]: 2025-11-22 08:03:25.451189121 +0000 UTC m=+0.087439516 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:03:25 compute-0 nova_compute[186544]: 2025-11-22 08:03:25.698 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:26 compute-0 nova_compute[186544]: 2025-11-22 08:03:26.117 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:26 compute-0 nova_compute[186544]: 2025-11-22 08:03:26.732 186548 DEBUG oslo_concurrency.processutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk /var/lib/nova/instances/snapshots/tmp7qgumw2h/30bd163777724d7980ecd81699c2836d" returned: 0 in 6.274s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:26 compute-0 nova_compute[186544]: 2025-11-22 08:03:26.733 186548 INFO nova.virt.libvirt.driver [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Snapshot extracted, beginning image upload
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.296 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.296 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.296 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.297 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.297 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.304 186548 INFO nova.compute.manager [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Terminating instance
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.309 186548 DEBUG nova.compute.manager [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:03:28 compute-0 kernel: tapcebf92b8-e0 (unregistering): left promiscuous mode
Nov 22 08:03:28 compute-0 NetworkManager[55036]: <info>  [1763798608.3617] device (tapcebf92b8-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:03:28 compute-0 ovn_controller[94843]: 2025-11-22T08:03:28Z|00432|binding|INFO|Releasing lport cebf92b8-e08f-4803-92db-8cb2866aa038 from this chassis (sb_readonly=0)
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.380 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:28 compute-0 ovn_controller[94843]: 2025-11-22T08:03:28Z|00433|binding|INFO|Setting lport cebf92b8-e08f-4803-92db-8cb2866aa038 down in Southbound
Nov 22 08:03:28 compute-0 ovn_controller[94843]: 2025-11-22T08:03:28Z|00434|binding|INFO|Removing iface tapcebf92b8-e0 ovn-installed in OVS
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.384 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:28 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000064.scope: Deactivated successfully.
Nov 22 08:03:28 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000064.scope: Consumed 5.665s CPU time.
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.401 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:28 compute-0 systemd-machined[152872]: Machine qemu-57-instance-00000064 terminated.
Nov 22 08:03:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:28.419 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:64:2a 10.100.0.10'], port_security=['fa:16:3e:ac:64:2a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '886c510f-5bb2-455c-a7db-d83b3fc86ef2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6f667341-2539-478f-aedb-18e68df5c8e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cebf92b8-e08f-4803-92db-8cb2866aa038) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:28.420 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cebf92b8-e08f-4803-92db-8cb2866aa038 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:03:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:28.423 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:03:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:28.424 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b32423f1-96ef-4c1f-87a6-765d3a4f9f99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:28.424 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore
Nov 22 08:03:28 compute-0 podman[231085]: 2025-11-22 08:03:28.431379213 +0000 UTC m=+0.059488804 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 08:03:28 compute-0 podman[231088]: 2025-11-22 08:03:28.438334275 +0000 UTC m=+0.064005795 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:03:28 compute-0 kernel: tapcebf92b8-e0: entered promiscuous mode
Nov 22 08:03:28 compute-0 kernel: tapcebf92b8-e0 (unregistering): left promiscuous mode
Nov 22 08:03:28 compute-0 NetworkManager[55036]: <info>  [1763798608.5317] manager: (tapcebf92b8-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.537 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:28 compute-0 ovn_controller[94843]: 2025-11-22T08:03:28Z|00435|binding|INFO|Claiming lport cebf92b8-e08f-4803-92db-8cb2866aa038 for this chassis.
Nov 22 08:03:28 compute-0 ovn_controller[94843]: 2025-11-22T08:03:28Z|00436|binding|INFO|cebf92b8-e08f-4803-92db-8cb2866aa038: Claiming fa:16:3e:ac:64:2a 10.100.0.10
Nov 22 08:03:28 compute-0 ovn_controller[94843]: 2025-11-22T08:03:28Z|00437|if_status|INFO|Dropped 2 log messages in last 950 seconds (most recently, 950 seconds ago) due to excessive rate
Nov 22 08:03:28 compute-0 ovn_controller[94843]: 2025-11-22T08:03:28Z|00438|if_status|INFO|Not setting lport cebf92b8-e08f-4803-92db-8cb2866aa038 down as sb is readonly
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.556 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:28.559 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:64:2a 10.100.0.10'], port_security=['fa:16:3e:ac:64:2a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '886c510f-5bb2-455c-a7db-d83b3fc86ef2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6f667341-2539-478f-aedb-18e68df5c8e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cebf92b8-e08f-4803-92db-8cb2866aa038) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:28 compute-0 ovn_controller[94843]: 2025-11-22T08:03:28Z|00439|binding|INFO|Releasing lport cebf92b8-e08f-4803-92db-8cb2866aa038 from this chassis (sb_readonly=0)
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.574 186548 INFO nova.virt.libvirt.driver [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Instance destroyed successfully.
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.574 186548 DEBUG nova.objects.instance [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid 886c510f-5bb2-455c-a7db-d83b3fc86ef2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:28.581 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:64:2a 10.100.0.10'], port_security=['fa:16:3e:ac:64:2a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '886c510f-5bb2-455c-a7db-d83b3fc86ef2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6f667341-2539-478f-aedb-18e68df5c8e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=cebf92b8-e08f-4803-92db-8cb2866aa038) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.597 186548 DEBUG nova.virt.libvirt.vif [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:02:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1370271558',display_name='tempest-ServerActionsTestJSON-server-1290637600',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1370271558',id=100,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-4vx40fvs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virt
io',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:23Z,user_data=None,user_id='b6cc24df1e344e369f2aff864f278268',uuid=886c510f-5bb2-455c-a7db-d83b3fc86ef2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.598 186548 DEBUG nova.network.os_vif_util [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "cebf92b8-e08f-4803-92db-8cb2866aa038", "address": "fa:16:3e:ac:64:2a", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcebf92b8-e0", "ovs_interfaceid": "cebf92b8-e08f-4803-92db-8cb2866aa038", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.598 186548 DEBUG nova.network.os_vif_util [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.598 186548 DEBUG os_vif [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.600 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.601 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcebf92b8-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.602 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.604 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.610 186548 INFO os_vif [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:64:2a,bridge_name='br-int',has_traffic_filtering=True,id=cebf92b8-e08f-4803-92db-8cb2866aa038,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcebf92b8-e0')
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.611 186548 INFO nova.virt.libvirt.driver [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Deleting instance files /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2_del
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.611 186548 INFO nova.virt.libvirt.driver [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Deletion of /var/lib/nova/instances/886c510f-5bb2-455c-a7db-d83b3fc86ef2_del complete
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.640 186548 DEBUG nova.compute.manager [req-91ea32ba-f572-417b-9a28-9f3def51e0e2 req-567553b1-446b-4920-8798-23f76dcc7d92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-unplugged-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.640 186548 DEBUG oslo_concurrency.lockutils [req-91ea32ba-f572-417b-9a28-9f3def51e0e2 req-567553b1-446b-4920-8798-23f76dcc7d92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.640 186548 DEBUG oslo_concurrency.lockutils [req-91ea32ba-f572-417b-9a28-9f3def51e0e2 req-567553b1-446b-4920-8798-23f76dcc7d92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.641 186548 DEBUG oslo_concurrency.lockutils [req-91ea32ba-f572-417b-9a28-9f3def51e0e2 req-567553b1-446b-4920-8798-23f76dcc7d92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.641 186548 DEBUG nova.compute.manager [req-91ea32ba-f572-417b-9a28-9f3def51e0e2 req-567553b1-446b-4920-8798-23f76dcc7d92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] No waiting events found dispatching network-vif-unplugged-cebf92b8-e08f-4803-92db-8cb2866aa038 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.641 186548 DEBUG nova.compute.manager [req-91ea32ba-f572-417b-9a28-9f3def51e0e2 req-567553b1-446b-4920-8798-23f76dcc7d92 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-unplugged-cebf92b8-e08f-4803-92db-8cb2866aa038 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.674 186548 INFO nova.compute.manager [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.675 186548 DEBUG oslo.service.loopingcall [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.675 186548 DEBUG nova.compute.manager [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:03:28 compute-0 nova_compute[186544]: 2025-11-22 08:03:28.675 186548 DEBUG nova.network.neutron [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:03:28 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230658]: [NOTICE]   (230662) : haproxy version is 2.8.14-c23fe91
Nov 22 08:03:28 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230658]: [NOTICE]   (230662) : path to executable is /usr/sbin/haproxy
Nov 22 08:03:28 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230658]: [WARNING]  (230662) : Exiting Master process...
Nov 22 08:03:28 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230658]: [ALERT]    (230662) : Current worker (230664) exited with code 143 (Terminated)
Nov 22 08:03:28 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[230658]: [WARNING]  (230662) : All workers exited. Exiting... (0)
Nov 22 08:03:28 compute-0 systemd[1]: libpod-b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a.scope: Deactivated successfully.
Nov 22 08:03:28 compute-0 podman[231155]: 2025-11-22 08:03:28.862514248 +0000 UTC m=+0.354678753 container died b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 08:03:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a-userdata-shm.mount: Deactivated successfully.
Nov 22 08:03:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-6af0af91aa10e45f1a3b9ced8fb34bb505d0d127b898126a2ffff1cb9bf5c77d-merged.mount: Deactivated successfully.
Nov 22 08:03:29 compute-0 nova_compute[186544]: 2025-11-22 08:03:29.817 186548 INFO nova.virt.libvirt.driver [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Snapshot image upload complete
Nov 22 08:03:29 compute-0 nova_compute[186544]: 2025-11-22 08:03:29.818 186548 DEBUG nova.compute.manager [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:29 compute-0 nova_compute[186544]: 2025-11-22 08:03:29.940 186548 DEBUG nova.network.neutron [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:03:29 compute-0 nova_compute[186544]: 2025-11-22 08:03:29.979 186548 INFO nova.compute.manager [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Took 1.30 seconds to deallocate network for instance.
Nov 22 08:03:29 compute-0 nova_compute[186544]: 2025-11-22 08:03:29.995 186548 INFO nova.compute.manager [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Shelve offloading
Nov 22 08:03:29 compute-0 nova_compute[186544]: 2025-11-22 08:03:29.998 186548 DEBUG nova.compute.manager [req-2fd92b2f-880f-4573-9431-6af6be6596a2 req-ee2a2404-71bd-4a3e-921b-72db97fec16c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-deleted-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.009 186548 INFO nova.virt.libvirt.driver [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance destroyed successfully.
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.009 186548 DEBUG nova.compute.manager [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.011 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.012 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquired lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.012 186548 DEBUG nova.network.neutron [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.069 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.070 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.163 186548 DEBUG nova.compute.provider_tree [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.176 186548 DEBUG nova.scheduler.client.report [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.223 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.247 186548 INFO nova.scheduler.client.report [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Deleted allocations for instance 886c510f-5bb2-455c-a7db-d83b3fc86ef2
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.326 186548 DEBUG oslo_concurrency.lockutils [None req-65d31e2e-7456-4d9d-983f-25e3682f0b29 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:30 compute-0 podman[231155]: 2025-11-22 08:03:30.377220093 +0000 UTC m=+1.869384598 container cleanup b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 08:03:30 compute-0 systemd[1]: libpod-conmon-b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a.scope: Deactivated successfully.
Nov 22 08:03:30 compute-0 ovn_controller[94843]: 2025-11-22T08:03:30Z|00440|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.495 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:30 compute-0 ovn_controller[94843]: 2025-11-22T08:03:30Z|00441|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.686 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.699 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.723 186548 DEBUG nova.compute.manager [req-5c12465e-8e48-47a1-9054-37a65c585ec3 req-df0ad714-f3ef-4475-aa67-663e322c47b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.724 186548 DEBUG oslo_concurrency.lockutils [req-5c12465e-8e48-47a1-9054-37a65c585ec3 req-df0ad714-f3ef-4475-aa67-663e322c47b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.724 186548 DEBUG oslo_concurrency.lockutils [req-5c12465e-8e48-47a1-9054-37a65c585ec3 req-df0ad714-f3ef-4475-aa67-663e322c47b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.724 186548 DEBUG oslo_concurrency.lockutils [req-5c12465e-8e48-47a1-9054-37a65c585ec3 req-df0ad714-f3ef-4475-aa67-663e322c47b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "886c510f-5bb2-455c-a7db-d83b3fc86ef2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.725 186548 DEBUG nova.compute.manager [req-5c12465e-8e48-47a1-9054-37a65c585ec3 req-df0ad714-f3ef-4475-aa67-663e322c47b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] No waiting events found dispatching network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:30 compute-0 nova_compute[186544]: 2025-11-22 08:03:30.725 186548 WARNING nova.compute.manager [req-5c12465e-8e48-47a1-9054-37a65c585ec3 req-df0ad714-f3ef-4475-aa67-663e322c47b0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Received unexpected event network-vif-plugged-cebf92b8-e08f-4803-92db-8cb2866aa038 for instance with vm_state deleted and task_state None.
Nov 22 08:03:31 compute-0 podman[231195]: 2025-11-22 08:03:31.376411424 +0000 UTC m=+0.972686646 container remove b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.382 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[917c7f96-5cc2-4159-be51-8db0c3ccd584]: (4, ('Sat Nov 22 08:03:28 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a)\nb8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a\nSat Nov 22 08:03:30 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (b8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a)\nb8f7c83fbba06fdb5d9b53d68d6447d11cf1fad8f73e44c86c76481878d0cb2a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.383 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[087d52c9-a645-4a9d-8174-869b674c66e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.384 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:31 compute-0 nova_compute[186544]: 2025-11-22 08:03:31.386 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:31 compute-0 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 08:03:31 compute-0 nova_compute[186544]: 2025-11-22 08:03:31.398 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:31 compute-0 nova_compute[186544]: 2025-11-22 08:03:31.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.405 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c501cc70-e1d7-467b-8718-d21d2950a1ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.422 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c958eb54-1c9a-4e39-940a-be74908aee51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.424 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c344fb4a-6f42-44ea-ba95-f179ee249491]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.441 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[680d1df4-269e-43e7-96a0-faae9459f414]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531206, 'reachable_time': 33801, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231214, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.445 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.446 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[317dd163-da49-4ad4-923f-c5aba0e27d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.446 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cebf92b8-e08f-4803-92db-8cb2866aa038 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.447 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.448 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b8194daf-a662-4e02-9400-15de875cc528]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.449 103805 INFO neutron.agent.ovn.metadata.agent [-] Port cebf92b8-e08f-4803-92db-8cb2866aa038 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.449 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:03:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:31.450 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e51001f3-2b8f-4881-9419-74f567d63626]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:31 compute-0 nova_compute[186544]: 2025-11-22 08:03:31.726 186548 DEBUG nova.network.neutron [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:03:31 compute-0 nova_compute[186544]: 2025-11-22 08:03:31.770 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Releasing lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.625 186548 INFO nova.virt.libvirt.driver [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance destroyed successfully.
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.626 186548 DEBUG nova.objects.instance [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'resources' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.640 186548 DEBUG nova.virt.libvirt.vif [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:01:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-899820238',display_name='tempest-ServersNegativeTestJSON-server-899820238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-899820238',id=97,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-he9blv4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTestJSON-1872924472-project-member',shelved_at='2025-11-22T08:03:29.818091',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='165c19d9-e5a7-4c8d-9823-47ddc3418023'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:26Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=235ecf63-07fa-4f60-97e9-466450a50add,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.641 186548 DEBUG nova.network.os_vif_util [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.641 186548 DEBUG nova.network.os_vif_util [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.642 186548 DEBUG os_vif [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.643 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.644 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe9c07c3-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.645 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.647 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.649 186548 INFO os_vif [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0')
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.649 186548 INFO nova.virt.libvirt.driver [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Deleting instance files /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add_del
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.655 186548 INFO nova.virt.libvirt.driver [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Deletion of /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add_del complete
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.734 186548 DEBUG nova.compute.manager [req-8bab6674-2978-4c7a-bde0-96b33532f5ba req-489abd10-67fc-413a-abbe-b19146dd0b12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-changed-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.734 186548 DEBUG nova.compute.manager [req-8bab6674-2978-4c7a-bde0-96b33532f5ba req-489abd10-67fc-413a-abbe-b19146dd0b12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Refreshing instance network info cache due to event network-changed-fe9c07c3-e08f-4810-b699-6d6aa3f50cda. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.734 186548 DEBUG oslo_concurrency.lockutils [req-8bab6674-2978-4c7a-bde0-96b33532f5ba req-489abd10-67fc-413a-abbe-b19146dd0b12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.735 186548 DEBUG oslo_concurrency.lockutils [req-8bab6674-2978-4c7a-bde0-96b33532f5ba req-489abd10-67fc-413a-abbe-b19146dd0b12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.735 186548 DEBUG nova.network.neutron [req-8bab6674-2978-4c7a-bde0-96b33532f5ba req-489abd10-67fc-413a-abbe-b19146dd0b12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Refreshing network info cache for port fe9c07c3-e08f-4810-b699-6d6aa3f50cda _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.771 186548 INFO nova.scheduler.client.report [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Deleted allocations for instance 235ecf63-07fa-4f60-97e9-466450a50add
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.840 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.840 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.875 186548 DEBUG nova.compute.provider_tree [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.889 186548 DEBUG nova.scheduler.client.report [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.911 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:32 compute-0 nova_compute[186544]: 2025-11-22 08:03:32.991 186548 DEBUG oslo_concurrency.lockutils [None req-ba185f8a-f896-4144-8ff5-81da1ec2cd06 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 16.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:34 compute-0 nova_compute[186544]: 2025-11-22 08:03:34.479 186548 DEBUG nova.network.neutron [req-8bab6674-2978-4c7a-bde0-96b33532f5ba req-489abd10-67fc-413a-abbe-b19146dd0b12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updated VIF entry in instance network info cache for port fe9c07c3-e08f-4810-b699-6d6aa3f50cda. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:03:34 compute-0 nova_compute[186544]: 2025-11-22 08:03:34.480 186548 DEBUG nova.network.neutron [req-8bab6674-2978-4c7a-bde0-96b33532f5ba req-489abd10-67fc-413a-abbe-b19146dd0b12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": null, "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:03:34 compute-0 nova_compute[186544]: 2025-11-22 08:03:34.502 186548 DEBUG oslo_concurrency.lockutils [req-8bab6674-2978-4c7a-bde0-96b33532f5ba req-489abd10-67fc-413a-abbe-b19146dd0b12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:03:34 compute-0 nova_compute[186544]: 2025-11-22 08:03:34.588 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798599.5875635, 235ecf63-07fa-4f60-97e9-466450a50add => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:34 compute-0 nova_compute[186544]: 2025-11-22 08:03:34.589 186548 INFO nova.compute.manager [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Stopped (Lifecycle Event)
Nov 22 08:03:34 compute-0 nova_compute[186544]: 2025-11-22 08:03:34.616 186548 DEBUG nova.compute.manager [None req-6b92344d-3f6f-4e32-95fa-787277bfade5 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.636 186548 DEBUG nova.compute.manager [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.701 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.738 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.739 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.762 186548 DEBUG nova.objects.instance [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_requests' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.774 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.775 186548 INFO nova.compute.claims [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.775 186548 DEBUG nova.objects.instance [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.783 186548 DEBUG nova.objects.instance [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.817 186548 INFO nova.compute.resource_tracker [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating resource usage from migration 90f3f02c-becf-4e76-be2e-e639916871d2
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.817 186548 DEBUG nova.compute.resource_tracker [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Starting to track incoming migration 90f3f02c-becf-4e76-be2e-e639916871d2 with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.866 186548 DEBUG nova.compute.provider_tree [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.879 186548 DEBUG nova.scheduler.client.report [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.904 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:35 compute-0 nova_compute[186544]: 2025-11-22 08:03:35.904 186548 INFO nova.compute.manager [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Migrating
Nov 22 08:03:36 compute-0 podman[231219]: 2025-11-22 08:03:36.403220371 +0000 UTC m=+0.055739471 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:03:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:37.331 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:37.332 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:37.332 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.406 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.407 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.407 186548 INFO nova.compute.manager [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Unshelving
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.498 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.498 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.505 186548 DEBUG nova.objects.instance [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.519 186548 DEBUG nova.objects.instance [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.530 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.530 186548 INFO nova.compute.claims [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.645 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.663 186548 DEBUG nova.compute.provider_tree [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.674 186548 DEBUG nova.scheduler.client.report [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.693 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:37 compute-0 nova_compute[186544]: 2025-11-22 08:03:37.877 186548 INFO nova.network.neutron [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating port fe9c07c3-e08f-4810-b699-6d6aa3f50cda with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 22 08:03:38 compute-0 sshd-session[231239]: Accepted publickey for nova from 192.168.122.102 port 40362 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:03:38 compute-0 systemd-logind[821]: New session 51 of user nova.
Nov 22 08:03:38 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 08:03:38 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 08:03:38 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 08:03:38 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 08:03:38 compute-0 systemd[231243]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:03:38 compute-0 systemd[231243]: Queued start job for default target Main User Target.
Nov 22 08:03:38 compute-0 systemd[231243]: Created slice User Application Slice.
Nov 22 08:03:38 compute-0 systemd[231243]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 08:03:38 compute-0 systemd[231243]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 08:03:38 compute-0 systemd[231243]: Reached target Paths.
Nov 22 08:03:38 compute-0 systemd[231243]: Reached target Timers.
Nov 22 08:03:38 compute-0 systemd[231243]: Starting D-Bus User Message Bus Socket...
Nov 22 08:03:38 compute-0 systemd[231243]: Starting Create User's Volatile Files and Directories...
Nov 22 08:03:38 compute-0 systemd[231243]: Finished Create User's Volatile Files and Directories.
Nov 22 08:03:38 compute-0 systemd[231243]: Listening on D-Bus User Message Bus Socket.
Nov 22 08:03:38 compute-0 systemd[231243]: Reached target Sockets.
Nov 22 08:03:38 compute-0 systemd[231243]: Reached target Basic System.
Nov 22 08:03:38 compute-0 systemd[231243]: Reached target Main User Target.
Nov 22 08:03:38 compute-0 systemd[231243]: Startup finished in 145ms.
Nov 22 08:03:38 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 08:03:38 compute-0 systemd[1]: Started Session 51 of User nova.
Nov 22 08:03:38 compute-0 sshd-session[231239]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:03:38 compute-0 sshd-session[231258]: Received disconnect from 192.168.122.102 port 40362:11: disconnected by user
Nov 22 08:03:38 compute-0 sshd-session[231258]: Disconnected from user nova 192.168.122.102 port 40362
Nov 22 08:03:38 compute-0 sshd-session[231239]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:03:38 compute-0 systemd[1]: session-51.scope: Deactivated successfully.
Nov 22 08:03:38 compute-0 systemd-logind[821]: Session 51 logged out. Waiting for processes to exit.
Nov 22 08:03:38 compute-0 systemd-logind[821]: Removed session 51.
Nov 22 08:03:38 compute-0 nova_compute[186544]: 2025-11-22 08:03:38.788 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:03:38 compute-0 nova_compute[186544]: 2025-11-22 08:03:38.791 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquired lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:03:38 compute-0 nova_compute[186544]: 2025-11-22 08:03:38.792 186548 DEBUG nova.network.neutron [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:03:38 compute-0 sshd-session[231260]: Accepted publickey for nova from 192.168.122.102 port 40374 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:03:38 compute-0 systemd-logind[821]: New session 53 of user nova.
Nov 22 08:03:38 compute-0 systemd[1]: Started Session 53 of User nova.
Nov 22 08:03:38 compute-0 sshd-session[231260]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:03:38 compute-0 sshd-session[231263]: Received disconnect from 192.168.122.102 port 40374:11: disconnected by user
Nov 22 08:03:38 compute-0 sshd-session[231263]: Disconnected from user nova 192.168.122.102 port 40374
Nov 22 08:03:38 compute-0 sshd-session[231260]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:03:38 compute-0 systemd[1]: session-53.scope: Deactivated successfully.
Nov 22 08:03:38 compute-0 systemd-logind[821]: Session 53 logged out. Waiting for processes to exit.
Nov 22 08:03:38 compute-0 systemd-logind[821]: Removed session 53.
Nov 22 08:03:38 compute-0 nova_compute[186544]: 2025-11-22 08:03:38.912 186548 DEBUG nova.compute.manager [req-4b5239b7-3dc2-4c00-a1c8-e47c7eb40d6b req-78b6dba6-f285-48f0-82a5-6f8e3807982c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-changed-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:38 compute-0 nova_compute[186544]: 2025-11-22 08:03:38.913 186548 DEBUG nova.compute.manager [req-4b5239b7-3dc2-4c00-a1c8-e47c7eb40d6b req-78b6dba6-f285-48f0-82a5-6f8e3807982c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Refreshing instance network info cache due to event network-changed-fe9c07c3-e08f-4810-b699-6d6aa3f50cda. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:03:38 compute-0 nova_compute[186544]: 2025-11-22 08:03:38.913 186548 DEBUG oslo_concurrency.lockutils [req-4b5239b7-3dc2-4c00-a1c8-e47c7eb40d6b req-78b6dba6-f285-48f0-82a5-6f8e3807982c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.165 186548 DEBUG nova.network.neutron [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.188 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Releasing lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.189 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.190 186548 INFO nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Creating image(s)
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.190 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.191 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.191 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.192 186548 DEBUG nova.objects.instance [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.193 186548 DEBUG oslo_concurrency.lockutils [req-4b5239b7-3dc2-4c00-a1c8-e47c7eb40d6b req-78b6dba6-f285-48f0-82a5-6f8e3807982c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.193 186548 DEBUG nova.network.neutron [req-4b5239b7-3dc2-4c00-a1c8-e47c7eb40d6b req-78b6dba6-f285-48f0-82a5-6f8e3807982c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Refreshing network info cache for port fe9c07c3-e08f-4810-b699-6d6aa3f50cda _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.214 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.215 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:40 compute-0 nova_compute[186544]: 2025-11-22 08:03:40.703 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:42 compute-0 sshd-session[231265]: Accepted publickey for nova from 192.168.122.102 port 33978 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:03:42 compute-0 systemd-logind[821]: New session 54 of user nova.
Nov 22 08:03:42 compute-0 systemd[1]: Started Session 54 of User nova.
Nov 22 08:03:42 compute-0 sshd-session[231265]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:03:42 compute-0 podman[231267]: 2025-11-22 08:03:42.272894168 +0000 UTC m=+0.060070828 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:03:42 compute-0 podman[231269]: 2025-11-22 08:03:42.283419149 +0000 UTC m=+0.069472671 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.480 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.539 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.540 186548 DEBUG nova.virt.images [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] 165c19d9-e5a7-4c8d-9823-47ddc3418023 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.838 186548 DEBUG nova.privsep.utils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.839 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5.part /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.858 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.861 186548 DEBUG nova.network.neutron [req-4b5239b7-3dc2-4c00-a1c8-e47c7eb40d6b req-78b6dba6-f285-48f0-82a5-6f8e3807982c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updated VIF entry in instance network info cache for port fe9c07c3-e08f-4810-b699-6d6aa3f50cda. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.862 186548 DEBUG nova.network.neutron [req-4b5239b7-3dc2-4c00-a1c8-e47c7eb40d6b req-78b6dba6-f285-48f0-82a5-6f8e3807982c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:03:42 compute-0 nova_compute[186544]: 2025-11-22 08:03:42.876 186548 DEBUG oslo_concurrency.lockutils [req-4b5239b7-3dc2-4c00-a1c8-e47c7eb40d6b req-78b6dba6-f285-48f0-82a5-6f8e3807982c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:03:43 compute-0 sshd-session[231280]: Received disconnect from 192.168.122.102 port 33978:11: disconnected by user
Nov 22 08:03:43 compute-0 sshd-session[231280]: Disconnected from user nova 192.168.122.102 port 33978
Nov 22 08:03:43 compute-0 sshd-session[231265]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:03:43 compute-0 systemd[1]: session-54.scope: Deactivated successfully.
Nov 22 08:03:43 compute-0 systemd-logind[821]: Session 54 logged out. Waiting for processes to exit.
Nov 22 08:03:43 compute-0 systemd-logind[821]: Removed session 54.
Nov 22 08:03:43 compute-0 sshd-session[231322]: Accepted publickey for nova from 192.168.122.102 port 33982 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:03:43 compute-0 systemd-logind[821]: New session 55 of user nova.
Nov 22 08:03:43 compute-0 systemd[1]: Started Session 55 of User nova.
Nov 22 08:03:43 compute-0 sshd-session[231322]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:03:43 compute-0 sshd-session[231325]: Received disconnect from 192.168.122.102 port 33982:11: disconnected by user
Nov 22 08:03:43 compute-0 sshd-session[231325]: Disconnected from user nova 192.168.122.102 port 33982
Nov 22 08:03:43 compute-0 sshd-session[231322]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:03:43 compute-0 systemd[1]: session-55.scope: Deactivated successfully.
Nov 22 08:03:43 compute-0 systemd-logind[821]: Session 55 logged out. Waiting for processes to exit.
Nov 22 08:03:43 compute-0 systemd-logind[821]: Removed session 55.
Nov 22 08:03:43 compute-0 nova_compute[186544]: 2025-11-22 08:03:43.573 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798608.571792, 886c510f-5bb2-455c-a7db-d83b3fc86ef2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:43 compute-0 nova_compute[186544]: 2025-11-22 08:03:43.573 186548 INFO nova.compute.manager [-] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] VM Stopped (Lifecycle Event)
Nov 22 08:03:43 compute-0 nova_compute[186544]: 2025-11-22 08:03:43.592 186548 DEBUG nova.compute.manager [None req-2b86daf1-6fcf-4f7c-90ec-c5edd18584cc - - - - - -] [instance: 886c510f-5bb2-455c-a7db-d83b3fc86ef2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:43 compute-0 sshd-session[231327]: Accepted publickey for nova from 192.168.122.102 port 33994 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:03:43 compute-0 systemd-logind[821]: New session 56 of user nova.
Nov 22 08:03:43 compute-0 systemd[1]: Started Session 56 of User nova.
Nov 22 08:03:43 compute-0 sshd-session[231327]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:03:43 compute-0 sshd-session[231330]: Received disconnect from 192.168.122.102 port 33994:11: disconnected by user
Nov 22 08:03:43 compute-0 sshd-session[231330]: Disconnected from user nova 192.168.122.102 port 33994
Nov 22 08:03:43 compute-0 sshd-session[231327]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:03:43 compute-0 systemd[1]: session-56.scope: Deactivated successfully.
Nov 22 08:03:43 compute-0 systemd-logind[821]: Session 56 logged out. Waiting for processes to exit.
Nov 22 08:03:43 compute-0 systemd-logind[821]: Removed session 56.
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.173 186548 INFO nova.network.neutron [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating port a2f45e58-237f-4de0-8339-5f17a4ad3cfe with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.444 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5.part /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5.converted" returned: 0 in 1.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.454 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.510 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5.converted --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.511 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 4.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.524 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.578 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.580 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.581 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.605 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.665 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.666 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5,backing_fmt=raw /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.779 186548 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.780 186548 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.781 186548 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.782 186548 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.783 186548 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.783 186548 WARNING nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state resize_migrated.
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.784 186548 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.784 186548 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.785 186548 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.786 186548 DEBUG oslo_concurrency.lockutils [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.786 186548 DEBUG nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.786 186548 WARNING nova.compute.manager [req-27169a96-046d-47a5-a6b4-e388d73f7101 req-ff2c6a9c-52d9-4bc4-88ab-769b73230c4a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state active and task_state resize_migrated.
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.849 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.849 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.850 186548 DEBUG nova.network.neutron [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.995 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5,backing_fmt=raw /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk 1073741824" returned: 0 in 0.328s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.995 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:44 compute-0 nova_compute[186544]: 2025-11-22 08:03:44.996 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:45 compute-0 nova_compute[186544]: 2025-11-22 08:03:45.075 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:45 compute-0 nova_compute[186544]: 2025-11-22 08:03:45.078 186548 DEBUG nova.objects.instance [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:45 compute-0 nova_compute[186544]: 2025-11-22 08:03:45.095 186548 INFO nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Rebasing disk image.
Nov 22 08:03:45 compute-0 nova_compute[186544]: 2025-11-22 08:03:45.096 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:45 compute-0 nova_compute[186544]: 2025-11-22 08:03:45.149 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:45 compute-0 nova_compute[186544]: 2025-11-22 08:03:45.150 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:45 compute-0 nova_compute[186544]: 2025-11-22 08:03:45.705 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.736 186548 DEBUG nova.network.neutron [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.780 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.930 186548 DEBUG nova.compute.manager [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-changed-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.931 186548 DEBUG nova.compute.manager [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Refreshing instance network info cache due to event network-changed-a2f45e58-237f-4de0-8339-5f17a4ad3cfe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.931 186548 DEBUG oslo_concurrency.lockutils [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.931 186548 DEBUG oslo_concurrency.lockutils [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.932 186548 DEBUG nova.network.neutron [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Refreshing network info cache for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.966 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.968 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.968 186548 INFO nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Creating image(s)
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.969 186548 DEBUG nova.objects.instance [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'trusted_certs' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:46 compute-0 nova_compute[186544]: 2025-11-22 08:03:46.984 186548 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.039 186548 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.040 186548 DEBUG nova.virt.disk.api [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.041 186548 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.478 186548 DEBUG oslo_concurrency.processutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.479 186548 DEBUG nova.virt.disk.api [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.490 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.491 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Ensure instance console log exists: /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.492 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.493 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.493 186548 DEBUG oslo_concurrency.lockutils [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.497 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start _get_guest_xml network_info=[{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:df:95:59"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.503 186548 WARNING nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.509 186548 DEBUG nova.virt.libvirt.host [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.510 186548 DEBUG nova.virt.libvirt.host [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.513 186548 DEBUG nova.virt.libvirt.host [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.514 186548 DEBUG nova.virt.libvirt.host [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.515 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.515 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.516 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.516 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.516 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.517 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.517 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.517 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.517 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.518 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.518 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.518 186548 DEBUG nova.virt.hardware [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.518 186548 DEBUG nova.objects.instance [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'vcpu_model' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.537 186548 DEBUG nova.virt.libvirt.vif [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:03:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:df:95:59"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.537 186548 DEBUG nova.network.os_vif_util [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:df:95:59"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.538 186548 DEBUG nova.network.os_vif_util [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.540 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <uuid>eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</uuid>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <name>instance-0000005d</name>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <memory>196608</memory>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestJSON-server-1519356482</nova:name>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:03:47</nova:creationTime>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:flavor name="m1.micro">
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:memory>192</nova:memory>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:port uuid="a2f45e58-237f-4de0-8339-5f17a4ad3cfe">
Nov 22 08:03:47 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <system>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="serial">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="uuid">eb6b82cf-7eb5-4a69-9342-a5d3fb896e58</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </system>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <os>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </os>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <features>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </features>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk.config"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:df:95:59"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <target dev="tapa2f45e58-23"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/console.log" append="off"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <video>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </video>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:03:47 compute-0 nova_compute[186544]: </domain>
Nov 22 08:03:47 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.541 186548 DEBUG nova.virt.libvirt.vif [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:00:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:03:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:df:95:59"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.541 186548 DEBUG nova.network.os_vif_util [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:df:95:59"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.542 186548 DEBUG nova.network.os_vif_util [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.543 186548 DEBUG os_vif [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.543 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.544 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.544 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.547 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.547 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2f45e58-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.548 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2f45e58-23, col_values=(('external_ids', {'iface-id': 'a2f45e58-237f-4de0-8339-5f17a4ad3cfe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:95:59', 'vm-uuid': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.550 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 NetworkManager[55036]: <info>  [1763798627.5516] manager: (tapa2f45e58-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.553 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.557 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.558 186548 INFO os_vif [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.682 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.684 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.684 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No VIF found with MAC fa:16:3e:df:95:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.685 186548 INFO nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Using config drive
Nov 22 08:03:47 compute-0 kernel: tapa2f45e58-23: entered promiscuous mode
Nov 22 08:03:47 compute-0 NetworkManager[55036]: <info>  [1763798627.7432] manager: (tapa2f45e58-23): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Nov 22 08:03:47 compute-0 ovn_controller[94843]: 2025-11-22T08:03:47Z|00442|binding|INFO|Claiming lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe for this chassis.
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.744 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 ovn_controller[94843]: 2025-11-22T08:03:47Z|00443|binding|INFO|a2f45e58-237f-4de0-8339-5f17a4ad3cfe: Claiming fa:16:3e:df:95:59 10.100.0.8
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.748 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.755 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.756 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk" returned: 0 in 2.606s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.757 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.757 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Ensure instance console log exists: /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.758 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.758 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.758 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.760 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Start _get_guest_xml network_info=[{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='2b1138cdac3c5232dcea6f3351ca9bde',container_format='bare',created_at=2025-11-22T08:03:16Z,direct_url=<?>,disk_format='qcow2',id=165c19d9-e5a7-4c8d-9823-47ddc3418023,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-899820238-shelved',owner='d967f0cef958482c9711764882a146f3',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-11-22T08:03:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.761 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 NetworkManager[55036]: <info>  [1763798627.7624] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Nov 22 08:03:47 compute-0 NetworkManager[55036]: <info>  [1763798627.7631] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.766 186548 WARNING nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.771 186548 DEBUG nova.virt.libvirt.host [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.771 186548 DEBUG nova.virt.libvirt.host [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:03:47 compute-0 systemd-udevd[231376]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.775 186548 DEBUG nova.virt.libvirt.host [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.775 186548 DEBUG nova.virt.libvirt.host [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.776 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.776 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='2b1138cdac3c5232dcea6f3351ca9bde',container_format='bare',created_at=2025-11-22T08:03:16Z,direct_url=<?>,disk_format='qcow2',id=165c19d9-e5a7-4c8d-9823-47ddc3418023,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-899820238-shelved',owner='d967f0cef958482c9711764882a146f3',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-11-22T08:03:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.777 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.777 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.777 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.777 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.777 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.777 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.777 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.778 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.778 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.778 186548 DEBUG nova.virt.hardware [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.778 186548 DEBUG nova.objects.instance [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:47 compute-0 systemd-machined[152872]: New machine qemu-58-instance-0000005d.
Nov 22 08:03:47 compute-0 NetworkManager[55036]: <info>  [1763798627.7867] device (tapa2f45e58-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:03:47 compute-0 NetworkManager[55036]: <info>  [1763798627.7879] device (tapa2f45e58-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.791 186548 DEBUG nova.virt.libvirt.vif [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:01:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-899820238',display_name='tempest-ServersNegativeTestJSON-server-899820238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-899820238',id=97,image_ref='165c19d9-e5a7-4c8d-9823-47ddc3418023',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-he9blv4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTestJSON-1872924472-project-member',shelved_at='2025-11-22T08:03:29.818091',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='165c19d9-e5a7-4c8d-9823-47ddc3418023'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:03:37Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=235ecf63-07fa-4f60-97e9-466450a50add,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.792 186548 DEBUG nova.network.os_vif_util [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.793 186548 DEBUG nova.network.os_vif_util [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.793 186548 DEBUG nova.objects.instance [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:47 compute-0 systemd[1]: Started Virtual Machine qemu-58-instance-0000005d.
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.805 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <uuid>235ecf63-07fa-4f60-97e9-466450a50add</uuid>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <name>instance-00000061</name>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersNegativeTestJSON-server-899820238</nova:name>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:03:47</nova:creationTime>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:user uuid="989540cd5ede4a5184a08b8eb3de013d">tempest-ServersNegativeTestJSON-1872924472-project-member</nova:user>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:project uuid="d967f0cef958482c9711764882a146f3">tempest-ServersNegativeTestJSON-1872924472</nova:project>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="165c19d9-e5a7-4c8d-9823-47ddc3418023"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         <nova:port uuid="fe9c07c3-e08f-4810-b699-6d6aa3f50cda">
Nov 22 08:03:47 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <system>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="serial">235ecf63-07fa-4f60-97e9-466450a50add</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="uuid">235ecf63-07fa-4f60-97e9-466450a50add</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </system>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <os>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </os>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <features>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </features>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.config"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:1c:d6:3e"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <target dev="tapfe9c07c3-e0"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/console.log" append="off"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <video>
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </video>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:03:47 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:03:47 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:03:47 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:03:47 compute-0 nova_compute[186544]: </domain>
Nov 22 08:03:47 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.806 186548 DEBUG nova.compute.manager [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Preparing to wait for external event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.806 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.807 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.807 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.807 186548 DEBUG nova.virt.libvirt.vif [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:01:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-899820238',display_name='tempest-ServersNegativeTestJSON-server-899820238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-899820238',id=97,image_ref='165c19d9-e5a7-4c8d-9823-47ddc3418023',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:01:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-he9blv4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTestJSON-1872924472-project-member',shelved_at='2025-11-22T08:03:29.818091',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='165c19d9-e5a7-4c8d-9823-47ddc3418023'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:03:37Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=235ecf63-07fa-4f60-97e9-466450a50add,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.808 186548 DEBUG nova.network.os_vif_util [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.808 186548 DEBUG nova.network.os_vif_util [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.808 186548 DEBUG os_vif [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.809 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.809 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.810 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.812 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.812 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe9c07c3-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.812 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe9c07c3-e0, col_values=(('external_ids', {'iface-id': 'fe9c07c3-e08f-4810-b699-6d6aa3f50cda', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:d6:3e', 'vm-uuid': '235ecf63-07fa-4f60-97e9-466450a50add'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.813 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 NetworkManager[55036]: <info>  [1763798627.8148] manager: (tapfe9c07c3-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.816 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.876 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.878 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.879 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.891 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2d1eb980-7c62-4c23-ae10-f9e9670cd4c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.892 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.894 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.894 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1818b421-5549-4d93-8625-fc3152dbe1f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.895 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f941a091-2fa5-46f9-8bed-b6e3d5abf2ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.908 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.907 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4b6b13-281c-4704-b641-2335c9a28b7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.910 186548 INFO os_vif [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0')
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.923 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[261b65af-ec8e-4c07-ba89-29d90620ea89]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 ovn_controller[94843]: 2025-11-22T08:03:47Z|00444|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe ovn-installed in OVS
Nov 22 08:03:47 compute-0 ovn_controller[94843]: 2025-11-22T08:03:47Z|00445|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe up in Southbound
Nov 22 08:03:47 compute-0 nova_compute[186544]: 2025-11-22 08:03:47.947 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.962 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[677ed603-1f98-4d5f-ad6f-89792ba1fc0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:47 compute-0 systemd-udevd[231379]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:03:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:47.969 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[42b2ec15-7db5-4380-b6f4-e837401ebbc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:47 compute-0 NetworkManager[55036]: <info>  [1763798627.9709] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/210)
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.009 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ec6b9790-393a-45d0-804d-a3f99d740e14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.012 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6c3d61-6ce2-4fc3-9271-991177dd7fd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 NetworkManager[55036]: <info>  [1763798628.0469] device (tap165f7f23-d0): carrier: link connected
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.053 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[69e63dc3-72d7-4e8e-a495-0b1b578712f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.055 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.055 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.056 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] No VIF found with MAC fa:16:3e:1c:d6:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.056 186548 INFO nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Using config drive
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.071 186548 DEBUG nova.objects.instance [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.072 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c7418fb2-fc7d-41d0-be3b-a61044c54d88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536868, 'reachable_time': 32924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231415, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.095 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[58332fa7-9ca1-4a4f-8708-52da0b707260]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536868, 'tstamp': 536868}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231416, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.112 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4fbe57d5-6d96-4d9b-8c87-a29fe5e8fcd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 138], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536868, 'reachable_time': 32924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231418, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.144 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[588e10d4-896d-47b0-82ca-3732a04a6130]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.195 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798628.195502, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.196 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Resumed (Lifecycle Event)
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.197 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c5da22-25c8-4f1b-9472-2cd0bba77cdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.198 186548 DEBUG nova.compute.manager [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.199 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.199 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.200 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:48 compute-0 NetworkManager[55036]: <info>  [1763798628.2024] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Nov 22 08:03:48 compute-0 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.203 186548 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance running successfully.
Nov 22 08:03:48 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.206 186548 DEBUG nova.virt.libvirt.guest [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.206 186548 DEBUG nova.virt.libvirt.driver [None req-ad7872f9-9746-41c4-94d0-1d441fd0475b b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.208 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.209 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.210 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:48 compute-0 ovn_controller[94843]: 2025-11-22T08:03:48Z|00446|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.223 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.224 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.227 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.227 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.228 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.228 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[db1274c6-1109-4d57-8859-7259607e7a13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.229 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:03:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:48.229 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.247 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.248 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798628.1966763, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.248 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Started (Lifecycle Event)
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.268 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.279 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.296 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.453 186548 DEBUG nova.objects.instance [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'keypairs' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:48 compute-0 podman[231459]: 2025-11-22 08:03:48.569971009 +0000 UTC m=+0.028271182 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.744 186548 DEBUG nova.network.neutron [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updated VIF entry in instance network info cache for port a2f45e58-237f-4de0-8339-5f17a4ad3cfe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.745 186548 DEBUG nova.network.neutron [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:03:48 compute-0 podman[231459]: 2025-11-22 08:03:48.750824697 +0000 UTC m=+0.209124770 container create cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:03:48 compute-0 systemd[1]: Started libpod-conmon-cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e.scope.
Nov 22 08:03:48 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:03:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b1a0022075694266c7131d75b6e3c9a2931a70f221bab7d7bf36a21d6a933df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:03:48 compute-0 nova_compute[186544]: 2025-11-22 08:03:48.856 186548 DEBUG oslo_concurrency.lockutils [req-02362e0c-a00a-4843-99c2-e73f52fe9e1f req-c2b12d1e-ca50-4e0d-9c36-4dcd7fcf3f0b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:03:48 compute-0 podman[231459]: 2025-11-22 08:03:48.936047303 +0000 UTC m=+0.394347406 container init cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:03:48 compute-0 podman[231459]: 2025-11-22 08:03:48.941759344 +0000 UTC m=+0.400059427 container start cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:03:48 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[231475]: [NOTICE]   (231479) : New worker (231481) forked
Nov 22 08:03:48 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[231475]: [NOTICE]   (231479) : Loading success.
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.190 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.259 186548 INFO nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Creating config drive at /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.config
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.264 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt8nvpowe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.286 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.350 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.351 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.396 186548 DEBUG oslo_concurrency.processutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt8nvpowe" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.407 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:49 compute-0 NetworkManager[55036]: <info>  [1763798629.4553] manager: (tapfe9c07c3-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Nov 22 08:03:49 compute-0 systemd-udevd[231402]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:03:49 compute-0 kernel: tapfe9c07c3-e0: entered promiscuous mode
Nov 22 08:03:49 compute-0 ovn_controller[94843]: 2025-11-22T08:03:49Z|00447|binding|INFO|Claiming lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda for this chassis.
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.461 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:49 compute-0 ovn_controller[94843]: 2025-11-22T08:03:49Z|00448|binding|INFO|fe9c07c3-e08f-4810-b699-6d6aa3f50cda: Claiming fa:16:3e:1c:d6:3e 10.100.0.8
Nov 22 08:03:49 compute-0 NetworkManager[55036]: <info>  [1763798629.4718] device (tapfe9c07c3-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:03:49 compute-0 NetworkManager[55036]: <info>  [1763798629.4727] device (tapfe9c07c3-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:03:49 compute-0 ovn_controller[94843]: 2025-11-22T08:03:49Z|00449|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda ovn-installed in OVS
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.477 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.479 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:49 compute-0 systemd-machined[152872]: New machine qemu-59-instance-00000061.
Nov 22 08:03:49 compute-0 systemd[1]: Started Virtual Machine qemu-59-instance-00000061.
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.529 186548 DEBUG nova.compute.manager [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.530 186548 DEBUG oslo_concurrency.lockutils [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.531 186548 DEBUG oslo_concurrency.lockutils [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.531 186548 DEBUG oslo_concurrency.lockutils [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.531 186548 DEBUG nova.compute.manager [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.532 186548 WARNING nova.compute.manager [req-6c8b15bb-3885-4f2c-b994-7b5a97eedee1 req-674e0f16-c68f-4407-888b-1fa3c2e08974 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state resized and task_state None.
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.585 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:49 compute-0 ovn_controller[94843]: 2025-11-22T08:03:49Z|00450|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda up in Southbound
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.592 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:d6:3e 10.100.0.8'], port_security=['fa:16:3e:1c:d6:3e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157128f-75e8-4afb-ab55-34580af9585f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd967f0cef958482c9711764882a146f3', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'db55d655-ec9a-40ef-9e54-3247c3ea4f75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c031f4-6b41-4ee7-af4f-a9218d9b390c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=fe9c07c3-e08f-4810-b699-6d6aa3f50cda) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.593 103805 INFO neutron.agent.ovn.metadata.agent [-] Port fe9c07c3-e08f-4810-b699-6d6aa3f50cda in datapath c157128f-75e8-4afb-ab55-34580af9585f bound to our chassis
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.595 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.605 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e66d5dde-3598-462a-ae1c-67cdad66b8ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.606 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc157128f-71 in ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.609 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc157128f-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.609 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8baa02b9-7a32-4fd6-b4b2-ed3d0e295dff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.610 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e0cb8818-8559-4b09-ae59-74c5e8d80c3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.622 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[b29f118f-5d03-4b13-a083-c415d8d6484b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.645 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ec7b89-9eb5-496f-b838-41e2b874a9b6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.655 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.656 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.685 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[42f02b77-e294-4cca-a55b-8abd8b5ec0e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.691 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[64d33278-9369-41c5-81c6-2e460101314d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 NetworkManager[55036]: <info>  [1763798629.6921] manager: (tapc157128f-70): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.715 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.735 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0a40ee6e-349e-45fb-a2bc-2dbe48588ee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.740 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b5286b4b-aa41-4bce-ac82-0feb9fc0d1a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 NetworkManager[55036]: <info>  [1763798629.7686] device (tapc157128f-70): carrier: link connected
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.774 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[dc9cf711-8316-4b71-9f7b-956b127b176d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.803 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[25f173c6-7869-48b2-89d9-54536d87d14d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157128f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:41:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537040, 'reachable_time': 27806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231538, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.821 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[84a349bb-13f4-4989-9853-c0b091b9f151]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:410b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537040, 'tstamp': 537040}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231539, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.850 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[709a5ac6-695e-43a4-a9b2-7dbc19eb1122]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157128f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:41:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537040, 'reachable_time': 27806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231540, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.888 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3026f3e8-a0b9-4c07-93a7-89194d007ca2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.902 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.903 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5645MB free_disk=73.15520095825195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.904 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.904 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.947 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ed20ae-8b15-4649-89b3-af3d166afabd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.948 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157128f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.949 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.949 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc157128f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:49 compute-0 NetworkManager[55036]: <info>  [1763798629.9519] manager: (tapc157128f-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Nov 22 08:03:49 compute-0 kernel: tapc157128f-70: entered promiscuous mode
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.953 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.954 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc157128f-70, col_values=(('external_ids', {'iface-id': 'a61c8ae7-262d-45c7-859e-6a4502225b00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:03:49 compute-0 ovn_controller[94843]: 2025-11-22T08:03:49Z|00451|binding|INFO|Releasing lport a61c8ae7-262d-45c7-859e-6a4502225b00 from this chassis (sb_readonly=0)
Nov 22 08:03:49 compute-0 nova_compute[186544]: 2025-11-22 08:03:49.967 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.968 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.968 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c95e4cdc-85ce-450d-b35c-5bd72f20f57a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.969 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:03:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:03:49.970 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'env', 'PROCESS_TAG=haproxy-c157128f-75e8-4afb-ab55-34580af9585f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c157128f-75e8-4afb-ab55-34580af9585f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.310 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Applying migration context for instance eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 as it has an incoming, in-progress migration 90f3f02c-becf-4e76-be2e-e639916871d2. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.311 186548 INFO nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating resource usage from migration 90f3f02c-becf-4e76-be2e-e639916871d2
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.378 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798630.3777683, 235ecf63-07fa-4f60-97e9-466450a50add => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.379 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Started (Lifecycle Event)
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.412 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.417 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798630.3785615, 235ecf63-07fa-4f60-97e9-466450a50add => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.418 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Paused (Lifecycle Event)
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.420 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.421 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 235ecf63-07fa-4f60-97e9-466450a50add actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.421 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.421 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:03:50 compute-0 podman[231578]: 2025-11-22 08:03:50.334445378 +0000 UTC m=+0.025189474 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.442 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.446 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.465 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.523 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.533 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:03:50 compute-0 podman[231578]: 2025-11-22 08:03:50.576252765 +0000 UTC m=+0.266996651 container create 62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.604 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.605 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:50 compute-0 systemd[1]: Started libpod-conmon-62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c.scope.
Nov 22 08:03:50 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:03:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b006fe062981e9307ef85fc3f7dca0219de3cf5d0bda653ba2245d36c1c7558c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:03:50 compute-0 nova_compute[186544]: 2025-11-22 08:03:50.706 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:50 compute-0 podman[231578]: 2025-11-22 08:03:50.837589816 +0000 UTC m=+0.528333722 container init 62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:03:50 compute-0 podman[231578]: 2025-11-22 08:03:50.845247696 +0000 UTC m=+0.535991582 container start 62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:03:50 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[231594]: [NOTICE]   (231598) : New worker (231600) forked
Nov 22 08:03:50 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[231594]: [NOTICE]   (231598) : Loading success.
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.638 186548 DEBUG nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.639 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.639 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.639 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.640 186548 DEBUG nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.640 186548 WARNING nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state resized and task_state None.
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.640 186548 DEBUG nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.640 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.641 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.641 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.641 186548 DEBUG nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Processing event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.641 186548 DEBUG nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.641 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.642 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.642 186548 DEBUG oslo_concurrency.lockutils [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.642 186548 DEBUG nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.642 186548 WARNING nova.compute.manager [req-e17be5b7-32bc-45e2-9c6b-d89226ecdbb7 req-c56e912a-2adc-406c-b97e-deb03c239af0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state shelved_offloaded and task_state spawning.
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.643 186548 DEBUG nova.compute.manager [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.647 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798631.6476352, 235ecf63-07fa-4f60-97e9-466450a50add => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.648 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Resumed (Lifecycle Event)
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.650 186548 DEBUG nova.virt.libvirt.driver [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.652 186548 INFO nova.virt.libvirt.driver [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance spawned successfully.
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.664 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.670 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:03:51 compute-0 nova_compute[186544]: 2025-11-22 08:03:51.687 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:03:52 compute-0 nova_compute[186544]: 2025-11-22 08:03:52.606 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:03:52 compute-0 nova_compute[186544]: 2025-11-22 08:03:52.606 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:03:52 compute-0 nova_compute[186544]: 2025-11-22 08:03:52.606 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:03:52 compute-0 nova_compute[186544]: 2025-11-22 08:03:52.815 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:53 compute-0 nova_compute[186544]: 2025-11-22 08:03:53.759 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:03:53 compute-0 nova_compute[186544]: 2025-11-22 08:03:53.759 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:03:53 compute-0 nova_compute[186544]: 2025-11-22 08:03:53.759 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:03:53 compute-0 nova_compute[186544]: 2025-11-22 08:03:53.760 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:03:53 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 08:03:53 compute-0 systemd[231243]: Activating special unit Exit the Session...
Nov 22 08:03:53 compute-0 systemd[231243]: Stopped target Main User Target.
Nov 22 08:03:53 compute-0 systemd[231243]: Stopped target Basic System.
Nov 22 08:03:53 compute-0 systemd[231243]: Stopped target Paths.
Nov 22 08:03:53 compute-0 systemd[231243]: Stopped target Sockets.
Nov 22 08:03:53 compute-0 systemd[231243]: Stopped target Timers.
Nov 22 08:03:53 compute-0 systemd[231243]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 08:03:53 compute-0 systemd[231243]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 08:03:53 compute-0 systemd[231243]: Closed D-Bus User Message Bus Socket.
Nov 22 08:03:53 compute-0 systemd[231243]: Stopped Create User's Volatile Files and Directories.
Nov 22 08:03:53 compute-0 systemd[231243]: Removed slice User Application Slice.
Nov 22 08:03:53 compute-0 systemd[231243]: Reached target Shutdown.
Nov 22 08:03:53 compute-0 systemd[231243]: Finished Exit the Session.
Nov 22 08:03:53 compute-0 systemd[231243]: Reached target Exit the Session.
Nov 22 08:03:53 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 08:03:53 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 08:03:53 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 08:03:53 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 08:03:53 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 08:03:53 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 08:03:53 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 08:03:55 compute-0 nova_compute[186544]: 2025-11-22 08:03:55.574 186548 DEBUG nova.compute.manager [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:03:55 compute-0 nova_compute[186544]: 2025-11-22 08:03:55.709 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:55 compute-0 nova_compute[186544]: 2025-11-22 08:03:55.787 186548 DEBUG oslo_concurrency.lockutils [None req-a5327dd2-ccd1-4346-8c58-f8bbab926994 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 18.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:03:56 compute-0 podman[231611]: 2025-11-22 08:03:56.425776604 +0000 UTC m=+0.067133024 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 08:03:56 compute-0 podman[231612]: 2025-11-22 08:03:56.465074327 +0000 UTC m=+0.098685614 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.816 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.877 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [{"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.904 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.904 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.904 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.905 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.905 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.905 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.905 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:03:57 compute-0 nova_compute[186544]: 2025-11-22 08:03:57.906 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:03:59 compute-0 podman[231658]: 2025-11-22 08:03:59.418509687 +0000 UTC m=+0.058016729 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:03:59 compute-0 podman[231657]: 2025-11-22 08:03:59.418988778 +0000 UTC m=+0.060200682 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:04:00 compute-0 nova_compute[186544]: 2025-11-22 08:04:00.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:00 compute-0 nova_compute[186544]: 2025-11-22 08:04:00.711 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.573 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.574 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.574 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.574 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.575 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.583 186548 INFO nova.compute.manager [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Terminating instance
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.591 186548 DEBUG nova.compute.manager [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:04:02 compute-0 kernel: tapa2f45e58-23 (unregistering): left promiscuous mode
Nov 22 08:04:02 compute-0 NetworkManager[55036]: <info>  [1763798642.6246] device (tapa2f45e58-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:04:02 compute-0 ovn_controller[94843]: 2025-11-22T08:04:02Z|00452|binding|INFO|Releasing lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe from this chassis (sb_readonly=0)
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.632 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:02 compute-0 ovn_controller[94843]: 2025-11-22T08:04:02Z|00453|binding|INFO|Setting lport a2f45e58-237f-4de0-8339-5f17a4ad3cfe down in Southbound
Nov 22 08:04:02 compute-0 ovn_controller[94843]: 2025-11-22T08:04:02Z|00454|binding|INFO|Removing iface tapa2f45e58-23 ovn-installed in OVS
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.642 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.646 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:02.659 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:95:59 10.100.0.8'], port_security=['fa:16:3e:df:95:59 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'eb6b82cf-7eb5-4a69-9342-a5d3fb896e58', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a2f45e58-237f-4de0-8339-5f17a4ad3cfe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:04:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:02.661 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a2f45e58-237f-4de0-8339-5f17a4ad3cfe in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:04:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:02.662 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:04:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:02.664 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce3206e-800f-4817-a467-776bc1e8cc5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:02.669 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore
Nov 22 08:04:02 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Nov 22 08:04:02 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000005d.scope: Consumed 14.685s CPU time.
Nov 22 08:04:02 compute-0 systemd-machined[152872]: Machine qemu-58-instance-0000005d terminated.
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.818 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.873 186548 INFO nova.virt.libvirt.driver [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Instance destroyed successfully.
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.874 186548 DEBUG nova.objects.instance [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.888 186548 DEBUG nova.virt.libvirt.vif [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:00:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1519356482',display_name='tempest-ServerActionsTestJSON-server-1519356482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1519356482',id=93,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-0hew71dq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:03:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=eb6b82cf-7eb5-4a69-9342-a5d3fb896e58,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.888 186548 DEBUG nova.network.os_vif_util [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "address": "fa:16:3e:df:95:59", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2f45e58-23", "ovs_interfaceid": "a2f45e58-237f-4de0-8339-5f17a4ad3cfe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.890 186548 DEBUG nova.network.os_vif_util [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.890 186548 DEBUG os_vif [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.892 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.892 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2f45e58-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.895 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.897 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.899 186548 INFO os_vif [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:df:95:59,bridge_name='br-int',has_traffic_filtering=True,id=a2f45e58-237f-4de0-8339-5f17a4ad3cfe,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2f45e58-23')
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.899 186548 INFO nova.virt.libvirt.driver [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Deleting instance files /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_del
Nov 22 08:04:02 compute-0 nova_compute[186544]: 2025-11-22 08:04:02.905 186548 INFO nova.virt.libvirt.driver [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Deletion of /var/lib/nova/instances/eb6b82cf-7eb5-4a69-9342-a5d3fb896e58_del complete
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.032 186548 INFO nova.compute.manager [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Took 0.44 seconds to destroy the instance on the hypervisor.
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.033 186548 DEBUG oslo.service.loopingcall [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.033 186548 DEBUG nova.compute.manager [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.033 186548 DEBUG nova.network.neutron [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.253 186548 DEBUG nova.compute.manager [req-7663596c-b5fc-415c-bdce-3b0b1f147f4a req-4961959e-8182-48fd-918a-430288b23296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.253 186548 DEBUG oslo_concurrency.lockutils [req-7663596c-b5fc-415c-bdce-3b0b1f147f4a req-4961959e-8182-48fd-918a-430288b23296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.253 186548 DEBUG oslo_concurrency.lockutils [req-7663596c-b5fc-415c-bdce-3b0b1f147f4a req-4961959e-8182-48fd-918a-430288b23296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.254 186548 DEBUG oslo_concurrency.lockutils [req-7663596c-b5fc-415c-bdce-3b0b1f147f4a req-4961959e-8182-48fd-918a-430288b23296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.254 186548 DEBUG nova.compute.manager [req-7663596c-b5fc-415c-bdce-3b0b1f147f4a req-4961959e-8182-48fd-918a-430288b23296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:04:03 compute-0 nova_compute[186544]: 2025-11-22 08:04:03.254 186548 DEBUG nova.compute.manager [req-7663596c-b5fc-415c-bdce-3b0b1f147f4a req-4961959e-8182-48fd-918a-430288b23296 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-unplugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:04:03 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[231475]: [NOTICE]   (231479) : haproxy version is 2.8.14-c23fe91
Nov 22 08:04:03 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[231475]: [NOTICE]   (231479) : path to executable is /usr/sbin/haproxy
Nov 22 08:04:03 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[231475]: [WARNING]  (231479) : Exiting Master process...
Nov 22 08:04:03 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[231475]: [ALERT]    (231479) : Current worker (231481) exited with code 143 (Terminated)
Nov 22 08:04:03 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[231475]: [WARNING]  (231479) : All workers exited. Exiting... (0)
Nov 22 08:04:03 compute-0 systemd[1]: libpod-cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e.scope: Deactivated successfully.
Nov 22 08:04:03 compute-0 podman[231723]: 2025-11-22 08:04:03.455147287 +0000 UTC m=+0.688400627 container died cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:04:04 compute-0 nova_compute[186544]: 2025-11-22 08:04:04.123 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:04.124 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:04:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-5b1a0022075694266c7131d75b6e3c9a2931a70f221bab7d7bf36a21d6a933df-merged.mount: Deactivated successfully.
Nov 22 08:04:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e-userdata-shm.mount: Deactivated successfully.
Nov 22 08:04:04 compute-0 podman[231723]: 2025-11-22 08:04:04.88144344 +0000 UTC m=+2.114696760 container cleanup cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:04:04 compute-0 systemd[1]: libpod-conmon-cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e.scope: Deactivated successfully.
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.074 186548 DEBUG nova.network.neutron [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.110 186548 INFO nova.compute.manager [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Took 2.08 seconds to deallocate network for instance.
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.261 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.262 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.430 186548 DEBUG nova.compute.provider_tree [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.445 186548 DEBUG nova.scheduler.client.report [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.544 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.596 186548 DEBUG nova.compute.manager [req-4d34a9ee-a4ad-4a95-aef7-be9a25961723 req-9b510ee3-5e2d-4b02-b1a9-cf822e2c72be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-deleted-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.664 186548 INFO nova.scheduler.client.report [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Deleted allocations for instance eb6b82cf-7eb5-4a69-9342-a5d3fb896e58
Nov 22 08:04:05 compute-0 podman[231768]: 2025-11-22 08:04:05.677458093 +0000 UTC m=+0.763107381 container remove cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.682 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f4773385-8f7e-44cf-bd34-e26b26443143]: (4, ('Sat Nov 22 08:04:02 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e)\ncc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e\nSat Nov 22 08:04:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (cc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e)\ncc0c45666c697e8d9a9af2e2ed12cce0599cfc04a467096db0330bb82a2c380e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.684 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[17a0a5b6-6b89-4e27-a440-adeabb80a350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.685 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.688 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:05 compute-0 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.707 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[31010a4e-3580-4df9-935b-5f69fa4c1290]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.705 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.718 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.734 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[48ca88e6-377b-4d34-beb0-4e3fa7c24aed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.736 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[27b969eb-70b9-46f2-8485-b4ee906c75fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.754 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[74340c08-3c63-4b19-ae64-3af40ed0efdc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536859, 'reachable_time': 39975, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231782, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.760 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.761 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[c516eff1-249e-4a17-9bb4-383604d9919a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:05.761 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.781 186548 DEBUG nova.compute.manager [req-565c14ef-0c66-458f-bbf7-77c0503bc905 req-7d739634-a87b-42d5-8fa9-2405aaadfd7f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.782 186548 DEBUG oslo_concurrency.lockutils [req-565c14ef-0c66-458f-bbf7-77c0503bc905 req-7d739634-a87b-42d5-8fa9-2405aaadfd7f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.782 186548 DEBUG oslo_concurrency.lockutils [req-565c14ef-0c66-458f-bbf7-77c0503bc905 req-7d739634-a87b-42d5-8fa9-2405aaadfd7f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.782 186548 DEBUG oslo_concurrency.lockutils [req-565c14ef-0c66-458f-bbf7-77c0503bc905 req-7d739634-a87b-42d5-8fa9-2405aaadfd7f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.782 186548 DEBUG nova.compute.manager [req-565c14ef-0c66-458f-bbf7-77c0503bc905 req-7d739634-a87b-42d5-8fa9-2405aaadfd7f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] No waiting events found dispatching network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.782 186548 WARNING nova.compute.manager [req-565c14ef-0c66-458f-bbf7-77c0503bc905 req-7d739634-a87b-42d5-8fa9-2405aaadfd7f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Received unexpected event network-vif-plugged-a2f45e58-237f-4de0-8339-5f17a4ad3cfe for instance with vm_state deleted and task_state None.
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.852 186548 DEBUG oslo_concurrency.lockutils [None req-0bb7b4f5-5fca-45f1-9ca3-4359144a6112 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "eb6b82cf-7eb5-4a69-9342-a5d3fb896e58" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:05 compute-0 nova_compute[186544]: 2025-11-22 08:04:05.869 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:07 compute-0 nova_compute[186544]: 2025-11-22 08:04:07.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:07 compute-0 nova_compute[186544]: 2025-11-22 08:04:07.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:04:07 compute-0 nova_compute[186544]: 2025-11-22 08:04:07.178 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:04:07 compute-0 podman[231792]: 2025-11-22 08:04:07.823248572 +0000 UTC m=+0.471336505 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:04:07 compute-0 nova_compute[186544]: 2025-11-22 08:04:07.895 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:07 compute-0 nova_compute[186544]: 2025-11-22 08:04:07.963 186548 DEBUG nova.objects.instance [None req-03da7761-7faf-464e-8a7d-877d2818413a 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:04:08 compute-0 nova_compute[186544]: 2025-11-22 08:04:08.000 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798647.9997685, 235ecf63-07fa-4f60-97e9-466450a50add => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:04:08 compute-0 nova_compute[186544]: 2025-11-22 08:04:08.000 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Paused (Lifecycle Event)
Nov 22 08:04:08 compute-0 nova_compute[186544]: 2025-11-22 08:04:08.026 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:04:08 compute-0 nova_compute[186544]: 2025-11-22 08:04:08.030 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:04:08 compute-0 nova_compute[186544]: 2025-11-22 08:04:08.052 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 22 08:04:08 compute-0 nova_compute[186544]: 2025-11-22 08:04:08.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:10 compute-0 kernel: tapfe9c07c3-e0 (unregistering): left promiscuous mode
Nov 22 08:04:10 compute-0 NetworkManager[55036]: <info>  [1763798650.4887] device (tapfe9c07c3-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:04:10 compute-0 ovn_controller[94843]: 2025-11-22T08:04:10Z|00455|binding|INFO|Releasing lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda from this chassis (sb_readonly=0)
Nov 22 08:04:10 compute-0 ovn_controller[94843]: 2025-11-22T08:04:10Z|00456|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda down in Southbound
Nov 22 08:04:10 compute-0 ovn_controller[94843]: 2025-11-22T08:04:10Z|00457|binding|INFO|Removing iface tapfe9c07c3-e0 ovn-installed in OVS
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.496 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.497 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.514 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:10.543 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:d6:3e 10.100.0.8'], port_security=['fa:16:3e:1c:d6:3e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157128f-75e8-4afb-ab55-34580af9585f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd967f0cef958482c9711764882a146f3', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'db55d655-ec9a-40ef-9e54-3247c3ea4f75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c031f4-6b41-4ee7-af4f-a9218d9b390c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=fe9c07c3-e08f-4810-b699-6d6aa3f50cda) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:04:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:10.544 103805 INFO neutron.agent.ovn.metadata.agent [-] Port fe9c07c3-e08f-4810-b699-6d6aa3f50cda in datapath c157128f-75e8-4afb-ab55-34580af9585f unbound from our chassis
Nov 22 08:04:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:10.546 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c157128f-75e8-4afb-ab55-34580af9585f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:04:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:10.547 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[231e7f3a-cee8-4732-9104-1538b6271462]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:10.547 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f namespace which is not needed anymore
Nov 22 08:04:10 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000061.scope: Deactivated successfully.
Nov 22 08:04:10 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000061.scope: Consumed 16.363s CPU time.
Nov 22 08:04:10 compute-0 systemd-machined[152872]: Machine qemu-59-instance-00000061 terminated.
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.684 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.689 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.722 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.735 186548 DEBUG nova.compute.manager [None req-03da7761-7faf-464e-8a7d-877d2818413a 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:04:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:10.763 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:10 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[231594]: [NOTICE]   (231598) : haproxy version is 2.8.14-c23fe91
Nov 22 08:04:10 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[231594]: [NOTICE]   (231598) : path to executable is /usr/sbin/haproxy
Nov 22 08:04:10 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[231594]: [WARNING]  (231598) : Exiting Master process...
Nov 22 08:04:10 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[231594]: [WARNING]  (231598) : Exiting Master process...
Nov 22 08:04:10 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[231594]: [ALERT]    (231598) : Current worker (231600) exited with code 143 (Terminated)
Nov 22 08:04:10 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[231594]: [WARNING]  (231598) : All workers exited. Exiting... (0)
Nov 22 08:04:10 compute-0 systemd[1]: libpod-62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c.scope: Deactivated successfully.
Nov 22 08:04:10 compute-0 podman[231839]: 2025-11-22 08:04:10.817282672 +0000 UTC m=+0.169035820 container died 62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.965 186548 DEBUG nova.compute.manager [req-c3dbf25b-7f87-4f3b-9f9c-f2557daf3a24 req-68c4ce70-0ccb-4032-a797-4150648037b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.965 186548 DEBUG oslo_concurrency.lockutils [req-c3dbf25b-7f87-4f3b-9f9c-f2557daf3a24 req-68c4ce70-0ccb-4032-a797-4150648037b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.967 186548 DEBUG oslo_concurrency.lockutils [req-c3dbf25b-7f87-4f3b-9f9c-f2557daf3a24 req-68c4ce70-0ccb-4032-a797-4150648037b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.967 186548 DEBUG oslo_concurrency.lockutils [req-c3dbf25b-7f87-4f3b-9f9c-f2557daf3a24 req-68c4ce70-0ccb-4032-a797-4150648037b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.968 186548 DEBUG nova.compute.manager [req-c3dbf25b-7f87-4f3b-9f9c-f2557daf3a24 req-68c4ce70-0ccb-4032-a797-4150648037b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:04:10 compute-0 nova_compute[186544]: 2025-11-22 08:04:10.968 186548 WARNING nova.compute.manager [req-c3dbf25b-7f87-4f3b-9f9c-f2557daf3a24 req-68c4ce70-0ccb-4032-a797-4150648037b9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state suspended and task_state None.
Nov 22 08:04:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c-userdata-shm.mount: Deactivated successfully.
Nov 22 08:04:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-b006fe062981e9307ef85fc3f7dca0219de3cf5d0bda653ba2245d36c1c7558c-merged.mount: Deactivated successfully.
Nov 22 08:04:12 compute-0 podman[231839]: 2025-11-22 08:04:12.083479079 +0000 UTC m=+1.435232227 container cleanup 62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:04:12 compute-0 systemd[1]: libpod-conmon-62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c.scope: Deactivated successfully.
Nov 22 08:04:12 compute-0 nova_compute[186544]: 2025-11-22 08:04:12.168 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:12 compute-0 nova_compute[186544]: 2025-11-22 08:04:12.898 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:12 compute-0 podman[231886]: 2025-11-22 08:04:12.907246026 +0000 UTC m=+0.794045145 container remove 62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.913 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[75b6037e-6a58-4288-badc-84673ce67ec9]: (4, ('Sat Nov 22 08:04:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f (62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c)\n62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c\nSat Nov 22 08:04:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f (62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c)\n62e7aee09791ac211b894e39c6e39ff2f5c50ff253a86690ecce036cfe94b70c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.914 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[07e91faf-9d06-45dc-ae21-ece341cabc1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.915 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157128f-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:12 compute-0 nova_compute[186544]: 2025-11-22 08:04:12.917 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:12 compute-0 kernel: tapc157128f-70: left promiscuous mode
Nov 22 08:04:12 compute-0 nova_compute[186544]: 2025-11-22 08:04:12.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.937 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e134028d-7f46-46b8-a065-2462e4fdc146]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.967 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[220f6d86-7aed-4bea-aefa-031cf9e5c916]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.970 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[76a0b9e7-8739-473b-a774-bcb36b25fdac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.985 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[317bec23-0616-4446-a184-1d927cee7a20]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537031, 'reachable_time': 27412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231926, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.988 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:04:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:12.988 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc10151-3f8f-49d2-9eb7-c343479370cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:12 compute-0 systemd[1]: run-netns-ovnmeta\x2dc157128f\x2d75e8\x2d4afb\x2dab55\x2d34580af9585f.mount: Deactivated successfully.
Nov 22 08:04:13 compute-0 podman[231899]: 2025-11-22 08:04:13.029585013 +0000 UTC m=+0.671647776 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:04:13 compute-0 podman[231900]: 2025-11-22 08:04:13.028305001 +0000 UTC m=+0.667509744 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Nov 22 08:04:13 compute-0 nova_compute[186544]: 2025-11-22 08:04:13.531 186548 DEBUG nova.compute.manager [req-3e46bbab-dbc9-4b01-b0a8-f7bbdb99529f req-454fd782-32e4-4fe4-ad8d-a98c8194139c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:13 compute-0 nova_compute[186544]: 2025-11-22 08:04:13.531 186548 DEBUG oslo_concurrency.lockutils [req-3e46bbab-dbc9-4b01-b0a8-f7bbdb99529f req-454fd782-32e4-4fe4-ad8d-a98c8194139c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:13 compute-0 nova_compute[186544]: 2025-11-22 08:04:13.531 186548 DEBUG oslo_concurrency.lockutils [req-3e46bbab-dbc9-4b01-b0a8-f7bbdb99529f req-454fd782-32e4-4fe4-ad8d-a98c8194139c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:13 compute-0 nova_compute[186544]: 2025-11-22 08:04:13.532 186548 DEBUG oslo_concurrency.lockutils [req-3e46bbab-dbc9-4b01-b0a8-f7bbdb99529f req-454fd782-32e4-4fe4-ad8d-a98c8194139c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:13 compute-0 nova_compute[186544]: 2025-11-22 08:04:13.532 186548 DEBUG nova.compute.manager [req-3e46bbab-dbc9-4b01-b0a8-f7bbdb99529f req-454fd782-32e4-4fe4-ad8d-a98c8194139c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:04:13 compute-0 nova_compute[186544]: 2025-11-22 08:04:13.532 186548 WARNING nova.compute.manager [req-3e46bbab-dbc9-4b01-b0a8-f7bbdb99529f req-454fd782-32e4-4fe4-ad8d-a98c8194139c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state suspended and task_state None.
Nov 22 08:04:14 compute-0 nova_compute[186544]: 2025-11-22 08:04:14.098 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:14 compute-0 nova_compute[186544]: 2025-11-22 08:04:14.149 186548 INFO nova.compute.manager [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Resuming
Nov 22 08:04:14 compute-0 nova_compute[186544]: 2025-11-22 08:04:14.150 186548 DEBUG nova.objects.instance [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'flavor' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:04:14 compute-0 nova_compute[186544]: 2025-11-22 08:04:14.213 186548 DEBUG oslo_concurrency.lockutils [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:04:14 compute-0 nova_compute[186544]: 2025-11-22 08:04:14.213 186548 DEBUG oslo_concurrency.lockutils [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquired lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:04:14 compute-0 nova_compute[186544]: 2025-11-22 08:04:14.214 186548 DEBUG nova.network.neutron [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:04:15 compute-0 nova_compute[186544]: 2025-11-22 08:04:15.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:15 compute-0 nova_compute[186544]: 2025-11-22 08:04:15.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:04:15 compute-0 nova_compute[186544]: 2025-11-22 08:04:15.723 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:17 compute-0 nova_compute[186544]: 2025-11-22 08:04:17.871 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798642.869036, eb6b82cf-7eb5-4a69-9342-a5d3fb896e58 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:04:17 compute-0 nova_compute[186544]: 2025-11-22 08:04:17.871 186548 INFO nova.compute.manager [-] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] VM Stopped (Lifecycle Event)
Nov 22 08:04:17 compute-0 nova_compute[186544]: 2025-11-22 08:04:17.901 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:17 compute-0 nova_compute[186544]: 2025-11-22 08:04:17.903 186548 DEBUG nova.compute.manager [None req-ae69478c-07ef-4027-b551-a68e0f2dae92 - - - - - -] [instance: eb6b82cf-7eb5-4a69-9342-a5d3fb896e58] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.125 186548 DEBUG nova.network.neutron [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [{"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.155 186548 DEBUG oslo_concurrency.lockutils [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Releasing lock "refresh_cache-235ecf63-07fa-4f60-97e9-466450a50add" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.161 186548 DEBUG nova.virt.libvirt.vif [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:01:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-899820238',display_name='tempest-ServersNegativeTestJSON-server-899820238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-899820238',id=97,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-he9blv4w',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTestJSON-1872924472-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:04:10Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=235ecf63-07fa-4f60-97e9-466450a50add,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.161 186548 DEBUG nova.network.os_vif_util [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.162 186548 DEBUG nova.network.os_vif_util [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.162 186548 DEBUG os_vif [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.163 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.163 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.163 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.168 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe9c07c3-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.168 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe9c07c3-e0, col_values=(('external_ids', {'iface-id': 'fe9c07c3-e08f-4810-b699-6d6aa3f50cda', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:d6:3e', 'vm-uuid': '235ecf63-07fa-4f60-97e9-466450a50add'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.169 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.170 186548 INFO os_vif [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0')
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.304 186548 DEBUG nova.objects.instance [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:04:19 compute-0 kernel: tapfe9c07c3-e0: entered promiscuous mode
Nov 22 08:04:19 compute-0 ovn_controller[94843]: 2025-11-22T08:04:19Z|00458|binding|INFO|Claiming lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda for this chassis.
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.396 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 ovn_controller[94843]: 2025-11-22T08:04:19Z|00459|binding|INFO|fe9c07c3-e08f-4810-b699-6d6aa3f50cda: Claiming fa:16:3e:1c:d6:3e 10.100.0.8
Nov 22 08:04:19 compute-0 NetworkManager[55036]: <info>  [1763798659.3988] manager: (tapfe9c07c3-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Nov 22 08:04:19 compute-0 ovn_controller[94843]: 2025-11-22T08:04:19Z|00460|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda ovn-installed in OVS
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.412 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.414 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 systemd-udevd[231965]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:04:19 compute-0 ovn_controller[94843]: 2025-11-22T08:04:19Z|00461|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda up in Southbound
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.427 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:d6:3e 10.100.0.8'], port_security=['fa:16:3e:1c:d6:3e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157128f-75e8-4afb-ab55-34580af9585f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd967f0cef958482c9711764882a146f3', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'db55d655-ec9a-40ef-9e54-3247c3ea4f75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c031f4-6b41-4ee7-af4f-a9218d9b390c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=fe9c07c3-e08f-4810-b699-6d6aa3f50cda) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.429 103805 INFO neutron.agent.ovn.metadata.agent [-] Port fe9c07c3-e08f-4810-b699-6d6aa3f50cda in datapath c157128f-75e8-4afb-ab55-34580af9585f bound to our chassis
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.430 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:04:19 compute-0 systemd-machined[152872]: New machine qemu-60-instance-00000061.
Nov 22 08:04:19 compute-0 NetworkManager[55036]: <info>  [1763798659.4394] device (tapfe9c07c3-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:04:19 compute-0 NetworkManager[55036]: <info>  [1763798659.4403] device (tapfe9c07c3-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.440 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c579bc8e-8848-41b5-96e2-959c2d3a58c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.441 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc157128f-71 in ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.443 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc157128f-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.444 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8520ba-d094-49d1-a328-536834047b41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.444 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fa257656-3f60-417d-ad0f-b71fb65af7aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 systemd[1]: Started Virtual Machine qemu-60-instance-00000061.
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.455 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[062c14c1-e31c-4e81-b78e-bce175e4aca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.466 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[273fcf20-72da-494b-9206-6a4408305a73]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.493 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5f345fe5-a103-4924-934c-6c5966daf70e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.498 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe118e8-075a-4ed2-b17e-ed06e105c028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 systemd-udevd[231969]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:04:19 compute-0 NetworkManager[55036]: <info>  [1763798659.4991] manager: (tapc157128f-70): new Veth device (/org/freedesktop/NetworkManager/Devices/216)
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.527 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb05392-9fac-4f0e-97bf-8e452757a319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.531 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[da8ff793-3077-44ac-987d-f07002e655d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 NetworkManager[55036]: <info>  [1763798659.5528] device (tapc157128f-70): carrier: link connected
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.559 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e28d37-bd3b-44da-aaac-3fb01b330359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.575 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb8e1db-a6c1-45af-b074-f39094426dd2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157128f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:41:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540019, 'reachable_time': 44782, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231999, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.587 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[01467f9a-66c8-4c99-8241-9a1b26fdc9e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:410b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540019, 'tstamp': 540019}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232000, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.603 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c871937c-0c95-4ac8-b429-29e0fb98077a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157128f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:41:0b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 144], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540019, 'reachable_time': 44782, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232002, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.631 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[80ecee38-90bf-4461-b9e5-75ea72e76209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.679 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3cccf460-c9c8-4807-9009-dd9d4e42029d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.681 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157128f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.681 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.682 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc157128f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:19 compute-0 kernel: tapc157128f-70: entered promiscuous mode
Nov 22 08:04:19 compute-0 NetworkManager[55036]: <info>  [1763798659.6862] manager: (tapc157128f-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.685 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.687 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.690 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc157128f-70, col_values=(('external_ids', {'iface-id': 'a61c8ae7-262d-45c7-859e-6a4502225b00'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.691 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 ovn_controller[94843]: 2025-11-22T08:04:19Z|00462|binding|INFO|Releasing lport a61c8ae7-262d-45c7-859e-6a4502225b00 from this chassis (sb_readonly=0)
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.692 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.694 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.695 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7c974a1a-54eb-41cd-a639-7789992c1561]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.696 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c157128f-75e8-4afb-ab55-34580af9585f.pid.haproxy
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c157128f-75e8-4afb-ab55-34580af9585f
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:04:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:19.697 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'env', 'PROCESS_TAG=haproxy-c157128f-75e8-4afb-ab55-34580af9585f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c157128f-75e8-4afb-ab55-34580af9585f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.708 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.874 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 235ecf63-07fa-4f60-97e9-466450a50add due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.875 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798659.8744712, 235ecf63-07fa-4f60-97e9-466450a50add => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.875 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Started (Lifecycle Event)
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.911 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.932 186548 DEBUG nova.compute.manager [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.933 186548 DEBUG nova.objects.instance [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.938 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.949 186548 INFO nova.virt.libvirt.driver [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance running successfully.
Nov 22 08:04:19 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.953 186548 DEBUG nova.virt.libvirt.guest [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.953 186548 DEBUG nova.compute.manager [None req-9b8f4be3-de1e-45c8-b9a1-0dece8e23161 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.973 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.973 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798659.9181206, 235ecf63-07fa-4f60-97e9-466450a50add => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:04:19 compute-0 nova_compute[186544]: 2025-11-22 08:04:19.974 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Resumed (Lifecycle Event)
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.003 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.007 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:04:20 compute-0 podman[232037]: 2025-11-22 08:04:20.041188956 +0000 UTC m=+0.026481664 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:04:20 compute-0 podman[232037]: 2025-11-22 08:04:20.538583492 +0000 UTC m=+0.523876170 container create 8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 08:04:20 compute-0 systemd[1]: Started libpod-conmon-8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a.scope.
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.719 186548 DEBUG nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.720 186548 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.720 186548 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.720 186548 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.720 186548 DEBUG nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.721 186548 WARNING nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state active and task_state None.
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.721 186548 DEBUG nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.721 186548 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.721 186548 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.721 186548 DEBUG oslo_concurrency.lockutils [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.721 186548 DEBUG nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.722 186548 WARNING nova.compute.manager [req-064c3e07-1127-472f-ac53-c171837aa6b1 req-04923118-0a61-41b2-81ab-0b881618d5c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state active and task_state None.
Nov 22 08:04:20 compute-0 nova_compute[186544]: 2025-11-22 08:04:20.725 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:20 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:04:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/034dd310076fd19614eb76329c600d1f2a33366836076aeacf21a2afe6156677/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:04:20 compute-0 podman[232037]: 2025-11-22 08:04:20.906143407 +0000 UTC m=+0.891436105 container init 8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:04:20 compute-0 podman[232037]: 2025-11-22 08:04:20.912878383 +0000 UTC m=+0.898171061 container start 8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:04:20 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[232057]: [NOTICE]   (232061) : New worker (232063) forked
Nov 22 08:04:20 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[232057]: [NOTICE]   (232061) : Loading success.
Nov 22 08:04:21 compute-0 ovn_controller[94843]: 2025-11-22T08:04:21Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:d6:3e 10.100.0.8
Nov 22 08:04:22 compute-0 nova_compute[186544]: 2025-11-22 08:04:22.903 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:25 compute-0 nova_compute[186544]: 2025-11-22 08:04:25.728 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:26 compute-0 nova_compute[186544]: 2025-11-22 08:04:26.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:26 compute-0 nova_compute[186544]: 2025-11-22 08:04:26.178 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid 235ecf63-07fa-4f60-97e9-466450a50add _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 08:04:26 compute-0 nova_compute[186544]: 2025-11-22 08:04:26.179 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:26 compute-0 nova_compute[186544]: 2025-11-22 08:04:26.179 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "235ecf63-07fa-4f60-97e9-466450a50add" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:26 compute-0 nova_compute[186544]: 2025-11-22 08:04:26.217 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "235ecf63-07fa-4f60-97e9-466450a50add" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:27 compute-0 podman[232077]: 2025-11-22 08:04:27.417920472 +0000 UTC m=+0.060384630 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:04:27 compute-0 podman[232078]: 2025-11-22 08:04:27.461072847 +0000 UTC m=+0.095601249 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:04:27 compute-0 nova_compute[186544]: 2025-11-22 08:04:27.905 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:30 compute-0 podman[232123]: 2025-11-22 08:04:30.403115354 +0000 UTC m=+0.052400572 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:04:30 compute-0 podman[232124]: 2025-11-22 08:04:30.407617946 +0000 UTC m=+0.054067555 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:04:30 compute-0 nova_compute[186544]: 2025-11-22 08:04:30.729 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:32 compute-0 nova_compute[186544]: 2025-11-22 08:04:32.908 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:35 compute-0 nova_compute[186544]: 2025-11-22 08:04:35.732 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.597 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '235ecf63-07fa-4f60-97e9-466450a50add', 'name': 'tempest-ServersNegativeTestJSON-server-899820238', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000061', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd967f0cef958482c9711764882a146f3', 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'hostId': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.597 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.602 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45142fcc-6944-41e8-9dc1-13dd88a2f196', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.597981', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e33f3222-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': 'd8d4087c70cc5def7540210c71c36f4158371a3f128a6e12cb78099e8cacf357'}]}, 'timestamp': '2025-11-22 08:04:36.602704', '_unique_id': '91106d26ed1943bfb513fed0b5a092ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.603 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.604 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.604 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.bytes volume: 1668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aef5e1a-0ff1-4953-aef5-58b1a7810f19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1668, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.604779', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e33f903c-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': '37617519b1f06577a76321e8b20ac4f1072374edef5a986da6fc008bc183e1b7'}]}, 'timestamp': '2025-11-22 08:04:36.605080', '_unique_id': 'e0d65b8b625d4f4eadc31573bb90be7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.606 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.606 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.bytes volume: 987 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e3e1a47-9558-4579-ad10-43d6b5efe90f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 987, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.606458', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e33fd0a6-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': '947cea96ec10fbfdf14591d31f80f9d3c03a320289497e815faf61b95721187b'}]}, 'timestamp': '2025-11-22 08:04:36.606710', '_unique_id': '3668326f1d3845e98121ee329eff07a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.607 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.639 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.bytes volume: 122880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.639 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8dc00623-d794-4e3a-965f-6b318a932245', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 122880, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.608071', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e344e0c8-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': '94982ecd3e84047ef2bd11a264c65b465bb935b3d92bf77a7c09167b48080976'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.608071', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e344eeb0-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': '5f867c40686bed8dffa54dfcb96d1eb595f65122baa3d7b8772f8966be7ff9e9'}]}, 'timestamp': '2025-11-22 08:04:36.640296', '_unique_id': 'ae737196b16b4a8f8bb9f27bba8feb5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.641 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.bytes.delta volume: 48 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd672a236-7778-4934-907b-edd03dcac38c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 48, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.641951', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e3453b68-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': 'bba5a01b8e9e7e2252648d9d3cd7a5acf45bb28367310f0666ab275378ae7778'}]}, 'timestamp': '2025-11-22 08:04:36.642217', '_unique_id': 'd1c58169066140dbb9b2647f68c62423'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.643 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.643 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.643 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.658 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/memory.usage volume: 42.3203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78ea3e36-aa44-45b8-9db6-02e4508819f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.3203125, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'timestamp': '2025-11-22T08:04:36.643922', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e347cda6-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.35031939, 'message_signature': '8b4c2537287f25a122e8781fa14beab363b1789ba5ca7e74a763948ff79fc341'}]}, 'timestamp': '2025-11-22 08:04:36.659099', '_unique_id': '92f020a2f8a94eab8e4c068754344da2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.659 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.660 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.660 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae842f4d-4d1c-4d00-b471-cc3a4d73b289', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.660756', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e3481a04-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': '6089047bf3911a9b0b6edb71debd93ea29e09bee73293c73f3d2081e9028d20c'}]}, 'timestamp': '2025-11-22 08:04:36.661004', '_unique_id': '7680e01f66c44b4795444928e491013b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.662 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.662 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.requests volume: 247 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.662 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.requests volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96401fc7-841e-4116-8cb7-5c0c137e7c89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 247, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.662087', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e3484de4-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': '9a90e71c1c46ac3e601e7eb7807f0788d621735f89491b37716d1ae6a937a0c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 88, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.662087', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e3485910-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': '0c67526dc5a91d11e3ae0c31ae0eaa1d25c12de4e678d51b398db84c0e193a69'}]}, 'timestamp': '2025-11-22 08:04:36.662605', '_unique_id': '40c8ce1fd7a54d60b926d8698925e85b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.663 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.673 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.673 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a94c4660-2312-4e7e-8919-5ce89495b43e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.663748', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e349fe96-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.355527229, 'message_signature': '6ea47287fec9a5f668ed78d94808c05362cb16031601a4752bffb854dae8607e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 
'235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.663748', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e34a0828-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.355527229, 'message_signature': '8b443b9ace12977a2bf9695b4cb411ef5b3dead0b5afb27ca8692d7ee9956031'}]}, 'timestamp': '2025-11-22 08:04:36.673642', '_unique_id': 'b380f13ceb04484287d3a21a66edc582'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.675 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.latency volume: 455119474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.675 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41ec3861-d280-4392-a8f2-f395ee3778dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 455119474, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.675295', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e34a5508-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': '355f3a43006f0771d15f3ed5782670de98bafa873add8f5ee37616222dac5e57'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 
'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.675295', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e34a6098-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': '49ce19b01fa5a77ba6f0db6ac2af7a9ec5627bb7cc08250114e9f904e13284b7'}]}, 'timestamp': '2025-11-22 08:04:36.675943', '_unique_id': '3955986352ce438b9ff7615fe0c407d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.677 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.allocation volume: 30617600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.677 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69ea8151-0681-4ea2-8252-0c5bca9dfe0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30617600, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.677621', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e34aac60-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.355527229, 'message_signature': 'e54bbe217db5bc2dccae843a01c12b94c11af38c5677f123e09cf124484b0e18'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.677621', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e34ab4a8-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.355527229, 'message_signature': 'ee6df63f3622ab3f36143fabe1321ba66b70b0f2595d1bbc61cfd4dafddea996'}]}, 'timestamp': '2025-11-22 08:04:36.678052', '_unique_id': '42e6b3905f7f45aab5b2a7305cc650e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.679 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.requests volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.679 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b6c002a-2072-430d-840e-2893e51e9c62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 10, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.679252', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e34aef0e-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': '4773a3d43fbc507d2ac42acf7d09dd8f42208c39d8a0c0e569c71d782431d508'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.679252', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e34afa30-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': 'df9f9fc75d450dbfb5d812a931ab9e95d82831549340241a7e0820eb60fcb22a'}]}, 'timestamp': '2025-11-22 08:04:36.679873', '_unique_id': 'ec40f69ae00948fb82d6277a53af0ebf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.681 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.681 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/cpu volume: 1160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '439aab03-32a6-46a8-b58b-cf09f31dadd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1160000000, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'timestamp': '2025-11-22T08:04:36.681339', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e34b3f18-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.35031939, 'message_signature': 'c9658340a1e2e3fd0f4fa62c40f8ab775809d889f4d7ba0cb30f99eba1b3b5b0'}]}, 'timestamp': '2025-11-22 08:04:36.681645', '_unique_id': '75661649d292470d9bc558e531427428'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.682 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1d64d55-35cd-4771-882c-557bd582afbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.682914', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e34b7b36-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': '0c57366c881780a6e594f7f910a9c176c64376df829a5de06f56714f47b10d77'}]}, 'timestamp': '2025-11-22 08:04:36.683163', '_unique_id': '0f3014e22a854974a229f02bc8b73774'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.684 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c721daf-2bd1-4448-a9d1-5459579d9a6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.684286', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e34bb2ae-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': '608e8d469df101b4ea6760b78ceb07df7046f774f9da9f0a82a6f23d1f5e40d5'}]}, 'timestamp': '2025-11-22 08:04:36.684613', '_unique_id': 'e3b6f5d5e4ea48cc9d9f0bf250f9358c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.686 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.686 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.686 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '731f9099-6df9-4204-99e2-1112ae5e6271', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.686196', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e34bfd54-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.355527229, 'message_signature': '0d93da41e46784765b4c220ee4cd22f029aae8aa5775ea1b6f45421adbc175a0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.686196', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e34c081c-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.355527229, 'message_signature': 'afb6bf1dac806114c7921873151701b431dfc20c894229b2644801a0c9f6c163'}]}, 'timestamp': '2025-11-22 08:04:36.686784', '_unique_id': '13d07780814b4e8b9dae3d8ba416bdaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.bytes volume: 4955648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.bytes volume: 219276 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86d29724-fded-4726-83c8-ec5f5bd1077b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4955648, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.688001', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e34c419c-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': 'a76355685a1e25cf4eb89384942b247c7842b05315ebf82f58c9261f3f42a6a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 219276, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.688001', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e34c4a5c-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': 'c904eaa4759e5023739c3557bb76521bd3fd83581f51cc4ff5fdb7c7a90706be'}]}, 'timestamp': '2025-11-22 08:04:36.688437', '_unique_id': 'f140c41d0b3f4489988a3cc16f8ec136'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.689 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14e93886-6046-4231-8b6e-214899696d8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.689804', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e34c8990-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': 'a4374477023dbb18214aeda665bb142a056ccf335dca6257a4db478b9300dcd5'}]}, 'timestamp': '2025-11-22 08:04:36.690113', '_unique_id': '3595558ab78a48b38bb9dc85dae7ce53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.691 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.691 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b64b1f4d-3b2a-4857-b369-4bba9a35fbae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.691551', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e34ccdec-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': '5c1b9df330df1eb601aaa00fdd68cb4bb19ecb1adcecc08dc98171c917ad807b'}]}, 'timestamp': '2025-11-22 08:04:36.691866', '_unique_id': '39b8d7e0890143529c54a7d8c59ec35b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.692 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/network.incoming.bytes.delta volume: 987 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79b5d223-5318-4db6-8cab-e291c6f1477a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 987, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': 'instance-00000061-235ecf63-07fa-4f60-97e9-466450a50add-tapfe9c07c3-e0', 'timestamp': '2025-11-22T08:04:36.692938', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'tapfe9c07c3-e0', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:d6:3e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfe9c07c3-e0'}, 'message_id': 'e34d0276-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.289768107, 'message_signature': 'db76023c18d337a142cb77bf6772c0439479ee3d092056bf4b27366b2d3f6e34'}]}, 'timestamp': '2025-11-22 08:04:36.693164', '_unique_id': '261ef5f190ea4856bc68bdadc15889a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.694 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.latency volume: 531836504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.694 12 DEBUG ceilometer.compute.pollsters [-] 235ecf63-07fa-4f60-97e9-466450a50add/disk.device.read.latency volume: 129578499 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ceb1260-6407-4e5c-8a5d-8d060f7beb46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 531836504, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-vda', 'timestamp': '2025-11-22T08:04:36.694508', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e34d4132-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': 'a5aa5d8b9287d74972d55630faa6c5ee0376b34e5b9e2add5d4dea84396bd64f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 129578499, 'user_id': '989540cd5ede4a5184a08b8eb3de013d', 'user_name': None, 'project_id': 'd967f0cef958482c9711764882a146f3', 'project_name': None, 'resource_id': '235ecf63-07fa-4f60-97e9-466450a50add-sda', 'timestamp': '2025-11-22T08:04:36.694508', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-899820238', 'name': 'instance-00000061', 'instance_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'instance_type': 'm1.nano', 'host': '0dddb932d08bd406e5ba67b183c68dd251b2f7cf64cee01ea4c355f1', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '165c19d9-e5a7-4c8d-9823-47ddc3418023'}, 'image_ref': '165c19d9-e5a7-4c8d-9823-47ddc3418023', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e34d4c04-c779-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5417.299871556, 'message_signature': 'be35c5a4e11086c8dc173fc2849fd1229d8d1d99dcd9877571bb93a998765021'}]}, 'timestamp': '2025-11-22 08:04:36.695073', '_unique_id': 'e306015bec7a4c86bafc421ab194fc53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.696 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:04:36.696 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:04:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:37.332 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:37.333 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:37.333 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:37 compute-0 nova_compute[186544]: 2025-11-22 08:04:37.910 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:38 compute-0 podman[232166]: 2025-11-22 08:04:38.416034871 +0000 UTC m=+0.058978915 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:04:40 compute-0 ovn_controller[94843]: 2025-11-22T08:04:40Z|00463|binding|INFO|Releasing lport a61c8ae7-262d-45c7-859e-6a4502225b00 from this chassis (sb_readonly=0)
Nov 22 08:04:40 compute-0 nova_compute[186544]: 2025-11-22 08:04:40.734 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:40 compute-0 nova_compute[186544]: 2025-11-22 08:04:40.741 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:40 compute-0 ovn_controller[94843]: 2025-11-22T08:04:40Z|00464|binding|INFO|Releasing lport a61c8ae7-262d-45c7-859e-6a4502225b00 from this chassis (sb_readonly=0)
Nov 22 08:04:40 compute-0 nova_compute[186544]: 2025-11-22 08:04:40.894 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:42 compute-0 nova_compute[186544]: 2025-11-22 08:04:42.913 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:43 compute-0 podman[232188]: 2025-11-22 08:04:43.410242329 +0000 UTC m=+0.057991621 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7)
Nov 22 08:04:43 compute-0 podman[232187]: 2025-11-22 08:04:43.43053275 +0000 UTC m=+0.081060260 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:04:45 compute-0 nova_compute[186544]: 2025-11-22 08:04:45.736 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:47 compute-0 nova_compute[186544]: 2025-11-22 08:04:47.915 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:49 compute-0 nova_compute[186544]: 2025-11-22 08:04:49.178 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.212 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.212 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.212 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.212 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.310 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.367 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.369 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.433 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.584 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.586 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5553MB free_disk=73.18372344970703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.586 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.586 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.675 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 235ecf63-07fa-4f60-97e9-466450a50add actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.675 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.676 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.737 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.834 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.834 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.834 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.835 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.835 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.844 186548 INFO nova.compute.manager [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Terminating instance
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.851 186548 DEBUG nova.compute.manager [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.950 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.964 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.988 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:04:50 compute-0 nova_compute[186544]: 2025-11-22 08:04:50.989 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.402s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:51 compute-0 kernel: tapfe9c07c3-e0 (unregistering): left promiscuous mode
Nov 22 08:04:51 compute-0 NetworkManager[55036]: <info>  [1763798691.1556] device (tapfe9c07c3-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:04:51 compute-0 ovn_controller[94843]: 2025-11-22T08:04:51Z|00465|binding|INFO|Releasing lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda from this chassis (sb_readonly=0)
Nov 22 08:04:51 compute-0 ovn_controller[94843]: 2025-11-22T08:04:51Z|00466|binding|INFO|Setting lport fe9c07c3-e08f-4810-b699-6d6aa3f50cda down in Southbound
Nov 22 08:04:51 compute-0 ovn_controller[94843]: 2025-11-22T08:04:51Z|00467|binding|INFO|Removing iface tapfe9c07c3-e0 ovn-installed in OVS
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.159 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:51.173 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:d6:3e 10.100.0.8'], port_security=['fa:16:3e:1c:d6:3e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '235ecf63-07fa-4f60-97e9-466450a50add', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157128f-75e8-4afb-ab55-34580af9585f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd967f0cef958482c9711764882a146f3', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'db55d655-ec9a-40ef-9e54-3247c3ea4f75', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50c031f4-6b41-4ee7-af4f-a9218d9b390c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=fe9c07c3-e08f-4810-b699-6d6aa3f50cda) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:04:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:51.175 103805 INFO neutron.agent.ovn.metadata.agent [-] Port fe9c07c3-e08f-4810-b699-6d6aa3f50cda in datapath c157128f-75e8-4afb-ab55-34580af9585f unbound from our chassis
Nov 22 08:04:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:51.177 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c157128f-75e8-4afb-ab55-34580af9585f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.177 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:51.178 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a7ad08-ee74-4666-970a-1cab3f424067]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:51.180 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f namespace which is not needed anymore
Nov 22 08:04:51 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000061.scope: Deactivated successfully.
Nov 22 08:04:51 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000061.scope: Consumed 2.867s CPU time.
Nov 22 08:04:51 compute-0 systemd-machined[152872]: Machine qemu-60-instance-00000061 terminated.
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.303 186548 INFO nova.virt.libvirt.driver [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Instance destroyed successfully.
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.303 186548 DEBUG nova.objects.instance [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lazy-loading 'resources' on Instance uuid 235ecf63-07fa-4f60-97e9-466450a50add obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.371 186548 DEBUG nova.virt.libvirt.vif [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:01:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-899820238',display_name='tempest-ServersNegativeTestJSON-server-899820238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-899820238',id=97,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:03:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d967f0cef958482c9711764882a146f3',ramdisk_id='',reservation_id='r-he9blv4w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1872924472',owner_user_name='tempest-ServersNegativeTestJSON-1872924472-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:04:20Z,user_data=None,user_id='989540cd5ede4a5184a08b8eb3de013d',uuid=235ecf63-07fa-4f60-97e9-466450a50add,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.371 186548 DEBUG nova.network.os_vif_util [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converting VIF {"id": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "address": "fa:16:3e:1c:d6:3e", "network": {"id": "c157128f-75e8-4afb-ab55-34580af9585f", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1557637836-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d967f0cef958482c9711764882a146f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe9c07c3-e0", "ovs_interfaceid": "fe9c07c3-e08f-4810-b699-6d6aa3f50cda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.372 186548 DEBUG nova.network.os_vif_util [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.372 186548 DEBUG os_vif [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.374 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.374 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe9c07c3-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.375 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.377 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.379 186548 INFO os_vif [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:d6:3e,bridge_name='br-int',has_traffic_filtering=True,id=fe9c07c3-e08f-4810-b699-6d6aa3f50cda,network=Network(c157128f-75e8-4afb-ab55-34580af9585f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe9c07c3-e0')
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.380 186548 INFO nova.virt.libvirt.driver [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Deleting instance files /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add_del
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.387 186548 INFO nova.virt.libvirt.driver [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Deletion of /var/lib/nova/instances/235ecf63-07fa-4f60-97e9-466450a50add_del complete
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.615 186548 INFO nova.compute.manager [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Took 0.76 seconds to destroy the instance on the hypervisor.
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.617 186548 DEBUG oslo.service.loopingcall [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.617 186548 DEBUG nova.compute.manager [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:04:51 compute-0 nova_compute[186544]: 2025-11-22 08:04:51.617 186548 DEBUG nova.network.neutron [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:04:51 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[232057]: [NOTICE]   (232061) : haproxy version is 2.8.14-c23fe91
Nov 22 08:04:51 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[232057]: [NOTICE]   (232061) : path to executable is /usr/sbin/haproxy
Nov 22 08:04:51 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[232057]: [WARNING]  (232061) : Exiting Master process...
Nov 22 08:04:51 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[232057]: [WARNING]  (232061) : Exiting Master process...
Nov 22 08:04:51 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[232057]: [ALERT]    (232061) : Current worker (232063) exited with code 143 (Terminated)
Nov 22 08:04:51 compute-0 neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f[232057]: [WARNING]  (232061) : All workers exited. Exiting... (0)
Nov 22 08:04:51 compute-0 systemd[1]: libpod-8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a.scope: Deactivated successfully.
Nov 22 08:04:51 compute-0 podman[232263]: 2025-11-22 08:04:51.698034494 +0000 UTC m=+0.438171445 container died 8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:04:52 compute-0 nova_compute[186544]: 2025-11-22 08:04:52.079 186548 DEBUG nova.compute.manager [req-d39340e9-0db0-4f60-835e-2ffe8dd70a54 req-aa794e4e-ab0f-425d-a758-21ee0696ebd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:52 compute-0 nova_compute[186544]: 2025-11-22 08:04:52.079 186548 DEBUG oslo_concurrency.lockutils [req-d39340e9-0db0-4f60-835e-2ffe8dd70a54 req-aa794e4e-ab0f-425d-a758-21ee0696ebd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:52 compute-0 nova_compute[186544]: 2025-11-22 08:04:52.080 186548 DEBUG oslo_concurrency.lockutils [req-d39340e9-0db0-4f60-835e-2ffe8dd70a54 req-aa794e4e-ab0f-425d-a758-21ee0696ebd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:52 compute-0 nova_compute[186544]: 2025-11-22 08:04:52.080 186548 DEBUG oslo_concurrency.lockutils [req-d39340e9-0db0-4f60-835e-2ffe8dd70a54 req-aa794e4e-ab0f-425d-a758-21ee0696ebd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:52 compute-0 nova_compute[186544]: 2025-11-22 08:04:52.080 186548 DEBUG nova.compute.manager [req-d39340e9-0db0-4f60-835e-2ffe8dd70a54 req-aa794e4e-ab0f-425d-a758-21ee0696ebd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:04:52 compute-0 nova_compute[186544]: 2025-11-22 08:04:52.080 186548 DEBUG nova.compute.manager [req-d39340e9-0db0-4f60-835e-2ffe8dd70a54 req-aa794e4e-ab0f-425d-a758-21ee0696ebd1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-unplugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:04:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a-userdata-shm.mount: Deactivated successfully.
Nov 22 08:04:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-034dd310076fd19614eb76329c600d1f2a33366836076aeacf21a2afe6156677-merged.mount: Deactivated successfully.
Nov 22 08:04:52 compute-0 podman[232263]: 2025-11-22 08:04:52.609910974 +0000 UTC m=+1.350047955 container cleanup 8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:04:52 compute-0 systemd[1]: libpod-conmon-8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a.scope: Deactivated successfully.
Nov 22 08:04:52 compute-0 nova_compute[186544]: 2025-11-22 08:04:52.990 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:52 compute-0 nova_compute[186544]: 2025-11-22 08:04:52.991 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:04:53 compute-0 nova_compute[186544]: 2025-11-22 08:04:53.034 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9907
Nov 22 08:04:53 compute-0 nova_compute[186544]: 2025-11-22 08:04:53.035 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:04:53 compute-0 nova_compute[186544]: 2025-11-22 08:04:53.036 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:53 compute-0 nova_compute[186544]: 2025-11-22 08:04:53.036 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:04:53 compute-0 podman[232309]: 2025-11-22 08:04:53.304818921 +0000 UTC m=+0.667634946 container remove 8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.311 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cd925f4f-324d-462f-b0fe-d46b12cf1f19]: (4, ('Sat Nov 22 08:04:51 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f (8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a)\n8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a\nSat Nov 22 08:04:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f (8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a)\n8b2838395485a98f4d0cee5100a8a240f1f31900e34311d1b3996ec03c8e5e0a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.314 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a129fd06-0df1-4bd3-83ac-5d54b03a1a8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.315 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157128f-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:04:53 compute-0 nova_compute[186544]: 2025-11-22 08:04:53.316 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:53 compute-0 kernel: tapc157128f-70: left promiscuous mode
Nov 22 08:04:53 compute-0 nova_compute[186544]: 2025-11-22 08:04:53.330 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.332 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1d141995-ff2d-41df-9911-34eee267dc70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.350 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc058d9-f242-4f26-9350-c95e52ed309e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.352 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[700877dc-9658-4891-8c1f-a80b24cfa4b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.369 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6a0f02-da1e-440b-b421-7b49492d58cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540012, 'reachable_time': 24845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232326, 'error': None, 'target': 'ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:53 compute-0 systemd[1]: run-netns-ovnmeta\x2dc157128f\x2d75e8\x2d4afb\x2dab55\x2d34580af9585f.mount: Deactivated successfully.
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.374 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c157128f-75e8-4afb-ab55-34580af9585f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:04:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:04:53.375 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a8515c-2d4b-40f9-bfbc-f9c37623daba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.394 186548 DEBUG nova.compute.manager [req-441307f6-93d2-4fc8-9d48-aa4d0e124756 req-aa5ad657-1fd2-4fe9-89d2-805e7a2ddc0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.394 186548 DEBUG oslo_concurrency.lockutils [req-441307f6-93d2-4fc8-9d48-aa4d0e124756 req-aa5ad657-1fd2-4fe9-89d2-805e7a2ddc0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "235ecf63-07fa-4f60-97e9-466450a50add-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.395 186548 DEBUG oslo_concurrency.lockutils [req-441307f6-93d2-4fc8-9d48-aa4d0e124756 req-aa5ad657-1fd2-4fe9-89d2-805e7a2ddc0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.395 186548 DEBUG oslo_concurrency.lockutils [req-441307f6-93d2-4fc8-9d48-aa4d0e124756 req-aa5ad657-1fd2-4fe9-89d2-805e7a2ddc0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.395 186548 DEBUG nova.compute.manager [req-441307f6-93d2-4fc8-9d48-aa4d0e124756 req-aa5ad657-1fd2-4fe9-89d2-805e7a2ddc0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] No waiting events found dispatching network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.395 186548 WARNING nova.compute.manager [req-441307f6-93d2-4fc8-9d48-aa4d0e124756 req-aa5ad657-1fd2-4fe9-89d2-805e7a2ddc0f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received unexpected event network-vif-plugged-fe9c07c3-e08f-4810-b699-6d6aa3f50cda for instance with vm_state active and task_state deleting.
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.841 186548 DEBUG nova.network.neutron [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.903 186548 INFO nova.compute.manager [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Took 3.29 seconds to deallocate network for instance.
Nov 22 08:04:54 compute-0 nova_compute[186544]: 2025-11-22 08:04:54.970 186548 DEBUG nova.compute.manager [req-5fe15f45-7843-4875-a279-99fc8678b0a3 req-520be8f7-a5a7-4f05-9707-ce57723226a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Received event network-vif-deleted-fe9c07c3-e08f-4810-b699-6d6aa3f50cda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:04:55 compute-0 nova_compute[186544]: 2025-11-22 08:04:55.062 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:04:55 compute-0 nova_compute[186544]: 2025-11-22 08:04:55.062 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:04:55 compute-0 nova_compute[186544]: 2025-11-22 08:04:55.137 186548 DEBUG nova.compute.provider_tree [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:04:55 compute-0 nova_compute[186544]: 2025-11-22 08:04:55.149 186548 DEBUG nova.scheduler.client.report [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:04:55 compute-0 nova_compute[186544]: 2025-11-22 08:04:55.178 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:55 compute-0 nova_compute[186544]: 2025-11-22 08:04:55.231 186548 INFO nova.scheduler.client.report [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Deleted allocations for instance 235ecf63-07fa-4f60-97e9-466450a50add
Nov 22 08:04:55 compute-0 nova_compute[186544]: 2025-11-22 08:04:55.399 186548 DEBUG oslo_concurrency.lockutils [None req-2077f312-48b8-4da0-b197-8f6ef4192d54 989540cd5ede4a5184a08b8eb3de013d d967f0cef958482c9711764882a146f3 - - default default] Lock "235ecf63-07fa-4f60-97e9-466450a50add" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:04:55 compute-0 nova_compute[186544]: 2025-11-22 08:04:55.740 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:56 compute-0 nova_compute[186544]: 2025-11-22 08:04:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:56 compute-0 nova_compute[186544]: 2025-11-22 08:04:56.377 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:04:57 compute-0 nova_compute[186544]: 2025-11-22 08:04:57.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:04:58 compute-0 podman[232328]: 2025-11-22 08:04:58.416480818 +0000 UTC m=+0.058177466 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 08:04:58 compute-0 podman[232329]: 2025-11-22 08:04:58.448860386 +0000 UTC m=+0.083297944 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:05:00 compute-0 nova_compute[186544]: 2025-11-22 08:05:00.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:00 compute-0 nova_compute[186544]: 2025-11-22 08:05:00.741 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:01 compute-0 nova_compute[186544]: 2025-11-22 08:05:01.379 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:01 compute-0 podman[232375]: 2025-11-22 08:05:01.4114146 +0000 UTC m=+0.052398814 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:05:01 compute-0 podman[232374]: 2025-11-22 08:05:01.419372447 +0000 UTC m=+0.059524380 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:05:03 compute-0 nova_compute[186544]: 2025-11-22 08:05:03.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:05.343 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:05:05 compute-0 nova_compute[186544]: 2025-11-22 08:05:05.344 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:05.344 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:05:05 compute-0 nova_compute[186544]: 2025-11-22 08:05:05.743 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:06 compute-0 nova_compute[186544]: 2025-11-22 08:05:06.301 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798691.300085, 235ecf63-07fa-4f60-97e9-466450a50add => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:05:06 compute-0 nova_compute[186544]: 2025-11-22 08:05:06.302 186548 INFO nova.compute.manager [-] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] VM Stopped (Lifecycle Event)
Nov 22 08:05:06 compute-0 nova_compute[186544]: 2025-11-22 08:05:06.325 186548 DEBUG nova.compute.manager [None req-417176b5-9292-4876-b69a-08318900ad3e - - - - - -] [instance: 235ecf63-07fa-4f60-97e9-466450a50add] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:05:06 compute-0 nova_compute[186544]: 2025-11-22 08:05:06.381 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:09 compute-0 podman[232417]: 2025-11-22 08:05:09.411727196 +0000 UTC m=+0.060503153 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:05:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:10.347 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:10 compute-0 nova_compute[186544]: 2025-11-22 08:05:10.745 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:11 compute-0 nova_compute[186544]: 2025-11-22 08:05:11.384 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:14 compute-0 podman[232437]: 2025-11-22 08:05:14.407175256 +0000 UTC m=+0.052043356 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:05:14 compute-0 podman[232438]: 2025-11-22 08:05:14.452059432 +0000 UTC m=+0.089304233 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350)
Nov 22 08:05:15 compute-0 nova_compute[186544]: 2025-11-22 08:05:15.746 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:16 compute-0 nova_compute[186544]: 2025-11-22 08:05:16.385 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:16 compute-0 nova_compute[186544]: 2025-11-22 08:05:16.466 186548 DEBUG nova.compute.manager [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 22 08:05:16 compute-0 nova_compute[186544]: 2025-11-22 08:05:16.964 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:16 compute-0 nova_compute[186544]: 2025-11-22 08:05:16.965 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.048 186548 DEBUG nova.objects.instance [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_requests' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.068 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.068 186548 INFO nova.compute.claims [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.069 186548 DEBUG nova.objects.instance [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.115 186548 DEBUG nova.objects.instance [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.190 186548 INFO nova.compute.resource_tracker [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating resource usage from migration 0696147e-c4a0-43ec-ad4c-96615e87f0b0
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.190 186548 DEBUG nova.compute.resource_tracker [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Starting to track incoming migration 0696147e-c4a0-43ec-ad4c-96615e87f0b0 with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.314 186548 DEBUG nova.compute.provider_tree [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.342 186548 DEBUG nova.scheduler.client.report [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.389 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:17 compute-0 nova_compute[186544]: 2025-11-22 08:05:17.390 186548 INFO nova.compute.manager [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Migrating
Nov 22 08:05:20 compute-0 nova_compute[186544]: 2025-11-22 08:05:20.748 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:21 compute-0 nova_compute[186544]: 2025-11-22 08:05:21.387 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:22 compute-0 sshd-session[232481]: Accepted publickey for nova from 192.168.122.102 port 58286 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:05:22 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 08:05:22 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 08:05:22 compute-0 systemd-logind[821]: New session 57 of user nova.
Nov 22 08:05:22 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 08:05:22 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 08:05:22 compute-0 systemd[232485]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:05:22 compute-0 systemd[232485]: Queued start job for default target Main User Target.
Nov 22 08:05:22 compute-0 systemd[232485]: Created slice User Application Slice.
Nov 22 08:05:22 compute-0 systemd[232485]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 08:05:22 compute-0 systemd[232485]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 08:05:22 compute-0 systemd[232485]: Reached target Paths.
Nov 22 08:05:22 compute-0 systemd[232485]: Reached target Timers.
Nov 22 08:05:22 compute-0 systemd[232485]: Starting D-Bus User Message Bus Socket...
Nov 22 08:05:22 compute-0 systemd[232485]: Starting Create User's Volatile Files and Directories...
Nov 22 08:05:22 compute-0 systemd[232485]: Listening on D-Bus User Message Bus Socket.
Nov 22 08:05:22 compute-0 systemd[232485]: Reached target Sockets.
Nov 22 08:05:22 compute-0 systemd[232485]: Finished Create User's Volatile Files and Directories.
Nov 22 08:05:22 compute-0 systemd[232485]: Reached target Basic System.
Nov 22 08:05:22 compute-0 systemd[232485]: Reached target Main User Target.
Nov 22 08:05:22 compute-0 systemd[232485]: Startup finished in 126ms.
Nov 22 08:05:22 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 08:05:22 compute-0 systemd[1]: Started Session 57 of User nova.
Nov 22 08:05:22 compute-0 sshd-session[232481]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:05:22 compute-0 sshd-session[232500]: Received disconnect from 192.168.122.102 port 58286:11: disconnected by user
Nov 22 08:05:22 compute-0 sshd-session[232500]: Disconnected from user nova 192.168.122.102 port 58286
Nov 22 08:05:22 compute-0 sshd-session[232481]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:05:22 compute-0 systemd[1]: session-57.scope: Deactivated successfully.
Nov 22 08:05:22 compute-0 systemd-logind[821]: Session 57 logged out. Waiting for processes to exit.
Nov 22 08:05:22 compute-0 systemd-logind[821]: Removed session 57.
Nov 22 08:05:23 compute-0 sshd-session[232502]: Accepted publickey for nova from 192.168.122.102 port 58302 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:05:23 compute-0 systemd-logind[821]: New session 59 of user nova.
Nov 22 08:05:23 compute-0 systemd[1]: Started Session 59 of User nova.
Nov 22 08:05:23 compute-0 sshd-session[232502]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:05:23 compute-0 sshd-session[232505]: Received disconnect from 192.168.122.102 port 58302:11: disconnected by user
Nov 22 08:05:23 compute-0 sshd-session[232505]: Disconnected from user nova 192.168.122.102 port 58302
Nov 22 08:05:23 compute-0 sshd-session[232502]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:05:23 compute-0 systemd[1]: session-59.scope: Deactivated successfully.
Nov 22 08:05:23 compute-0 systemd-logind[821]: Session 59 logged out. Waiting for processes to exit.
Nov 22 08:05:23 compute-0 systemd-logind[821]: Removed session 59.
Nov 22 08:05:25 compute-0 nova_compute[186544]: 2025-11-22 08:05:25.752 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:26 compute-0 nova_compute[186544]: 2025-11-22 08:05:26.390 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:26 compute-0 sshd-session[232508]: Accepted publickey for nova from 192.168.122.102 port 58306 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:05:26 compute-0 systemd-logind[821]: New session 60 of user nova.
Nov 22 08:05:26 compute-0 systemd[1]: Started Session 60 of User nova.
Nov 22 08:05:26 compute-0 sshd-session[232508]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:05:27 compute-0 sshd-session[232511]: Received disconnect from 192.168.122.102 port 58306:11: disconnected by user
Nov 22 08:05:27 compute-0 sshd-session[232511]: Disconnected from user nova 192.168.122.102 port 58306
Nov 22 08:05:27 compute-0 sshd-session[232508]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:05:27 compute-0 systemd[1]: session-60.scope: Deactivated successfully.
Nov 22 08:05:27 compute-0 systemd-logind[821]: Session 60 logged out. Waiting for processes to exit.
Nov 22 08:05:27 compute-0 systemd-logind[821]: Removed session 60.
Nov 22 08:05:27 compute-0 sshd-session[232513]: Accepted publickey for nova from 192.168.122.102 port 58312 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:05:27 compute-0 systemd-logind[821]: New session 61 of user nova.
Nov 22 08:05:27 compute-0 nova_compute[186544]: 2025-11-22 08:05:27.179 186548 DEBUG nova.compute.manager [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:27 compute-0 nova_compute[186544]: 2025-11-22 08:05:27.180 186548 DEBUG oslo_concurrency.lockutils [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:27 compute-0 nova_compute[186544]: 2025-11-22 08:05:27.180 186548 DEBUG oslo_concurrency.lockutils [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:27 compute-0 nova_compute[186544]: 2025-11-22 08:05:27.181 186548 DEBUG oslo_concurrency.lockutils [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:27 compute-0 nova_compute[186544]: 2025-11-22 08:05:27.181 186548 DEBUG nova.compute.manager [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:27 compute-0 nova_compute[186544]: 2025-11-22 08:05:27.181 186548 WARNING nova.compute.manager [req-3e94bf2b-a586-4bc7-a520-729ad2fda225 req-5aff679a-6ffa-4dc2-807a-2014de554d89 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrating.
Nov 22 08:05:27 compute-0 systemd[1]: Started Session 61 of User nova.
Nov 22 08:05:27 compute-0 sshd-session[232513]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:05:27 compute-0 sshd-session[232516]: Received disconnect from 192.168.122.102 port 58312:11: disconnected by user
Nov 22 08:05:27 compute-0 sshd-session[232516]: Disconnected from user nova 192.168.122.102 port 58312
Nov 22 08:05:27 compute-0 sshd-session[232513]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:05:27 compute-0 systemd[1]: session-61.scope: Deactivated successfully.
Nov 22 08:05:27 compute-0 systemd-logind[821]: Session 61 logged out. Waiting for processes to exit.
Nov 22 08:05:27 compute-0 systemd-logind[821]: Removed session 61.
Nov 22 08:05:27 compute-0 sshd-session[232518]: Accepted publickey for nova from 192.168.122.102 port 58328 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:05:27 compute-0 systemd-logind[821]: New session 62 of user nova.
Nov 22 08:05:27 compute-0 systemd[1]: Started Session 62 of User nova.
Nov 22 08:05:27 compute-0 sshd-session[232518]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:05:27 compute-0 sshd-session[232521]: Received disconnect from 192.168.122.102 port 58328:11: disconnected by user
Nov 22 08:05:27 compute-0 sshd-session[232521]: Disconnected from user nova 192.168.122.102 port 58328
Nov 22 08:05:27 compute-0 sshd-session[232518]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:05:27 compute-0 systemd[1]: session-62.scope: Deactivated successfully.
Nov 22 08:05:27 compute-0 systemd-logind[821]: Session 62 logged out. Waiting for processes to exit.
Nov 22 08:05:27 compute-0 systemd-logind[821]: Removed session 62.
Nov 22 08:05:29 compute-0 podman[232523]: 2025-11-22 08:05:29.456497037 +0000 UTC m=+0.090261157 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 08:05:29 compute-0 podman[232524]: 2025-11-22 08:05:29.461243943 +0000 UTC m=+0.093417514 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 08:05:29 compute-0 nova_compute[186544]: 2025-11-22 08:05:29.711 186548 DEBUG nova.compute.manager [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:29 compute-0 nova_compute[186544]: 2025-11-22 08:05:29.712 186548 DEBUG oslo_concurrency.lockutils [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:29 compute-0 nova_compute[186544]: 2025-11-22 08:05:29.712 186548 DEBUG oslo_concurrency.lockutils [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:29 compute-0 nova_compute[186544]: 2025-11-22 08:05:29.712 186548 DEBUG oslo_concurrency.lockutils [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:29 compute-0 nova_compute[186544]: 2025-11-22 08:05:29.712 186548 DEBUG nova.compute.manager [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:29 compute-0 nova_compute[186544]: 2025-11-22 08:05:29.713 186548 WARNING nova.compute.manager [req-5a6e1381-2db3-4b58-b106-be83422456a9 req-b05ed82a-2edc-4562-b039-af6af6e65ffc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.
Nov 22 08:05:30 compute-0 nova_compute[186544]: 2025-11-22 08:05:30.038 186548 INFO nova.network.neutron [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 22 08:05:30 compute-0 nova_compute[186544]: 2025-11-22 08:05:30.755 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:31 compute-0 nova_compute[186544]: 2025-11-22 08:05:31.393 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.357 186548 DEBUG nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.358 186548 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.358 186548 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.358 186548 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.359 186548 DEBUG nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.359 186548 WARNING nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.359 186548 DEBUG nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.359 186548 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.359 186548 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.359 186548 DEBUG oslo_concurrency.lockutils [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.360 186548 DEBUG nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:32 compute-0 nova_compute[186544]: 2025-11-22 08:05:32.360 186548 WARNING nova.compute.manager [req-12578866-85d7-4bbd-8539-72e82935dcd7 req-969a539b-cd61-432e-846a-09b8cd14d9f2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.
Nov 22 08:05:32 compute-0 podman[232569]: 2025-11-22 08:05:32.430473451 +0000 UTC m=+0.081127303 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:05:32 compute-0 podman[232570]: 2025-11-22 08:05:32.430539483 +0000 UTC m=+0.077301188 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.638 186548 DEBUG nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.638 186548 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.638 186548 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.638 186548 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.639 186548 DEBUG nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.641 186548 WARNING nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.641 186548 DEBUG nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.641 186548 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.642 186548 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.642 186548 DEBUG oslo_concurrency.lockutils [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.642 186548 DEBUG nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:34 compute-0 nova_compute[186544]: 2025-11-22 08:05:34.642 186548 WARNING nova.compute.manager [req-741bd169-19ce-43dd-a333-615d55fae10c req-c5dade0d-3e0a-4853-8f5e-96ff1ae1fc4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_migrated.
Nov 22 08:05:35 compute-0 nova_compute[186544]: 2025-11-22 08:05:35.757 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:35 compute-0 nova_compute[186544]: 2025-11-22 08:05:35.867 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:05:35 compute-0 nova_compute[186544]: 2025-11-22 08:05:35.867 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:05:35 compute-0 nova_compute[186544]: 2025-11-22 08:05:35.867 186548 DEBUG nova.network.neutron [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:05:36 compute-0 nova_compute[186544]: 2025-11-22 08:05:36.395 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:36 compute-0 nova_compute[186544]: 2025-11-22 08:05:36.807 186548 DEBUG nova.compute.manager [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:36 compute-0 nova_compute[186544]: 2025-11-22 08:05:36.807 186548 DEBUG nova.compute.manager [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing instance network info cache due to event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:05:36 compute-0 nova_compute[186544]: 2025-11-22 08:05:36.808 186548 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:05:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:37.334 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:37.335 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:37.335 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:37 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 08:05:37 compute-0 systemd[232485]: Activating special unit Exit the Session...
Nov 22 08:05:37 compute-0 systemd[232485]: Stopped target Main User Target.
Nov 22 08:05:37 compute-0 systemd[232485]: Stopped target Basic System.
Nov 22 08:05:37 compute-0 systemd[232485]: Stopped target Paths.
Nov 22 08:05:37 compute-0 systemd[232485]: Stopped target Sockets.
Nov 22 08:05:37 compute-0 systemd[232485]: Stopped target Timers.
Nov 22 08:05:37 compute-0 systemd[232485]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 08:05:37 compute-0 systemd[232485]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 08:05:37 compute-0 systemd[232485]: Closed D-Bus User Message Bus Socket.
Nov 22 08:05:37 compute-0 systemd[232485]: Stopped Create User's Volatile Files and Directories.
Nov 22 08:05:37 compute-0 systemd[232485]: Removed slice User Application Slice.
Nov 22 08:05:37 compute-0 systemd[232485]: Reached target Shutdown.
Nov 22 08:05:37 compute-0 systemd[232485]: Finished Exit the Session.
Nov 22 08:05:37 compute-0 systemd[232485]: Reached target Exit the Session.
Nov 22 08:05:37 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 08:05:37 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 08:05:37 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 08:05:37 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 08:05:37 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 08:05:37 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 08:05:37 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 08:05:40 compute-0 podman[232612]: 2025-11-22 08:05:40.409571435 +0000 UTC m=+0.061146790 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:05:40 compute-0 nova_compute[186544]: 2025-11-22 08:05:40.760 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:41 compute-0 nova_compute[186544]: 2025-11-22 08:05:41.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:41 compute-0 nova_compute[186544]: 2025-11-22 08:05:41.933 186548 DEBUG nova.network.neutron [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:05:41 compute-0 nova_compute[186544]: 2025-11-22 08:05:41.973 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:05:41 compute-0 nova_compute[186544]: 2025-11-22 08:05:41.977 186548 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:05:41 compute-0 nova_compute[186544]: 2025-11-22 08:05:41.978 186548 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.309 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.311 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.311 186548 INFO nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Creating image(s)
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.312 186548 DEBUG nova.objects.instance [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.326 186548 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.385 186548 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.387 186548 DEBUG nova.virt.disk.api [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.387 186548 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.454 186548 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.455 186548 DEBUG nova.virt.disk.api [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.616 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.616 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Ensure instance console log exists: /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.617 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.617 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.618 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.620 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Start _get_guest_xml network_info=[{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:ba:d5:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.626 186548 WARNING nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.635 186548 DEBUG nova.virt.libvirt.host [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.635 186548 DEBUG nova.virt.libvirt.host [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.640 186548 DEBUG nova.virt.libvirt.host [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.640 186548 DEBUG nova.virt.libvirt.host [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.642 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.642 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.642 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.643 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.643 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.643 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.643 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.644 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.644 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.644 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.644 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.644 186548 DEBUG nova.virt.hardware [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.645 186548 DEBUG nova.objects.instance [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.667 186548 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.730 186548 DEBUG oslo_concurrency.processutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.732 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.732 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.733 186548 DEBUG oslo_concurrency.lockutils [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.735 186548 DEBUG nova.virt.libvirt.vif [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:04:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:05:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:ba:d5:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.735 186548 DEBUG nova.network.os_vif_util [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:ba:d5:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.736 186548 DEBUG nova.network.os_vif_util [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.738 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <uuid>b9ee5ebd-90a8-426a-b369-d38bf61616f2</uuid>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <name>instance-00000067</name>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <memory>196608</memory>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestJSON-server-316749730</nova:name>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:05:42</nova:creationTime>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <nova:flavor name="m1.micro">
Nov 22 08:05:42 compute-0 nova_compute[186544]:         <nova:memory>192</nova:memory>
Nov 22 08:05:42 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:05:42 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:05:42 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:05:42 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:05:42 compute-0 nova_compute[186544]:         <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 08:05:42 compute-0 nova_compute[186544]:         <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:05:42 compute-0 nova_compute[186544]:         <nova:port uuid="348c8bec-11f0-4b6d-9dce-ae3c3f37efbc">
Nov 22 08:05:42 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <system>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <entry name="serial">b9ee5ebd-90a8-426a-b369-d38bf61616f2</entry>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <entry name="uuid">b9ee5ebd-90a8-426a-b369-d38bf61616f2</entry>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </system>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <os>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   </os>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <features>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   </features>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/disk.config"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:ba:d5:b9"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <target dev="tap348c8bec-11"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/console.log" append="off"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <video>
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </video>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:05:42 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:05:42 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:05:42 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:05:42 compute-0 nova_compute[186544]: </domain>
Nov 22 08:05:42 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.741 186548 DEBUG nova.virt.libvirt.vif [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:04:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:05:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:ba:d5:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.742 186548 DEBUG nova.network.os_vif_util [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1436581558-network", "vif_mac": "fa:16:3e:ba:d5:b9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.742 186548 DEBUG nova.network.os_vif_util [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.743 186548 DEBUG os_vif [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.744 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.744 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.745 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.748 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.749 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap348c8bec-11, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.750 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap348c8bec-11, col_values=(('external_ids', {'iface-id': '348c8bec-11f0-4b6d-9dce-ae3c3f37efbc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:d5:b9', 'vm-uuid': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.751 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 NetworkManager[55036]: <info>  [1763798742.7528] manager: (tap348c8bec-11): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.755 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.759 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.760 186548 INFO os_vif [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11')
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.850 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.851 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.851 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No VIF found with MAC fa:16:3e:ba:d5:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.852 186548 INFO nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Using config drive
Nov 22 08:05:42 compute-0 kernel: tap348c8bec-11: entered promiscuous mode
Nov 22 08:05:42 compute-0 NetworkManager[55036]: <info>  [1763798742.9272] manager: (tap348c8bec-11): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.927 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 ovn_controller[94843]: 2025-11-22T08:05:42Z|00468|binding|INFO|Claiming lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for this chassis.
Nov 22 08:05:42 compute-0 ovn_controller[94843]: 2025-11-22T08:05:42Z|00469|binding|INFO|348c8bec-11f0-4b6d-9dce-ae3c3f37efbc: Claiming fa:16:3e:ba:d5:b9 10.100.0.14
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.936 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.941 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 nova_compute[186544]: 2025-11-22 08:05:42.957 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:42 compute-0 NetworkManager[55036]: <info>  [1763798742.9583] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Nov 22 08:05:42 compute-0 NetworkManager[55036]: <info>  [1763798742.9593] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Nov 22 08:05:42 compute-0 systemd-udevd[232658]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:05:42 compute-0 NetworkManager[55036]: <info>  [1763798742.9812] device (tap348c8bec-11): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:05:42 compute-0 NetworkManager[55036]: <info>  [1763798742.9820] device (tap348c8bec-11): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:05:42 compute-0 systemd-machined[152872]: New machine qemu-61-instance-00000067.
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.004 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.005 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.007 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.018 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1cde03a1-03f2-48e9-ad64-8c6fcd4e74e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.019 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.023 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.023 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5697cc9c-cbb0-46fc-ba36-58c51cdf1eaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.024 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2e978460-2c56-40e5-850e-39928bf84483]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.041 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[b16ecaac-a5d2-4434-a15d-0717238e03c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.066 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[37471534-423c-4d2b-9b9f-eb9ed78af70e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.091 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ac12065f-d5a3-44c8-81c0-34bb4d7e8403]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 systemd-udevd[232661]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:05:43 compute-0 systemd[1]: Started Virtual Machine qemu-61-instance-00000067.
Nov 22 08:05:43 compute-0 NetworkManager[55036]: <info>  [1763798743.1107] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/222)
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.109 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2a433df7-2d1d-4817-93c9-30141674373a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.144 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ba613d-c47d-48e2-8eda-460b9d4c1389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.150 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc6596d-eda0-4e78-91a1-fcce2ed927b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:43 compute-0 NetworkManager[55036]: <info>  [1763798743.1739] device (tap165f7f23-d0): carrier: link connected
Nov 22 08:05:43 compute-0 ovn_controller[94843]: 2025-11-22T08:05:43Z|00470|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc ovn-installed in OVS
Nov 22 08:05:43 compute-0 ovn_controller[94843]: 2025-11-22T08:05:43Z|00471|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc up in Southbound
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.177 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.178 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c42e0b13-0352-4c20-89ad-99dcb747fc47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.194 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[144fd15e-f4ec-47f6-bc61-69562260b40b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548381, 'reachable_time': 44919, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232692, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.211 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3656f3d7-5b74-4c54-88d4-d605379e3f64]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 548381, 'tstamp': 548381}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232693, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.229 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dc67e857-cd3c-4839-ac1a-2e14616541a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 147], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548381, 'reachable_time': 44919, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232694, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.258 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1b747fce-74ee-4ec0-b28b-f5163ff6b225]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.316 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[40bcd909-b253-4de0-82c0-e249d6f7dfb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.318 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.318 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.318 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.320 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:43 compute-0 NetworkManager[55036]: <info>  [1763798743.3208] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Nov 22 08:05:43 compute-0 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.323 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:43 compute-0 ovn_controller[94843]: 2025-11-22T08:05:43Z|00472|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.324 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.326 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.326 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[73c8466c-fae2-4915-ae39-bb01195de285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.327 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:05:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:43.328 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.338 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:43 compute-0 podman[232726]: 2025-11-22 08:05:43.752766687 +0000 UTC m=+0.094347059 container create 1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.776 186548 DEBUG nova.compute.manager [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.777 186548 DEBUG oslo_concurrency.lockutils [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.778 186548 DEBUG oslo_concurrency.lockutils [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.778 186548 DEBUG oslo_concurrency.lockutils [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.779 186548 DEBUG nova.compute.manager [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.780 186548 WARNING nova.compute.manager [req-a2c13540-8aca-46cb-b4a0-d54d130d3d9d req-9c3f53a5-7d56-488f-8ed5-b0bacbb2b0c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state active and task_state resize_finish.
Nov 22 08:05:43 compute-0 podman[232726]: 2025-11-22 08:05:43.686610734 +0000 UTC m=+0.028191126 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.787 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798743.787251, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.788 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Resumed (Lifecycle Event)
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.792 186548 DEBUG nova.compute.manager [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.797 186548 INFO nova.virt.libvirt.driver [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance running successfully.
Nov 22 08:05:43 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.802 186548 DEBUG nova.virt.libvirt.guest [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.803 186548 DEBUG nova.virt.libvirt.driver [None req-936ba395-1238-4587-be03-7e5ce78841eb b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 22 08:05:43 compute-0 systemd[1]: Started libpod-conmon-1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5.scope.
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.819 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.827 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:05:43 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:05:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b64edef82453c5179bc410198e136feff48c3eceb054e28e2866f1e381bd42d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:05:43 compute-0 podman[232726]: 2025-11-22 08:05:43.876508908 +0000 UTC m=+0.218089310 container init 1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 08:05:43 compute-0 podman[232726]: 2025-11-22 08:05:43.882406613 +0000 UTC m=+0.223986985 container start 1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.886 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.887 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798743.7917645, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.887 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Started (Lifecycle Event)
Nov 22 08:05:43 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[232748]: [NOTICE]   (232752) : New worker (232754) forked
Nov 22 08:05:43 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[232748]: [NOTICE]   (232752) : Loading success.
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.930 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:05:43 compute-0 nova_compute[186544]: 2025-11-22 08:05:43.934 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:05:44 compute-0 nova_compute[186544]: 2025-11-22 08:05:44.952 186548 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updated VIF entry in instance network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:05:44 compute-0 nova_compute[186544]: 2025-11-22 08:05:44.953 186548 DEBUG nova.network.neutron [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:05:44 compute-0 nova_compute[186544]: 2025-11-22 08:05:44.973 186548 DEBUG oslo_concurrency.lockutils [req-a3f9088e-e793-4130-b1cf-a6ff47644886 req-6addbf06-71a1-4380-b1e5-711c60f5cf97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:05:45 compute-0 podman[232765]: 2025-11-22 08:05:45.418398614 +0000 UTC m=+0.058392560 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public)
Nov 22 08:05:45 compute-0 podman[232764]: 2025-11-22 08:05:45.435959398 +0000 UTC m=+0.080029204 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:05:45 compute-0 nova_compute[186544]: 2025-11-22 08:05:45.763 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:46 compute-0 nova_compute[186544]: 2025-11-22 08:05:46.776 186548 DEBUG nova.compute.manager [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:46 compute-0 nova_compute[186544]: 2025-11-22 08:05:46.776 186548 DEBUG oslo_concurrency.lockutils [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:46 compute-0 nova_compute[186544]: 2025-11-22 08:05:46.776 186548 DEBUG oslo_concurrency.lockutils [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:46 compute-0 nova_compute[186544]: 2025-11-22 08:05:46.776 186548 DEBUG oslo_concurrency.lockutils [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:46 compute-0 nova_compute[186544]: 2025-11-22 08:05:46.777 186548 DEBUG nova.compute.manager [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:46 compute-0 nova_compute[186544]: 2025-11-22 08:05:46.777 186548 WARNING nova.compute.manager [req-d2b5973c-4308-442b-861d-bd5ab0820e43 req-64643887-6217-4c7f-b78e-01bedd2581ca 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state resized and task_state resize_reverting.
Nov 22 08:05:47 compute-0 nova_compute[186544]: 2025-11-22 08:05:47.243 186548 DEBUG nova.network.neutron [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Nov 22 08:05:47 compute-0 nova_compute[186544]: 2025-11-22 08:05:47.244 186548 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:05:47 compute-0 nova_compute[186544]: 2025-11-22 08:05:47.244 186548 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:05:47 compute-0 nova_compute[186544]: 2025-11-22 08:05:47.245 186548 DEBUG nova.network.neutron [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:05:47 compute-0 nova_compute[186544]: 2025-11-22 08:05:47.752 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:48 compute-0 nova_compute[186544]: 2025-11-22 08:05:48.678 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.200 186548 DEBUG nova.network.neutron [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.233 186548 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.249 186548 DEBUG nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Creating tmpfile /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2/tmpdhy9fzum to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Nov 22 08:05:50 compute-0 kernel: tap348c8bec-11 (unregistering): left promiscuous mode
Nov 22 08:05:50 compute-0 NetworkManager[55036]: <info>  [1763798750.2787] device (tap348c8bec-11): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:05:50 compute-0 ovn_controller[94843]: 2025-11-22T08:05:50Z|00473|binding|INFO|Releasing lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc from this chassis (sb_readonly=0)
Nov 22 08:05:50 compute-0 ovn_controller[94843]: 2025-11-22T08:05:50Z|00474|binding|INFO|Setting lport 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc down in Southbound
Nov 22 08:05:50 compute-0 ovn_controller[94843]: 2025-11-22T08:05:50Z|00475|binding|INFO|Removing iface tap348c8bec-11 ovn-installed in OVS
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.286 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.295 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:d5:b9 10.100.0.14'], port_security=['fa:16:3e:ba:d5:b9 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b9ee5ebd-90a8-426a-b369-d38bf61616f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.298 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.299 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.300 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d216fa8c-1fb5-486e-852d-6811a37ca71b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.301 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.301 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000067.scope: Deactivated successfully.
Nov 22 08:05:50 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000067.scope: Consumed 7.226s CPU time.
Nov 22 08:05:50 compute-0 systemd-machined[152872]: Machine qemu-61-instance-00000067 terminated.
Nov 22 08:05:50 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[232748]: [NOTICE]   (232752) : haproxy version is 2.8.14-c23fe91
Nov 22 08:05:50 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[232748]: [NOTICE]   (232752) : path to executable is /usr/sbin/haproxy
Nov 22 08:05:50 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[232748]: [WARNING]  (232752) : Exiting Master process...
Nov 22 08:05:50 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[232748]: [ALERT]    (232752) : Current worker (232754) exited with code 143 (Terminated)
Nov 22 08:05:50 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[232748]: [WARNING]  (232752) : All workers exited. Exiting... (0)
Nov 22 08:05:50 compute-0 systemd[1]: libpod-1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5.scope: Deactivated successfully.
Nov 22 08:05:50 compute-0 podman[232833]: 2025-11-22 08:05:50.505915415 +0000 UTC m=+0.121934168 container died 1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.513 186548 INFO nova.virt.libvirt.driver [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Instance destroyed successfully.
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.515 186548 DEBUG nova.objects.instance [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.551 186548 DEBUG nova.virt.libvirt.vif [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:04:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-316749730',display_name='tempest-ServerActionsTestJSON-server-316749730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-316749730',id=103,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:05:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-b7qa77dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:05:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=b9ee5ebd-90a8-426a-b369-d38bf61616f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.552 186548 DEBUG nova.network.os_vif_util [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.552 186548 DEBUG nova.network.os_vif_util [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.553 186548 DEBUG os_vif [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.554 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.555 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap348c8bec-11, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.556 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.558 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.561 186548 INFO os_vif [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:d5:b9,bridge_name='br-int',has_traffic_filtering=True,id=348c8bec-11f0-4b6d-9dce-ae3c3f37efbc,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap348c8bec-11')
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.561 186548 INFO nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Deleting instance files /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_del
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.567 186548 INFO nova.virt.libvirt.driver [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Deletion of /var/lib/nova/instances/b9ee5ebd-90a8-426a-b369-d38bf61616f2_del complete
Nov 22 08:05:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5-userdata-shm.mount: Deactivated successfully.
Nov 22 08:05:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-b64edef82453c5179bc410198e136feff48c3eceb054e28e2866f1e381bd42d4-merged.mount: Deactivated successfully.
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.650 186548 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.651 186548 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.683 186548 DEBUG nova.objects.instance [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid b9ee5ebd-90a8-426a-b369-d38bf61616f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.753 186548 DEBUG nova.scheduler.client.report [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.765 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 podman[232833]: 2025-11-22 08:05:50.783002679 +0000 UTC m=+0.399021402 container cleanup 1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:05:50 compute-0 systemd[1]: libpod-conmon-1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5.scope: Deactivated successfully.
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.792 186548 DEBUG nova.scheduler.client.report [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.793 186548 DEBUG nova.compute.provider_tree [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.824 186548 DEBUG nova.scheduler.client.report [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.880 186548 DEBUG nova.scheduler.client.report [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:05:50 compute-0 podman[232881]: 2025-11-22 08:05:50.921969676 +0000 UTC m=+0.119376815 container remove 1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.926 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9d5974fa-13dd-44eb-98d6-ce5cde26ab9a]: (4, ('Sat Nov 22 08:05:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5)\n1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5\nSat Nov 22 08:05:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5)\n1083d3a9b89e49a0fb5c9423e7cf90aecceee5778aae18aefc37ad12d2abcbd5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.928 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[97a03958-3bc8-4317-a945-1c730773593d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.929 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.932 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.934 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7e47db80-9b3c-44c9-85e3-c4e0f4abe4f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:50 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.944 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.955 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe134bc-2929-4992-b56b-95edc4286b8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.957 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c975c99-6488-4295-be55-bc24fe16e59e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.974 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4fd091-8f90-44f3-88ea-98076bfeae32]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 548372, 'reachable_time': 39592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232896, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:50 compute-0 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.977 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:05:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:05:50.978 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[cec490a7-f786-4660-b5af-4192ca458b8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:50.999 186548 DEBUG nova.compute.provider_tree [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.012 186548 DEBUG nova.compute.manager [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.012 186548 DEBUG oslo_concurrency.lockutils [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.012 186548 DEBUG oslo_concurrency.lockutils [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.013 186548 DEBUG oslo_concurrency.lockutils [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.013 186548 DEBUG nova.compute.manager [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.013 186548 WARNING nova.compute.manager [req-6f9507af-f880-4cae-ad1c-9551a772eee8 req-ef9f1c42-008e-4129-8a1f-7dcda22c0f31 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-unplugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state resized and task_state resize_reverting.
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.015 186548 DEBUG nova.scheduler.client.report [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.119 186548 DEBUG oslo_concurrency.lockutils [None req-07b57415-623c-44dd-937e-718559e07fa0 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.215 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.216 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.216 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.216 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.413 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.414 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5676MB free_disk=73.21078109741211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.414 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.414 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.562 186548 WARNING nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance b9ee5ebd-90a8-426a-b369-d38bf61616f2 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}.
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.563 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.563 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.604 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.622 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.661 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:05:51 compute-0 nova_compute[186544]: 2025-11-22 08:05:51.663 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:52 compute-0 nova_compute[186544]: 2025-11-22 08:05:52.665 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:52 compute-0 nova_compute[186544]: 2025-11-22 08:05:52.665 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:05:53 compute-0 nova_compute[186544]: 2025-11-22 08:05:53.411 186548 DEBUG nova.compute.manager [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:53 compute-0 nova_compute[186544]: 2025-11-22 08:05:53.411 186548 DEBUG oslo_concurrency.lockutils [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:53 compute-0 nova_compute[186544]: 2025-11-22 08:05:53.412 186548 DEBUG oslo_concurrency.lockutils [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:53 compute-0 nova_compute[186544]: 2025-11-22 08:05:53.412 186548 DEBUG oslo_concurrency.lockutils [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:53 compute-0 nova_compute[186544]: 2025-11-22 08:05:53.412 186548 DEBUG nova.compute.manager [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:53 compute-0 nova_compute[186544]: 2025-11-22 08:05:53.412 186548 WARNING nova.compute.manager [req-d7e96f00-aaa2-4ec0-bbe1-cf835a6c6bb1 req-9043f6fe-5e6e-4a8c-8e62-0cf24f3f3fdc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state resized and task_state resize_reverting.
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.168 186548 DEBUG nova.compute.manager [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.168 186548 DEBUG nova.compute.manager [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing instance network info cache due to event network-changed-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.169 186548 DEBUG oslo_concurrency.lockutils [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.169 186548 DEBUG oslo_concurrency.lockutils [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.169 186548 DEBUG nova.network.neutron [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Refreshing network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.184 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:05:54 compute-0 nova_compute[186544]: 2025-11-22 08:05:54.185 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:55 compute-0 nova_compute[186544]: 2025-11-22 08:05:55.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:55 compute-0 nova_compute[186544]: 2025-11-22 08:05:55.558 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:55 compute-0 nova_compute[186544]: 2025-11-22 08:05:55.615 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:55 compute-0 nova_compute[186544]: 2025-11-22 08:05:55.768 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:05:56 compute-0 nova_compute[186544]: 2025-11-22 08:05:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:57 compute-0 nova_compute[186544]: 2025-11-22 08:05:57.185 186548 DEBUG nova.network.neutron [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updated VIF entry in instance network info cache for port 348c8bec-11f0-4b6d-9dce-ae3c3f37efbc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:05:57 compute-0 nova_compute[186544]: 2025-11-22 08:05:57.186 186548 DEBUG nova.network.neutron [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Updating instance_info_cache with network_info: [{"id": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "address": "fa:16:3e:ba:d5:b9", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap348c8bec-11", "ovs_interfaceid": "348c8bec-11f0-4b6d-9dce-ae3c3f37efbc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:05:57 compute-0 nova_compute[186544]: 2025-11-22 08:05:57.206 186548 DEBUG oslo_concurrency.lockutils [req-5dc0030f-c007-4382-a632-8b510e9b6295 req-ce43b874-f9db-4197-b487-7b7996cd2215 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b9ee5ebd-90a8-426a-b369-d38bf61616f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:05:58 compute-0 nova_compute[186544]: 2025-11-22 08:05:58.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:05:59 compute-0 nova_compute[186544]: 2025-11-22 08:05:59.224 186548 DEBUG nova.compute.manager [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:05:59 compute-0 nova_compute[186544]: 2025-11-22 08:05:59.225 186548 DEBUG oslo_concurrency.lockutils [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:05:59 compute-0 nova_compute[186544]: 2025-11-22 08:05:59.225 186548 DEBUG oslo_concurrency.lockutils [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:05:59 compute-0 nova_compute[186544]: 2025-11-22 08:05:59.226 186548 DEBUG oslo_concurrency.lockutils [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b9ee5ebd-90a8-426a-b369-d38bf61616f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:05:59 compute-0 nova_compute[186544]: 2025-11-22 08:05:59.226 186548 DEBUG nova.compute.manager [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] No waiting events found dispatching network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:05:59 compute-0 nova_compute[186544]: 2025-11-22 08:05:59.226 186548 WARNING nova.compute.manager [req-480b6606-b3c7-4d65-9e50-cfde6d2d06c5 req-6ebe1c6f-02f2-417f-be22-f349caf7aa2c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Received unexpected event network-vif-plugged-348c8bec-11f0-4b6d-9dce-ae3c3f37efbc for instance with vm_state resized and task_state resize_reverting.
Nov 22 08:06:00 compute-0 nova_compute[186544]: 2025-11-22 08:06:00.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:00 compute-0 podman[232898]: 2025-11-22 08:06:00.439506341 +0000 UTC m=+0.087294624 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:06:00 compute-0 podman[232899]: 2025-11-22 08:06:00.439597624 +0000 UTC m=+0.084859555 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:06:00 compute-0 nova_compute[186544]: 2025-11-22 08:06:00.559 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:00 compute-0 nova_compute[186544]: 2025-11-22 08:06:00.770 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:02 compute-0 nova_compute[186544]: 2025-11-22 08:06:02.116 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:02 compute-0 nova_compute[186544]: 2025-11-22 08:06:02.346 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:03 compute-0 podman[232944]: 2025-11-22 08:06:03.392059678 +0000 UTC m=+0.044937919 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:06:03 compute-0 podman[232945]: 2025-11-22 08:06:03.395867712 +0000 UTC m=+0.044618401 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:06:05 compute-0 nova_compute[186544]: 2025-11-22 08:06:05.512 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798750.5114865, b9ee5ebd-90a8-426a-b369-d38bf61616f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:06:05 compute-0 nova_compute[186544]: 2025-11-22 08:06:05.512 186548 INFO nova.compute.manager [-] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] VM Stopped (Lifecycle Event)
Nov 22 08:06:05 compute-0 nova_compute[186544]: 2025-11-22 08:06:05.556 186548 DEBUG nova.compute.manager [None req-c2ad6325-644e-4e31-aa56-c54f697c3dce - - - - - -] [instance: b9ee5ebd-90a8-426a-b369-d38bf61616f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:05 compute-0 nova_compute[186544]: 2025-11-22 08:06:05.560 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:05 compute-0 nova_compute[186544]: 2025-11-22 08:06:05.771 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:06 compute-0 nova_compute[186544]: 2025-11-22 08:06:06.981 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:06.982 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:06:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:06.984 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:06:10 compute-0 nova_compute[186544]: 2025-11-22 08:06:10.563 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:10 compute-0 nova_compute[186544]: 2025-11-22 08:06:10.773 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:10.986 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:11 compute-0 podman[232988]: 2025-11-22 08:06:11.405619964 +0000 UTC m=+0.056343052 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.219 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.220 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.235 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.471 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.471 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.477 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.477 186548 INFO nova.compute.claims [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.646 186548 DEBUG nova.compute.provider_tree [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.662 186548 DEBUG nova.scheduler.client.report [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.685 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.685 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.741 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.742 186548 DEBUG nova.network.neutron [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.775 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.823 186548 INFO nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.846 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.959 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.961 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.961 186548 INFO nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Creating image(s)
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.962 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.962 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.963 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:15 compute-0 nova_compute[186544]: 2025-11-22 08:06:15.978 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.036 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.037 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.038 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.050 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.110 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.111 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.167 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.168 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.169 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.188 186548 DEBUG nova.policy [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.229 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.230 186548 DEBUG nova.virt.disk.api [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.230 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.294 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.296 186548 DEBUG nova.virt.disk.api [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.296 186548 DEBUG nova.objects.instance [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.316 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.317 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Ensure instance console log exists: /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.317 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.318 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:16 compute-0 nova_compute[186544]: 2025-11-22 08:06:16.318 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:16 compute-0 podman[233025]: 2025-11-22 08:06:16.399594996 +0000 UTC m=+0.051238544 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 22 08:06:16 compute-0 podman[233024]: 2025-11-22 08:06:16.40012569 +0000 UTC m=+0.053774108 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:06:17 compute-0 nova_compute[186544]: 2025-11-22 08:06:17.681 186548 DEBUG nova.network.neutron [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Successfully created port: 480abb3e-15cf-4910-b471-7667ee6c50c8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:06:19 compute-0 nova_compute[186544]: 2025-11-22 08:06:19.444 186548 DEBUG nova.network.neutron [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Successfully updated port: 480abb3e-15cf-4910-b471-7667ee6c50c8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:06:19 compute-0 nova_compute[186544]: 2025-11-22 08:06:19.459 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:06:19 compute-0 nova_compute[186544]: 2025-11-22 08:06:19.460 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:06:19 compute-0 nova_compute[186544]: 2025-11-22 08:06:19.460 186548 DEBUG nova.network.neutron [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:06:19 compute-0 nova_compute[186544]: 2025-11-22 08:06:19.547 186548 DEBUG nova.compute.manager [req-f1785904-0cdc-46f3-8a9e-cd2bae250408 req-039460a2-91b9-49e2-b2b4-a936fd2355e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-changed-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:06:19 compute-0 nova_compute[186544]: 2025-11-22 08:06:19.547 186548 DEBUG nova.compute.manager [req-f1785904-0cdc-46f3-8a9e-cd2bae250408 req-039460a2-91b9-49e2-b2b4-a936fd2355e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Refreshing instance network info cache due to event network-changed-480abb3e-15cf-4910-b471-7667ee6c50c8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:06:19 compute-0 nova_compute[186544]: 2025-11-22 08:06:19.547 186548 DEBUG oslo_concurrency.lockutils [req-f1785904-0cdc-46f3-8a9e-cd2bae250408 req-039460a2-91b9-49e2-b2b4-a936fd2355e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:06:19 compute-0 nova_compute[186544]: 2025-11-22 08:06:19.697 186548 DEBUG nova.network.neutron [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:06:20 compute-0 nova_compute[186544]: 2025-11-22 08:06:20.569 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:20 compute-0 nova_compute[186544]: 2025-11-22 08:06:20.778 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.154 186548 DEBUG nova.network.neutron [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updating instance_info_cache with network_info: [{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.232 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.232 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance network_info: |[{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.232 186548 DEBUG oslo_concurrency.lockutils [req-f1785904-0cdc-46f3-8a9e-cd2bae250408 req-039460a2-91b9-49e2-b2b4-a936fd2355e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.233 186548 DEBUG nova.network.neutron [req-f1785904-0cdc-46f3-8a9e-cd2bae250408 req-039460a2-91b9-49e2-b2b4-a936fd2355e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Refreshing network info cache for port 480abb3e-15cf-4910-b471-7667ee6c50c8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.236 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Start _get_guest_xml network_info=[{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.240 186548 WARNING nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.244 186548 DEBUG nova.virt.libvirt.host [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.244 186548 DEBUG nova.virt.libvirt.host [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.247 186548 DEBUG nova.virt.libvirt.host [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.248 186548 DEBUG nova.virt.libvirt.host [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.249 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.249 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.249 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.250 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.250 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.250 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.250 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.250 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.251 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.251 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.251 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.251 186548 DEBUG nova.virt.hardware [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.255 186548 DEBUG nova.virt.libvirt.vif [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1640453793',display_name='tempest-ServerActionsTestJSON-server-1640453793',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1640453793',id=106,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-tvm9f4y6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=1fe5c7b8-8a9f-4b64-bb69-822a770565f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.255 186548 DEBUG nova.network.os_vif_util [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.256 186548 DEBUG nova.network.os_vif_util [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.257 186548 DEBUG nova.objects.instance [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.273 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <uuid>1fe5c7b8-8a9f-4b64-bb69-822a770565f6</uuid>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <name>instance-0000006a</name>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestJSON-server-1640453793</nova:name>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:06:21</nova:creationTime>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:06:21 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:06:21 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:06:21 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:06:21 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:06:21 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:06:21 compute-0 nova_compute[186544]:         <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 08:06:21 compute-0 nova_compute[186544]:         <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:06:21 compute-0 nova_compute[186544]:         <nova:port uuid="480abb3e-15cf-4910-b471-7667ee6c50c8">
Nov 22 08:06:21 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <system>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <entry name="serial">1fe5c7b8-8a9f-4b64-bb69-822a770565f6</entry>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <entry name="uuid">1fe5c7b8-8a9f-4b64-bb69-822a770565f6</entry>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </system>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <os>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   </os>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <features>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   </features>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.config"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d0:87:85"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <target dev="tap480abb3e-15"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/console.log" append="off"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <video>
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </video>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:06:21 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:06:21 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:06:21 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:06:21 compute-0 nova_compute[186544]: </domain>
Nov 22 08:06:21 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.274 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Preparing to wait for external event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.274 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.274 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.274 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.275 186548 DEBUG nova.virt.libvirt.vif [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1640453793',display_name='tempest-ServerActionsTestJSON-server-1640453793',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1640453793',id=106,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-tvm9f4y6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=1fe5c7b8-8a9f-4b64-bb69-822a770565f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.275 186548 DEBUG nova.network.os_vif_util [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.276 186548 DEBUG nova.network.os_vif_util [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.277 186548 DEBUG os_vif [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.277 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.278 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.278 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.280 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.281 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap480abb3e-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.281 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap480abb3e-15, col_values=(('external_ids', {'iface-id': '480abb3e-15cf-4910-b471-7667ee6c50c8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:87:85', 'vm-uuid': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:21 compute-0 NetworkManager[55036]: <info>  [1763798781.2836] manager: (tap480abb3e-15): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.284 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.288 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.289 186548 INFO os_vif [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15')
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.358 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.359 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.359 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] No VIF found with MAC fa:16:3e:d0:87:85, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.360 186548 INFO nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Using config drive
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.885 186548 INFO nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Creating config drive at /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.config
Nov 22 08:06:21 compute-0 nova_compute[186544]: 2025-11-22 08:06:21.892 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpthvpe80g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.018 186548 DEBUG oslo_concurrency.processutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpthvpe80g" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:22 compute-0 kernel: tap480abb3e-15: entered promiscuous mode
Nov 22 08:06:22 compute-0 NetworkManager[55036]: <info>  [1763798782.0760] manager: (tap480abb3e-15): new Tun device (/org/freedesktop/NetworkManager/Devices/225)
Nov 22 08:06:22 compute-0 ovn_controller[94843]: 2025-11-22T08:06:22Z|00476|binding|INFO|Claiming lport 480abb3e-15cf-4910-b471-7667ee6c50c8 for this chassis.
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.077 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:22 compute-0 ovn_controller[94843]: 2025-11-22T08:06:22Z|00477|binding|INFO|480abb3e-15cf-4910-b471-7667ee6c50c8: Claiming fa:16:3e:d0:87:85 10.100.0.11
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.093 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:87:85 10.100.0.11'], port_security=['fa:16:3e:d0:87:85 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=480abb3e-15cf-4910-b471-7667ee6c50c8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.094 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 480abb3e-15cf-4910-b471-7667ee6c50c8 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.095 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.106 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b0754b96-cb64-4511-bc27-d9f93c0bd81e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.106 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:06:22 compute-0 systemd-udevd[233088]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.108 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.108 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd8e998-8770-4860-ac04-35d4fca46f9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.109 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[410c1d71-c536-44a3-af5c-16ae3bd17f3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 systemd-machined[152872]: New machine qemu-62-instance-0000006a.
Nov 22 08:06:22 compute-0 NetworkManager[55036]: <info>  [1763798782.1203] device (tap480abb3e-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:06:22 compute-0 NetworkManager[55036]: <info>  [1763798782.1220] device (tap480abb3e-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.120 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[59ca5dc4-9cb7-40da-b070-f82ef5cc0864]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.135 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0706faef-7ecb-44ae-9836-7bcb65a019d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 systemd[1]: Started Virtual Machine qemu-62-instance-0000006a.
Nov 22 08:06:22 compute-0 ovn_controller[94843]: 2025-11-22T08:06:22Z|00478|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 ovn-installed in OVS
Nov 22 08:06:22 compute-0 ovn_controller[94843]: 2025-11-22T08:06:22Z|00479|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 up in Southbound
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.144 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.164 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[80eff8b8-496d-42e9-bc53-99206a853f02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.169 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e5e39c-a5ea-48ac-977f-03fa7393be83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 NetworkManager[55036]: <info>  [1763798782.1708] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/226)
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.204 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb54d90-ad76-41ff-aaf3-5b7eb4e52a90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.207 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c81326da-8bbe-44d6-82a4-923ec9160e8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 NetworkManager[55036]: <info>  [1763798782.2292] device (tap165f7f23-d0): carrier: link connected
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.233 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5db6b31c-c2ac-4a7e-949c-bccf513d3523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.251 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[041a8ba4-793d-4bc7-87d5-7c8e6597abce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552286, 'reachable_time': 19617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233120, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.266 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2f5a181d-eb95-45c3-83b7-657c643830a9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552286, 'tstamp': 552286}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233121, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.282 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[93e2ca54-d28f-4fde-8837-1fc506f65a48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552286, 'reachable_time': 19617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233122, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.309 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[831918c6-584a-4a42-8a45-374c50924346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.361 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[162c7d54-62e2-47c9-a984-fd685793c284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.363 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.363 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.363 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:22 compute-0 NetworkManager[55036]: <info>  [1763798782.3660] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Nov 22 08:06:22 compute-0 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.365 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.367 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.369 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.370 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:22 compute-0 ovn_controller[94843]: 2025-11-22T08:06:22Z|00480|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.371 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.373 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.374 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[592e1221-68d2-40a9-b482-255c59495655]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.375 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:06:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:22.375 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.382 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.419 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798782.4195037, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.420 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Started (Lifecycle Event)
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.440 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.445 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798782.4196923, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.445 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Paused (Lifecycle Event)
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.461 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.464 186548 DEBUG nova.compute.manager [req-4280900e-7dc4-41b7-9907-92b974f844bb req-1321e910-cad4-4bef-b951-e07ffd1cafcc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.464 186548 DEBUG oslo_concurrency.lockutils [req-4280900e-7dc4-41b7-9907-92b974f844bb req-1321e910-cad4-4bef-b951-e07ffd1cafcc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:22 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.464 186548 DEBUG oslo_concurrency.lockutils [req-4280900e-7dc4-41b7-9907-92b974f844bb req-1321e910-cad4-4bef-b951-e07ffd1cafcc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:22 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.465 186548 DEBUG oslo_concurrency.lockutils [req-4280900e-7dc4-41b7-9907-92b974f844bb req-1321e910-cad4-4bef-b951-e07ffd1cafcc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.465 186548 DEBUG nova.compute.manager [req-4280900e-7dc4-41b7-9907-92b974f844bb req-1321e910-cad4-4bef-b951-e07ffd1cafcc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Processing event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.466 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.468 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.479 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.483 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance spawned successfully.
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.483 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.500 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.500 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798782.472949, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.500 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Resumed (Lifecycle Event)
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.512 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.512 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.513 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.513 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.513 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.514 186548 DEBUG nova.virt.libvirt.driver [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.519 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.522 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.561 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.629 186548 INFO nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Took 6.67 seconds to spawn the instance on the hypervisor.
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.630 186548 DEBUG nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.705 186548 INFO nova.compute.manager [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Took 7.39 seconds to build instance.
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.732 186548 DEBUG oslo_concurrency.lockutils [None req-4ca82a24-bf21-4871-b6e1-d051c3a9d1b4 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:22 compute-0 podman[233162]: 2025-11-22 08:06:22.717416479 +0000 UTC m=+0.026734270 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:06:22 compute-0 podman[233162]: 2025-11-22 08:06:22.834974228 +0000 UTC m=+0.144291989 container create 162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:06:22 compute-0 systemd[1]: Started libpod-conmon-162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe.scope.
Nov 22 08:06:22 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:06:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38f5f74cb44622d3e39dce251e1a53ebee23dd6195313051a58786b1cfc3118c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:06:22 compute-0 podman[233162]: 2025-11-22 08:06:22.939574977 +0000 UTC m=+0.248892758 container init 162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 08:06:22 compute-0 podman[233162]: 2025-11-22 08:06:22.945808122 +0000 UTC m=+0.255125883 container start 162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:06:22 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233177]: [NOTICE]   (233181) : New worker (233183) forked
Nov 22 08:06:22 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233177]: [NOTICE]   (233181) : Loading success.
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.969 186548 DEBUG nova.network.neutron [req-f1785904-0cdc-46f3-8a9e-cd2bae250408 req-039460a2-91b9-49e2-b2b4-a936fd2355e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updated VIF entry in instance network info cache for port 480abb3e-15cf-4910-b471-7667ee6c50c8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.971 186548 DEBUG nova.network.neutron [req-f1785904-0cdc-46f3-8a9e-cd2bae250408 req-039460a2-91b9-49e2-b2b4-a936fd2355e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updating instance_info_cache with network_info: [{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:06:22 compute-0 nova_compute[186544]: 2025-11-22 08:06:22.990 186548 DEBUG oslo_concurrency.lockutils [req-f1785904-0cdc-46f3-8a9e-cd2bae250408 req-039460a2-91b9-49e2-b2b4-a936fd2355e6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:06:24 compute-0 nova_compute[186544]: 2025-11-22 08:06:24.595 186548 DEBUG nova.compute.manager [req-3d377544-3171-45e6-a29d-15a449cf4fa1 req-4600ddea-6797-4568-bdd6-87e2b774d1d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:06:24 compute-0 nova_compute[186544]: 2025-11-22 08:06:24.596 186548 DEBUG oslo_concurrency.lockutils [req-3d377544-3171-45e6-a29d-15a449cf4fa1 req-4600ddea-6797-4568-bdd6-87e2b774d1d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:24 compute-0 nova_compute[186544]: 2025-11-22 08:06:24.596 186548 DEBUG oslo_concurrency.lockutils [req-3d377544-3171-45e6-a29d-15a449cf4fa1 req-4600ddea-6797-4568-bdd6-87e2b774d1d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:24 compute-0 nova_compute[186544]: 2025-11-22 08:06:24.597 186548 DEBUG oslo_concurrency.lockutils [req-3d377544-3171-45e6-a29d-15a449cf4fa1 req-4600ddea-6797-4568-bdd6-87e2b774d1d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:24 compute-0 nova_compute[186544]: 2025-11-22 08:06:24.597 186548 DEBUG nova.compute.manager [req-3d377544-3171-45e6-a29d-15a449cf4fa1 req-4600ddea-6797-4568-bdd6-87e2b774d1d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:06:24 compute-0 nova_compute[186544]: 2025-11-22 08:06:24.597 186548 WARNING nova.compute.manager [req-3d377544-3171-45e6-a29d-15a449cf4fa1 req-4600ddea-6797-4568-bdd6-87e2b774d1d0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state active and task_state None.
Nov 22 08:06:25 compute-0 NetworkManager[55036]: <info>  [1763798785.4082] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Nov 22 08:06:25 compute-0 NetworkManager[55036]: <info>  [1763798785.4092] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.407 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.512 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:25 compute-0 ovn_controller[94843]: 2025-11-22T08:06:25Z|00481|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.529 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.779 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.862 186548 DEBUG nova.compute.manager [req-41f8aa27-c871-453e-a5eb-2964659ba3df req-bfd4be54-0af5-4102-8e12-5dfaca6e488d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-changed-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.863 186548 DEBUG nova.compute.manager [req-41f8aa27-c871-453e-a5eb-2964659ba3df req-bfd4be54-0af5-4102-8e12-5dfaca6e488d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Refreshing instance network info cache due to event network-changed-480abb3e-15cf-4910-b471-7667ee6c50c8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.863 186548 DEBUG oslo_concurrency.lockutils [req-41f8aa27-c871-453e-a5eb-2964659ba3df req-bfd4be54-0af5-4102-8e12-5dfaca6e488d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.863 186548 DEBUG oslo_concurrency.lockutils [req-41f8aa27-c871-453e-a5eb-2964659ba3df req-bfd4be54-0af5-4102-8e12-5dfaca6e488d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:06:25 compute-0 nova_compute[186544]: 2025-11-22 08:06:25.863 186548 DEBUG nova.network.neutron [req-41f8aa27-c871-453e-a5eb-2964659ba3df req-bfd4be54-0af5-4102-8e12-5dfaca6e488d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Refreshing network info cache for port 480abb3e-15cf-4910-b471-7667ee6c50c8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:06:26 compute-0 nova_compute[186544]: 2025-11-22 08:06:26.283 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:27 compute-0 nova_compute[186544]: 2025-11-22 08:06:27.506 186548 DEBUG nova.network.neutron [req-41f8aa27-c871-453e-a5eb-2964659ba3df req-bfd4be54-0af5-4102-8e12-5dfaca6e488d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updated VIF entry in instance network info cache for port 480abb3e-15cf-4910-b471-7667ee6c50c8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:06:27 compute-0 nova_compute[186544]: 2025-11-22 08:06:27.507 186548 DEBUG nova.network.neutron [req-41f8aa27-c871-453e-a5eb-2964659ba3df req-bfd4be54-0af5-4102-8e12-5dfaca6e488d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updating instance_info_cache with network_info: [{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:06:27 compute-0 nova_compute[186544]: 2025-11-22 08:06:27.534 186548 DEBUG oslo_concurrency.lockutils [req-41f8aa27-c871-453e-a5eb-2964659ba3df req-bfd4be54-0af5-4102-8e12-5dfaca6e488d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:06:27 compute-0 ovn_controller[94843]: 2025-11-22T08:06:27Z|00482|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:06:27 compute-0 nova_compute[186544]: 2025-11-22 08:06:27.645 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:28 compute-0 nova_compute[186544]: 2025-11-22 08:06:28.630 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:30 compute-0 nova_compute[186544]: 2025-11-22 08:06:30.782 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:31 compute-0 nova_compute[186544]: 2025-11-22 08:06:31.285 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:31 compute-0 podman[233193]: 2025-11-22 08:06:31.435899227 +0000 UTC m=+0.082209468 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:06:31 compute-0 podman[233194]: 2025-11-22 08:06:31.444431728 +0000 UTC m=+0.086425743 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 08:06:32 compute-0 nova_compute[186544]: 2025-11-22 08:06:32.383 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:34 compute-0 podman[233236]: 2025-11-22 08:06:34.402657435 +0000 UTC m=+0.052557178 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:06:34 compute-0 podman[233237]: 2025-11-22 08:06:34.403784982 +0000 UTC m=+0.050558998 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:06:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:35.143 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:28:f1 10.100.0.2 2001:db8::f816:3eff:fe1e:28f1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe1e:28f1/64', 'neutron:device_id': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9c41f1e-b11e-4868-a3a0-70214f7435c4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0abd56a4-3e9e-4d28-8383-eadcda41744d) old=Port_Binding(mac=['fa:16:3e:1e:28:f1 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:06:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:35.145 103805 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0abd56a4-3e9e-4d28-8383-eadcda41744d in datapath 90da6fca-65d1-4012-9602-d88842a0ad0e updated
Nov 22 08:06:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:35.146 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90da6fca-65d1-4012-9602-d88842a0ad0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:06:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:35.148 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b00e26c6-5b52-40c2-9e28-57286820760a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:35 compute-0 nova_compute[186544]: 2025-11-22 08:06:35.785 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:36 compute-0 nova_compute[186544]: 2025-11-22 08:06:36.288 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:36 compute-0 nova_compute[186544]: 2025-11-22 08:06:36.576 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.597 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'name': 'tempest-ServerActionsTestJSON-server-1640453793', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'hostId': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.598 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.602 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 / tap480abb3e-15 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.602 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91ef4bdd-0f70-4dec-a269-65a7ff57127c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.598461', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ac5bf1c-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': '232fa3aa10ee8871b4b9dfd8ead3fe5492c1b81fa5ce78e3162d1d3fa1c797a7'}]}, 'timestamp': '2025-11-22 08:06:36.602768', '_unique_id': 'df3dd97e96a94e5ba413b926f9727a7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.631 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.write.bytes volume: 25624576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.631 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60a01c9a-6790-47bc-8e37-2fd78863dd7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25624576, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.605067', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2aca3268-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': 'ae50a63a036f128b08d1b6d3389b4add308ff03de9b21b694c4967e647f0ffaf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.605067', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2aca3fb0-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': '69411a044df194d24aeb2edb6eef523f84c1d648cf386085b3bd5528f2d85bdf'}]}, 'timestamp': '2025-11-22 08:06:36.632211', '_unique_id': '30c0770b13774bce946f6f731dec65f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.633 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.651 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/cpu volume: 12340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c4fda68-f1d5-4f9a-b653-02733ba7c48a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12340000000, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'timestamp': '2025-11-22T08:06:36.634040', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2acd41c4-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.343093704, 'message_signature': '18187cd18930655e45c3210878e0807e953e9096e538e9c370b5e9055d5d941e'}]}, 'timestamp': '2025-11-22 08:06:36.651993', '_unique_id': '226de164f57244a5a37c3baca82ce34a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.653 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.654 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.654 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.read.bytes volume: 26529280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.654 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.read.bytes volume: 55474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fa79c65-61cc-43e7-9477-014ab73f7191', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 26529280, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.654183', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2acda844-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': '2c20a1a40b5d1085d5c2a9f1fec877d82442d61c137eccd283f7c64b97e7b28e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55474, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 
'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.654183', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2acdb3b6-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': '6c8cbd8b9c2e187d3ca7fac72d8660e48ec5a32b37de5499a749a780f41d2214'}]}, 'timestamp': '2025-11-22 08:06:36.654791', '_unique_id': 'e95c662cbb8f4bba9f1b0a8f1e174bba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.655 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.656 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.667 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.allocation volume: 28844032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.667 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dc0fb31-ce15-455e-b3ba-9ad99aed552b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28844032, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.656135', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2acfb1b6-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.347931983, 'message_signature': 'd8a0dc182c228a8a6412dee0093d06879733eb3c39939a37ee860e29f6ece3cf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 
'1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.656135', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2acfbc74-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.347931983, 'message_signature': '2f5817b875ac6ff1f85b126d11e2763f538aea1d6c927e378ff6b2424c9273dc'}]}, 'timestamp': '2025-11-22 08:06:36.668152', '_unique_id': '3762375db24e4e3aa434591a04053562'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.669 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.670 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1640453793>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1640453793>]
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.670 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.write.requests volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.670 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c801d09-73e1-4bd5-8b76-b4701bc78611', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 266, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.670376', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad01e58-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': '1476a634d7fbfe0777aa3afa0cd8ea3798842ae3d303340fc0e5c3399652af4c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': 
None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.670376', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad02768-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': '3b8c1a32cbeaaeb9fdc7e4d07f73221daffbba0f0bf961d8cc21b772a2b7f03a'}]}, 'timestamp': '2025-11-22 08:06:36.670859', '_unique_id': '0d407353c6874374b5e7f54d4e01ed7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.672 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.672 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8dd6ffe7-5677-40ec-a22a-5e9afa59670b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.672254', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad06796-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.347931983, 'message_signature': '1ecf724c9441ff3423c251f50898170f34ff9a3da3f1d9b1afc2787cce4d61f5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.672254', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad06fde-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.347931983, 'message_signature': '8c25c6de2e75ac5e2f67dfc43ac1ab68db0f1ce4b0c97fe70423eefc923f63a8'}]}, 'timestamp': '2025-11-22 08:06:36.672707', '_unique_id': '3802351f42be4ca1965902e6c4e125b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.673 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9701fb3-a3a6-4ea8-8166-d09d70d98728', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.673968', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad0ab20-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': '4ea4c4769e2ddf52175a04b29c66b95f1574d0eab3d83fe2c649910709544582'}]}, 'timestamp': '2025-11-22 08:06:36.674259', '_unique_id': '8b4585e1080843b8a55aacd3c837935a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.674 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.675 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c6ff071-aae7-4a0d-aa78-b369c15f13bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.675621', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad0ead6-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': 'c3c6294916a4c2dac3e7c61ca4a9a1c52ae8cb4f71db7309d8fb5f61af9ef15f'}]}, 'timestamp': '2025-11-22 08:06:36.675880', '_unique_id': 'eca7287c118248ebacc6c3b794975d84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2814c3d1-fa06-4c95-82fe-b34db7b7d651', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.677039', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad1219a-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': 'd87cdd5394b683b5dfdde214afa2b7d1418c4520648109eebfba583fff6b21b1'}]}, 'timestamp': '2025-11-22 08:06:36.677289', '_unique_id': 'cdaed547512f4cd8bacd3afe7b08812d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.678 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.678 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1640453793>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1640453793>]
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.678 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.read.latency volume: 1262001535 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.678 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.read.latency volume: 90983752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06c1767c-ea51-4970-a121-8dfaa34d033d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1262001535, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.678681', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad161be-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': '7ff0098696ab55ed810f884a55f9e68096d77cdcb8b86860e5a90f0831c7bce0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 90983752, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.678681', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad169ca-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': '90a6b6115b63f74b3bacb305973f057f88e1ac91a2fb8e65f009034a977a1332'}]}, 'timestamp': '2025-11-22 08:06:36.679109', '_unique_id': '017fd6c888564868a0a8fedc732c5129'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.680 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.680 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1640453793>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1640453793>]
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.680 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23fef8f6-e0ac-4e37-b554-40c5a37f7424', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.680674', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad1afa2-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': 'c7a9418d4ebd82478bff705b99d0ef04ace4e0bcd892458f84623706843413f0'}]}, 'timestamp': '2025-11-22 08:06:36.680911', '_unique_id': 'f5633b862e354a3697161fba5667f034'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.682 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.682 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae81b185-c1ea-4693-af2c-f9523374848c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.682510', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad1f76e-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': '03b8f799dc601eba64fce4395498021d4a8fc4ac600d24bdd8d21b28add50e66'}]}, 'timestamp': '2025-11-22 08:06:36.682752', '_unique_id': '74926f5acc8641f0a04d80bb3ce62541'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.683 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27fbe134-4a26-436a-8543-e640b8242a11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.683887', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad22d24-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': 'd1e196ce2bd3f47ffa448dd33ce78c27fa477833b8b7f8a69877857c5f374a79'}]}, 'timestamp': '2025-11-22 08:06:36.684153', '_unique_id': '87ec88834d28496ab3fcd84753522b3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f2b1f75-1fb1-4373-b7c5-5a4bf7b95076', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.685255', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad26366-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': '260fead6908e7b4d0bda6ef7b0cf4dd92a00a685a90371da966a8f5302b08435'}]}, 'timestamp': '2025-11-22 08:06:36.685510', '_unique_id': '36f7acc0f93d4108896cdd58641c9f9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.685 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.686 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '616508eb-ab03-434e-a00c-beb1645ea998', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'timestamp': '2025-11-22T08:06:36.687021', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2ad2a8a8-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.343093704, 'message_signature': 'b27e8df64f2ae1a024c66ae75661fa746351a92aafeca69c61e23c29f72f9b53'}]}, 'timestamp': '2025-11-22 08:06:36.687341', '_unique_id': 'c93536f796074139bf30275116f9727b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.688 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.689 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbd1098f-9723-4787-b37f-dba6fa4c9077', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.688749', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad2ec0a-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.347931983, 'message_signature': '9908060d32e7108336eacb03c92d3b06258fd0bb6b3a3746541dbf8e15c30744'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.688749', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad2f6b4-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.347931983, 'message_signature': '63569dd6f9ed92d144afc883e3a683c8896eec735eb69039c3a733807dd4f433'}]}, 'timestamp': '2025-11-22 08:06:36.689333', '_unique_id': 'd3e27d6f50804edc98c0e9b4859b5ca6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.690 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02089736-feb0-48e6-88c2-988890793481', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.690785', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad33aca-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': 'bc5d94060763f6721c43b0ca3357ce8a43c7ddaf1c2cf7dd64d01615ddf7a6e8'}]}, 'timestamp': '2025-11-22 08:06:36.691025', '_unique_id': '83deaa56f8034388bf0c4d55eee7b5ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '369847eb-1d69-4751-83a0-bc138607419b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': 'instance-0000006a-1fe5c7b8-8a9f-4b64-bb69-822a770565f6-tap480abb3e-15', 'timestamp': '2025-11-22T08:06:36.692144', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'tap480abb3e-15', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:87:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap480abb3e-15'}, 'message_id': '2ad370d0-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.29024823, 'message_signature': '8c695e3fc2baafd237aba4ba4e11e84b5c285adfdaf2f0de5050d9f6598af535'}]}, 'timestamp': '2025-11-22 08:06:36.692408', '_unique_id': '0b96a0e5dcc74efeadd6a55f2a2c4a8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.693 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.693 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.693 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1640453793>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-1640453793>]
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.693 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.693 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.write.latency volume: 38950711685 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d74713d-7706-4e65-8edc-e16e706f0dd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38950711685, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.693811', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad3b0e0-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': 'bfaae8cd469a99c834e851ae835d305d2c2721ea82f7eccd1c48d74d25a8f605'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.693811', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad3b96e-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': '0d8bd2b3c341f0829e13c2e90a78054734f59a68f36f4370c70795fd7710960e'}]}, 'timestamp': '2025-11-22 08:06:36.694252', '_unique_id': '4a8b539e66b6472c96d7da486806fac3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.695 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.read.requests volume: 877 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.695 12 DEBUG ceilometer.compute.pollsters [-] 1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.device.read.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b60d5839-8626-489b-8305-701552c30596', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 877, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-vda', 'timestamp': '2025-11-22T08:06:36.695506', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2ad3f366-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': 'b301691e58490f763e8d75d67b1dc47cd731483a1a7d3554dd375e8510c5cb68'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': 'b6cc24df1e344e369f2aff864f278268', 'user_name': None, 'project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'project_name': None, 'resource_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6-sda', 'timestamp': '2025-11-22T08:06:36.695506', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-1640453793', 'name': 'instance-0000006a', 'instance_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'instance_type': 'm1.nano', 'host': 'ac23f2f8fc2aa6166e6aba121c94ba9f76160a4d8ab407d829a284f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2ad3fcbc-c77a-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 5537.296870044, 'message_signature': 'd592f6f29c2e90646ae7bbd3d41bcb5d09df0f06741e87a1cdaf24a86997a8b6'}]}, 'timestamp': '2025-11-22 08:06:36.696001', '_unique_id': '56a80727b84f41a7be8f71fe6f69ce71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:06:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:06:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:37.336 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:37.336 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:37.337 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:38 compute-0 ovn_controller[94843]: 2025-11-22T08:06:38Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:87:85 10.100.0.11
Nov 22 08:06:38 compute-0 ovn_controller[94843]: 2025-11-22T08:06:38Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:87:85 10.100.0.11
Nov 22 08:06:40 compute-0 nova_compute[186544]: 2025-11-22 08:06:40.335 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:40 compute-0 nova_compute[186544]: 2025-11-22 08:06:40.786 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.290 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.494 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.495 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.536 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.663 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.663 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.671 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.671 186548 INFO nova.compute.claims [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.793 186548 DEBUG nova.compute.provider_tree [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.806 186548 DEBUG nova.scheduler.client.report [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.831 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.832 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.876 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.877 186548 DEBUG nova.network.neutron [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.897 186548 INFO nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:06:41 compute-0 nova_compute[186544]: 2025-11-22 08:06:41.928 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.045 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.046 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.046 186548 INFO nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Creating image(s)
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.047 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.047 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.048 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.061 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.080 186548 DEBUG nova.policy [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.120 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.121 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.121 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.134 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.193 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.194 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.252 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.253 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.254 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.308 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.309 186548 DEBUG nova.virt.disk.api [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.310 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.380 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.381 186548 DEBUG nova.virt.disk.api [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.381 186548 DEBUG nova.objects.instance [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.405 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:06:42 compute-0 podman[233305]: 2025-11-22 08:06:42.403797561 +0000 UTC m=+0.056431743 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.405 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Ensure instance console log exists: /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.406 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.406 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:42 compute-0 nova_compute[186544]: 2025-11-22 08:06:42.406 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:43 compute-0 nova_compute[186544]: 2025-11-22 08:06:43.497 186548 DEBUG nova.network.neutron [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Successfully created port: f66ac761-5b67-4bf3-98de-982febfd27c1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:06:44 compute-0 nova_compute[186544]: 2025-11-22 08:06:44.121 186548 DEBUG nova.network.neutron [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Successfully updated port: f66ac761-5b67-4bf3-98de-982febfd27c1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:06:44 compute-0 nova_compute[186544]: 2025-11-22 08:06:44.152 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:06:44 compute-0 nova_compute[186544]: 2025-11-22 08:06:44.152 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:06:44 compute-0 nova_compute[186544]: 2025-11-22 08:06:44.152 186548 DEBUG nova.network.neutron [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:06:44 compute-0 nova_compute[186544]: 2025-11-22 08:06:44.190 186548 DEBUG nova.compute.manager [req-8ed8bfbb-5d6b-4db7-bec8-28f146ae1007 req-90aadb27-c164-4449-9bb9-782c532f1dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-changed-f66ac761-5b67-4bf3-98de-982febfd27c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:06:44 compute-0 nova_compute[186544]: 2025-11-22 08:06:44.190 186548 DEBUG nova.compute.manager [req-8ed8bfbb-5d6b-4db7-bec8-28f146ae1007 req-90aadb27-c164-4449-9bb9-782c532f1dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Refreshing instance network info cache due to event network-changed-f66ac761-5b67-4bf3-98de-982febfd27c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:06:44 compute-0 nova_compute[186544]: 2025-11-22 08:06:44.191 186548 DEBUG oslo_concurrency.lockutils [req-8ed8bfbb-5d6b-4db7-bec8-28f146ae1007 req-90aadb27-c164-4449-9bb9-782c532f1dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:06:44 compute-0 nova_compute[186544]: 2025-11-22 08:06:44.299 186548 DEBUG nova.network.neutron [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:06:45 compute-0 nova_compute[186544]: 2025-11-22 08:06:45.788 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.023 186548 DEBUG nova.network.neutron [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updating instance_info_cache with network_info: [{"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.044 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.045 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Instance network_info: |[{"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.045 186548 DEBUG oslo_concurrency.lockutils [req-8ed8bfbb-5d6b-4db7-bec8-28f146ae1007 req-90aadb27-c164-4449-9bb9-782c532f1dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.046 186548 DEBUG nova.network.neutron [req-8ed8bfbb-5d6b-4db7-bec8-28f146ae1007 req-90aadb27-c164-4449-9bb9-782c532f1dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Refreshing network info cache for port f66ac761-5b67-4bf3-98de-982febfd27c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.049 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Start _get_guest_xml network_info=[{"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.056 186548 WARNING nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.071 186548 DEBUG nova.virt.libvirt.host [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.072 186548 DEBUG nova.virt.libvirt.host [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.076 186548 DEBUG nova.virt.libvirt.host [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.077 186548 DEBUG nova.virt.libvirt.host [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.078 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.078 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.078 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.078 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.079 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.079 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.079 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.079 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.079 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.080 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.080 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.080 186548 DEBUG nova.virt.hardware [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.083 186548 DEBUG nova.virt.libvirt.vif [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-695468086',display_name='tempest-TestGettingAddress-server-695468086',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-695468086',id=109,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPnaERasa+izdcdfyTuC1NZxdKIV3QYAmiEXJjMkASn0E1tv7r6vCMDrq3+5wI/5DgRhzrsGj9ouyKzyqBuAz+X8ag3n7AcCuRnJpHSdd9YGkwB1w6Z6YQ+SkW/64cPWQ==',key_name='tempest-TestGettingAddress-1450732548',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-niopkunw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:41Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.084 186548 DEBUG nova.network.os_vif_util [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.084 186548 DEBUG nova.network.os_vif_util [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:30:39,bridge_name='br-int',has_traffic_filtering=True,id=f66ac761-5b67-4bf3-98de-982febfd27c1,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66ac761-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.085 186548 DEBUG nova.objects.instance [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.098 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <uuid>c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762</uuid>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <name>instance-0000006d</name>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <nova:name>tempest-TestGettingAddress-server-695468086</nova:name>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:06:46</nova:creationTime>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:06:46 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:06:46 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:06:46 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:06:46 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:06:46 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:06:46 compute-0 nova_compute[186544]:         <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 08:06:46 compute-0 nova_compute[186544]:         <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:06:46 compute-0 nova_compute[186544]:         <nova:port uuid="f66ac761-5b67-4bf3-98de-982febfd27c1">
Nov 22 08:06:46 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe07:3039" ipVersion="6"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <system>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <entry name="serial">c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762</entry>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <entry name="uuid">c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762</entry>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </system>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <os>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   </os>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <features>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   </features>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk.config"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:07:30:39"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <target dev="tapf66ac761-5b"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/console.log" append="off"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <video>
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </video>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:06:46 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:06:46 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:06:46 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:06:46 compute-0 nova_compute[186544]: </domain>
Nov 22 08:06:46 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.099 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Preparing to wait for external event network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.100 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.100 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.100 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.101 186548 DEBUG nova.virt.libvirt.vif [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:06:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-695468086',display_name='tempest-TestGettingAddress-server-695468086',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-695468086',id=109,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPnaERasa+izdcdfyTuC1NZxdKIV3QYAmiEXJjMkASn0E1tv7r6vCMDrq3+5wI/5DgRhzrsGj9ouyKzyqBuAz+X8ag3n7AcCuRnJpHSdd9YGkwB1w6Z6YQ+SkW/64cPWQ==',key_name='tempest-TestGettingAddress-1450732548',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-niopkunw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:06:41Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.102 186548 DEBUG nova.network.os_vif_util [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.103 186548 DEBUG nova.network.os_vif_util [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:30:39,bridge_name='br-int',has_traffic_filtering=True,id=f66ac761-5b67-4bf3-98de-982febfd27c1,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66ac761-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.103 186548 DEBUG os_vif [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:30:39,bridge_name='br-int',has_traffic_filtering=True,id=f66ac761-5b67-4bf3-98de-982febfd27c1,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66ac761-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.104 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.104 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.105 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.109 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.110 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf66ac761-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.110 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf66ac761-5b, col_values=(('external_ids', {'iface-id': 'f66ac761-5b67-4bf3-98de-982febfd27c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:07:30:39', 'vm-uuid': 'c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:46 compute-0 NetworkManager[55036]: <info>  [1763798806.1133] manager: (tapf66ac761-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.115 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.119 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.120 186548 INFO os_vif [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:30:39,bridge_name='br-int',has_traffic_filtering=True,id=f66ac761-5b67-4bf3-98de-982febfd27c1,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66ac761-5b')
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.167 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.168 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.168 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:07:30:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.169 186548 INFO nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Using config drive
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.567 186548 INFO nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Creating config drive at /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk.config
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.573 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1zly4ffl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.698 186548 DEBUG oslo_concurrency.processutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1zly4ffl" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:46 compute-0 kernel: tapf66ac761-5b: entered promiscuous mode
Nov 22 08:06:46 compute-0 NetworkManager[55036]: <info>  [1763798806.7904] manager: (tapf66ac761-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/231)
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.795 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:46 compute-0 ovn_controller[94843]: 2025-11-22T08:06:46Z|00483|binding|INFO|Claiming lport f66ac761-5b67-4bf3-98de-982febfd27c1 for this chassis.
Nov 22 08:06:46 compute-0 ovn_controller[94843]: 2025-11-22T08:06:46Z|00484|binding|INFO|f66ac761-5b67-4bf3-98de-982febfd27c1: Claiming fa:16:3e:07:30:39 10.100.0.11 2001:db8::f816:3eff:fe07:3039
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.806 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:30:39 10.100.0.11 2001:db8::f816:3eff:fe07:3039'], port_security=['fa:16:3e:07:30:39 10.100.0.11 2001:db8::f816:3eff:fe07:3039'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8::f816:3eff:fe07:3039/64', 'neutron:device_id': 'c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '573b06fa-1b11-4261-bfd0-ca50fa18731b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9c41f1e-b11e-4868-a3a0-70214f7435c4, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=f66ac761-5b67-4bf3-98de-982febfd27c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:06:46 compute-0 ovn_controller[94843]: 2025-11-22T08:06:46Z|00485|binding|INFO|Setting lport f66ac761-5b67-4bf3-98de-982febfd27c1 ovn-installed in OVS
Nov 22 08:06:46 compute-0 ovn_controller[94843]: 2025-11-22T08:06:46Z|00486|binding|INFO|Setting lport f66ac761-5b67-4bf3-98de-982febfd27c1 up in Southbound
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.811 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.808 103805 INFO neutron.agent.ovn.metadata.agent [-] Port f66ac761-5b67-4bf3-98de-982febfd27c1 in datapath 90da6fca-65d1-4012-9602-d88842a0ad0e bound to our chassis
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.809 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90da6fca-65d1-4012-9602-d88842a0ad0e
Nov 22 08:06:46 compute-0 nova_compute[186544]: 2025-11-22 08:06:46.816 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.824 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e9964285-65ee-477a-8666-8d09a3a45cf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.825 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90da6fca-61 in ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.837 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90da6fca-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.837 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[26ba3f45-617a-43c5-a7a2-19f34b05ae18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.838 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1788ea47-a27a-4cbb-9c8b-82b3658d7b45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 systemd-udevd[233373]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.851 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[971e185a-94cb-4551-90a7-a7da93d835f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 systemd-machined[152872]: New machine qemu-63-instance-0000006d.
Nov 22 08:06:46 compute-0 NetworkManager[55036]: <info>  [1763798806.8641] device (tapf66ac761-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:06:46 compute-0 NetworkManager[55036]: <info>  [1763798806.8653] device (tapf66ac761-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.867 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca9a671-8063-4a6b-bd8e-201abbff9084]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 systemd[1]: Started Virtual Machine qemu-63-instance-0000006d.
Nov 22 08:06:46 compute-0 podman[233342]: 2025-11-22 08:06:46.895565369 +0000 UTC m=+0.113898400 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible)
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.896 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8f8730-6240-4f7a-8378-8bd2dbb56ea6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 NetworkManager[55036]: <info>  [1763798806.9069] manager: (tap90da6fca-60): new Veth device (/org/freedesktop/NetworkManager/Devices/232)
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.906 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[062c2012-ae8e-49aa-b212-dde2ee903565]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 podman[233341]: 2025-11-22 08:06:46.914349203 +0000 UTC m=+0.136244682 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.944 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4fe132-9787-432c-b3f5-1b4de7b21c5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.947 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d3a0e2-bab2-4ca1-aac9-1a32b4f44bf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 NetworkManager[55036]: <info>  [1763798806.9708] device (tap90da6fca-60): carrier: link connected
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.975 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[377905a5-3735-416d-9e01-e8b2d297f1cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:46 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:46.996 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe3216e-94e4-41f3-be30-49cfd9db2cbd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90da6fca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:28:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554760, 'reachable_time': 34568, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233423, 'error': None, 'target': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.012 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d82b3739-79dd-4e7d-b806-9a46cf532693]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1e:28f1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554760, 'tstamp': 554760}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233424, 'error': None, 'target': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.032 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bc62886d-b025-4ee0-8c3c-6a1f4b4409f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90da6fca-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:28:f1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554760, 'reachable_time': 34568, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233425, 'error': None, 'target': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.067 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b2668d1b-1537-449c-b478-b4f85c7ba11c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.137 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5b1111d4-6434-4eaf-8e85-a50c5d00d8c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.139 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90da6fca-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.139 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.140 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90da6fca-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:47 compute-0 NetworkManager[55036]: <info>  [1763798807.1430] manager: (tap90da6fca-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Nov 22 08:06:47 compute-0 kernel: tap90da6fca-60: entered promiscuous mode
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.146 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.148 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90da6fca-60, col_values=(('external_ids', {'iface-id': '0abd56a4-3e9e-4d28-8383-eadcda41744d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.149 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:47 compute-0 ovn_controller[94843]: 2025-11-22T08:06:47Z|00487|binding|INFO|Releasing lport 0abd56a4-3e9e-4d28-8383-eadcda41744d from this chassis (sb_readonly=0)
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.150 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.150 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90da6fca-65d1-4012-9602-d88842a0ad0e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90da6fca-65d1-4012-9602-d88842a0ad0e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.151 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[36fd62cc-0d37-4d5b-864a-37887fa78e23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.152 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-90da6fca-65d1-4012-9602-d88842a0ad0e
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/90da6fca-65d1-4012-9602-d88842a0ad0e.pid.haproxy
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 90da6fca-65d1-4012-9602-d88842a0ad0e
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:06:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:47.153 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'env', 'PROCESS_TAG=haproxy-90da6fca-65d1-4012-9602-d88842a0ad0e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90da6fca-65d1-4012-9602-d88842a0ad0e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.160 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.187 186548 DEBUG nova.compute.manager [req-b83afd2b-dc88-4461-8494-fcca88ca651c req-c238c678-908d-4ffe-aab1-2fd081b4f37a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.187 186548 DEBUG oslo_concurrency.lockutils [req-b83afd2b-dc88-4461-8494-fcca88ca651c req-c238c678-908d-4ffe-aab1-2fd081b4f37a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.188 186548 DEBUG oslo_concurrency.lockutils [req-b83afd2b-dc88-4461-8494-fcca88ca651c req-c238c678-908d-4ffe-aab1-2fd081b4f37a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.188 186548 DEBUG oslo_concurrency.lockutils [req-b83afd2b-dc88-4461-8494-fcca88ca651c req-c238c678-908d-4ffe-aab1-2fd081b4f37a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.189 186548 DEBUG nova.compute.manager [req-b83afd2b-dc88-4461-8494-fcca88ca651c req-c238c678-908d-4ffe-aab1-2fd081b4f37a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Processing event network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.258 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.259 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798807.2576761, c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.259 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] VM Started (Lifecycle Event)
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.263 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.266 186548 INFO nova.virt.libvirt.driver [-] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Instance spawned successfully.
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.267 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.283 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.290 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.294 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.295 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.295 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.296 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.296 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.297 186548 DEBUG nova.virt.libvirt.driver [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.318 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.319 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798807.2587175, c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.319 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] VM Paused (Lifecycle Event)
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.347 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.351 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798807.2634623, c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.351 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] VM Resumed (Lifecycle Event)
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.369 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.373 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.410 186548 INFO nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Took 5.36 seconds to spawn the instance on the hypervisor.
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.411 186548 DEBUG nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.415 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.484 186548 INFO nova.compute.manager [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Took 5.86 seconds to build instance.
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.497 186548 DEBUG oslo_concurrency.lockutils [None req-144be45f-4845-4a27-b9ae-f79bfc1875e4 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:47 compute-0 podman[233464]: 2025-11-22 08:06:47.618318513 +0000 UTC m=+0.110354891 container create 50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:06:47 compute-0 podman[233464]: 2025-11-22 08:06:47.535647975 +0000 UTC m=+0.027684373 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.644 186548 DEBUG nova.network.neutron [req-8ed8bfbb-5d6b-4db7-bec8-28f146ae1007 req-90aadb27-c164-4449-9bb9-782c532f1dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updated VIF entry in instance network info cache for port f66ac761-5b67-4bf3-98de-982febfd27c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.645 186548 DEBUG nova.network.neutron [req-8ed8bfbb-5d6b-4db7-bec8-28f146ae1007 req-90aadb27-c164-4449-9bb9-782c532f1dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updating instance_info_cache with network_info: [{"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:06:47 compute-0 systemd[1]: Started libpod-conmon-50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80.scope.
Nov 22 08:06:47 compute-0 nova_compute[186544]: 2025-11-22 08:06:47.670 186548 DEBUG oslo_concurrency.lockutils [req-8ed8bfbb-5d6b-4db7-bec8-28f146ae1007 req-90aadb27-c164-4449-9bb9-782c532f1dbf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:06:47 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:06:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88275896954d780476515c3897676c6f85015ac1fcb973bd62e076b55af84898/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:06:47 compute-0 podman[233464]: 2025-11-22 08:06:47.71346777 +0000 UTC m=+0.205504168 container init 50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 08:06:47 compute-0 podman[233464]: 2025-11-22 08:06:47.71914726 +0000 UTC m=+0.211183638 container start 50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:06:47 compute-0 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[233479]: [NOTICE]   (233483) : New worker (233485) forked
Nov 22 08:06:47 compute-0 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[233479]: [NOTICE]   (233483) : Loading success.
Nov 22 08:06:49 compute-0 nova_compute[186544]: 2025-11-22 08:06:49.283 186548 DEBUG nova.compute.manager [req-0860fcf3-3508-4c66-88c3-cab25188ba03 req-a91b80c2-b149-4114-8677-b17499f1af4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:06:49 compute-0 nova_compute[186544]: 2025-11-22 08:06:49.284 186548 DEBUG oslo_concurrency.lockutils [req-0860fcf3-3508-4c66-88c3-cab25188ba03 req-a91b80c2-b149-4114-8677-b17499f1af4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:49 compute-0 nova_compute[186544]: 2025-11-22 08:06:49.284 186548 DEBUG oslo_concurrency.lockutils [req-0860fcf3-3508-4c66-88c3-cab25188ba03 req-a91b80c2-b149-4114-8677-b17499f1af4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:49 compute-0 nova_compute[186544]: 2025-11-22 08:06:49.284 186548 DEBUG oslo_concurrency.lockutils [req-0860fcf3-3508-4c66-88c3-cab25188ba03 req-a91b80c2-b149-4114-8677-b17499f1af4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:49 compute-0 nova_compute[186544]: 2025-11-22 08:06:49.285 186548 DEBUG nova.compute.manager [req-0860fcf3-3508-4c66-88c3-cab25188ba03 req-a91b80c2-b149-4114-8677-b17499f1af4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] No waiting events found dispatching network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:06:49 compute-0 nova_compute[186544]: 2025-11-22 08:06:49.285 186548 WARNING nova.compute.manager [req-0860fcf3-3508-4c66-88c3-cab25188ba03 req-a91b80c2-b149-4114-8677-b17499f1af4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received unexpected event network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 for instance with vm_state active and task_state None.
Nov 22 08:06:50 compute-0 nova_compute[186544]: 2025-11-22 08:06:50.610 186548 DEBUG nova.compute.manager [req-a2d284d9-79cd-4ef0-9b26-2ea8f6b30c6a req-15fb616f-caa9-4cf1-8af0-53a5438b41bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-changed-f66ac761-5b67-4bf3-98de-982febfd27c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:06:50 compute-0 nova_compute[186544]: 2025-11-22 08:06:50.610 186548 DEBUG nova.compute.manager [req-a2d284d9-79cd-4ef0-9b26-2ea8f6b30c6a req-15fb616f-caa9-4cf1-8af0-53a5438b41bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Refreshing instance network info cache due to event network-changed-f66ac761-5b67-4bf3-98de-982febfd27c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:06:50 compute-0 nova_compute[186544]: 2025-11-22 08:06:50.611 186548 DEBUG oslo_concurrency.lockutils [req-a2d284d9-79cd-4ef0-9b26-2ea8f6b30c6a req-15fb616f-caa9-4cf1-8af0-53a5438b41bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:06:50 compute-0 nova_compute[186544]: 2025-11-22 08:06:50.611 186548 DEBUG oslo_concurrency.lockutils [req-a2d284d9-79cd-4ef0-9b26-2ea8f6b30c6a req-15fb616f-caa9-4cf1-8af0-53a5438b41bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:06:50 compute-0 nova_compute[186544]: 2025-11-22 08:06:50.611 186548 DEBUG nova.network.neutron [req-a2d284d9-79cd-4ef0-9b26-2ea8f6b30c6a req-15fb616f-caa9-4cf1-8af0-53a5438b41bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Refreshing network info cache for port f66ac761-5b67-4bf3-98de-982febfd27c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:06:50 compute-0 nova_compute[186544]: 2025-11-22 08:06:50.789 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:51 compute-0 nova_compute[186544]: 2025-11-22 08:06:51.113 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:51 compute-0 nova_compute[186544]: 2025-11-22 08:06:51.173 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.193 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.281 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.340 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.342 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.395 186548 DEBUG nova.network.neutron [req-a2d284d9-79cd-4ef0-9b26-2ea8f6b30c6a req-15fb616f-caa9-4cf1-8af0-53a5438b41bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updated VIF entry in instance network info cache for port f66ac761-5b67-4bf3-98de-982febfd27c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.396 186548 DEBUG nova.network.neutron [req-a2d284d9-79cd-4ef0-9b26-2ea8f6b30c6a req-15fb616f-caa9-4cf1-8af0-53a5438b41bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updating instance_info_cache with network_info: [{"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.416 186548 DEBUG oslo_concurrency.lockutils [req-a2d284d9-79cd-4ef0-9b26-2ea8f6b30c6a req-15fb616f-caa9-4cf1-8af0-53a5438b41bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.418 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.426 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.484 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.485 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.544 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.764 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.766 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5439MB free_disk=73.1745834350586GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.766 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.767 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.848 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.849 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.849 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.850 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.905 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.928 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.951 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:06:52 compute-0 nova_compute[186544]: 2025-11-22 08:06:52.952 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:06:54 compute-0 nova_compute[186544]: 2025-11-22 08:06:54.953 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:54 compute-0 nova_compute[186544]: 2025-11-22 08:06:54.953 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:06:54 compute-0 nova_compute[186544]: 2025-11-22 08:06:54.953 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:06:55 compute-0 nova_compute[186544]: 2025-11-22 08:06:55.310 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:06:55 compute-0 nova_compute[186544]: 2025-11-22 08:06:55.310 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:06:55 compute-0 nova_compute[186544]: 2025-11-22 08:06:55.311 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:06:55 compute-0 nova_compute[186544]: 2025-11-22 08:06:55.311 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:06:55 compute-0 nova_compute[186544]: 2025-11-22 08:06:55.791 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:56 compute-0 nova_compute[186544]: 2025-11-22 08:06:56.115 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:57 compute-0 nova_compute[186544]: 2025-11-22 08:06:57.714 186548 DEBUG oslo_concurrency.lockutils [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:06:57 compute-0 nova_compute[186544]: 2025-11-22 08:06:57.715 186548 DEBUG oslo_concurrency.lockutils [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:06:57 compute-0 nova_compute[186544]: 2025-11-22 08:06:57.715 186548 DEBUG nova.compute.manager [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:06:57 compute-0 nova_compute[186544]: 2025-11-22 08:06:57.722 186548 DEBUG nova.compute.manager [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Nov 22 08:06:57 compute-0 nova_compute[186544]: 2025-11-22 08:06:57.725 186548 DEBUG nova.objects.instance [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'flavor' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:06:57 compute-0 nova_compute[186544]: 2025-11-22 08:06:57.755 186548 DEBUG nova.objects.instance [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'info_cache' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:06:57 compute-0 nova_compute[186544]: 2025-11-22 08:06:57.786 186548 DEBUG nova.virt.libvirt.driver [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:06:58 compute-0 nova_compute[186544]: 2025-11-22 08:06:58.040 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updating instance_info_cache with network_info: [{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:06:58 compute-0 nova_compute[186544]: 2025-11-22 08:06:58.055 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:06:58 compute-0 nova_compute[186544]: 2025-11-22 08:06:58.055 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:06:58 compute-0 nova_compute[186544]: 2025-11-22 08:06:58.056 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:58 compute-0 nova_compute[186544]: 2025-11-22 08:06:58.056 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:58 compute-0 nova_compute[186544]: 2025-11-22 08:06:58.056 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:58 compute-0 nova_compute[186544]: 2025-11-22 08:06:58.057 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:58 compute-0 nova_compute[186544]: 2025-11-22 08:06:58.057 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:06:59 compute-0 nova_compute[186544]: 2025-11-22 08:06:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:06:59 compute-0 kernel: tap480abb3e-15 (unregistering): left promiscuous mode
Nov 22 08:06:59 compute-0 NetworkManager[55036]: <info>  [1763798819.9786] device (tap480abb3e-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:06:59 compute-0 nova_compute[186544]: 2025-11-22 08:06:59.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:06:59 compute-0 ovn_controller[94843]: 2025-11-22T08:06:59Z|00488|binding|INFO|Releasing lport 480abb3e-15cf-4910-b471-7667ee6c50c8 from this chassis (sb_readonly=0)
Nov 22 08:06:59 compute-0 ovn_controller[94843]: 2025-11-22T08:06:59Z|00489|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 down in Southbound
Nov 22 08:06:59 compute-0 ovn_controller[94843]: 2025-11-22T08:06:59Z|00490|binding|INFO|Removing iface tap480abb3e-15 ovn-installed in OVS
Nov 22 08:06:59 compute-0 nova_compute[186544]: 2025-11-22 08:06:59.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:06:59.999 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:87:85 10.100.0.11'], port_security=['fa:16:3e:d0:87:85 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=480abb3e-15cf-4910-b471-7667ee6c50c8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.001 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 480abb3e-15cf-4910-b471-7667ee6c50c8 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.002 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.003 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[990b5e4b-a68e-4ca3-9b98-694aed82ae02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.003 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.018 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:00 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Nov 22 08:07:00 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000006a.scope: Consumed 15.871s CPU time.
Nov 22 08:07:00 compute-0 systemd-machined[152872]: Machine qemu-62-instance-0000006a terminated.
Nov 22 08:07:00 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233177]: [NOTICE]   (233181) : haproxy version is 2.8.14-c23fe91
Nov 22 08:07:00 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233177]: [NOTICE]   (233181) : path to executable is /usr/sbin/haproxy
Nov 22 08:07:00 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233177]: [WARNING]  (233181) : Exiting Master process...
Nov 22 08:07:00 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233177]: [ALERT]    (233181) : Current worker (233183) exited with code 143 (Terminated)
Nov 22 08:07:00 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233177]: [WARNING]  (233181) : All workers exited. Exiting... (0)
Nov 22 08:07:00 compute-0 systemd[1]: libpod-162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe.scope: Deactivated successfully.
Nov 22 08:07:00 compute-0 conmon[233177]: conmon 162d76850fddfd295ec6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe.scope/container/memory.events
Nov 22 08:07:00 compute-0 podman[233531]: 2025-11-22 08:07:00.136197893 +0000 UTC m=+0.046243811 container died 162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:07:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe-userdata-shm.mount: Deactivated successfully.
Nov 22 08:07:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-38f5f74cb44622d3e39dce251e1a53ebee23dd6195313051a58786b1cfc3118c-merged.mount: Deactivated successfully.
Nov 22 08:07:00 compute-0 podman[233531]: 2025-11-22 08:07:00.188006301 +0000 UTC m=+0.098052209 container cleanup 162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:07:00 compute-0 systemd[1]: libpod-conmon-162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe.scope: Deactivated successfully.
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.227 186548 DEBUG nova.compute.manager [req-8fd21e26-83da-410f-ba0f-4e870cb1a737 req-bd50bcc6-c715-4b37-8029-7da62b979adb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.227 186548 DEBUG oslo_concurrency.lockutils [req-8fd21e26-83da-410f-ba0f-4e870cb1a737 req-bd50bcc6-c715-4b37-8029-7da62b979adb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.228 186548 DEBUG oslo_concurrency.lockutils [req-8fd21e26-83da-410f-ba0f-4e870cb1a737 req-bd50bcc6-c715-4b37-8029-7da62b979adb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.228 186548 DEBUG oslo_concurrency.lockutils [req-8fd21e26-83da-410f-ba0f-4e870cb1a737 req-bd50bcc6-c715-4b37-8029-7da62b979adb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.228 186548 DEBUG nova.compute.manager [req-8fd21e26-83da-410f-ba0f-4e870cb1a737 req-bd50bcc6-c715-4b37-8029-7da62b979adb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.228 186548 WARNING nova.compute.manager [req-8fd21e26-83da-410f-ba0f-4e870cb1a737 req-bd50bcc6-c715-4b37-8029-7da62b979adb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state active and task_state powering-off.
Nov 22 08:07:00 compute-0 podman[233562]: 2025-11-22 08:07:00.255920465 +0000 UTC m=+0.044604721 container remove 162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.260 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd71e20-1dac-4053-8cbc-2838da598f96]: (4, ('Sat Nov 22 08:07:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe)\n162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe\nSat Nov 22 08:07:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe)\n162d76850fddfd295ec6cfc7852d33517305bab18572764e3ac45524db6377fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.262 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6f25596d-47df-4c9c-88fd-e852fe379ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.263 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:00 compute-0 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.265 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.281 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.284 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b9630a34-9eab-4b54-ae5f-650b410cc51c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.299 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[723f44e5-3186-4b01-ab3b-b53677654aff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.301 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e5da1f-0b58-4e72-b796-42d88ec9d407]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.315 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5693e0-9254-4228-a361-fe801c4fa726]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552279, 'reachable_time': 19465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233594, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.318 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:07:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:00.318 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[08c951e2-2b74-4660-8e44-24de893972a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.794 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.802 186548 INFO nova.virt.libvirt.driver [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance shutdown successfully after 3 seconds.
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.810 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance destroyed successfully.
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.810 186548 DEBUG nova.objects.instance [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.820 186548 DEBUG nova.compute.manager [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:00 compute-0 nova_compute[186544]: 2025-11-22 08:07:00.888 186548 DEBUG oslo_concurrency.lockutils [None req-2709bdff-4e3b-4c09-aed4-f2fe452c14bd b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:01 compute-0 nova_compute[186544]: 2025-11-22 08:07:01.117 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:01 compute-0 nova_compute[186544]: 2025-11-22 08:07:01.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:02 compute-0 podman[233615]: 2025-11-22 08:07:02.419706981 +0000 UTC m=+0.058628537 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 08:07:02 compute-0 podman[233616]: 2025-11-22 08:07:02.435803727 +0000 UTC m=+0.073149154 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.522 186548 DEBUG nova.compute.manager [req-82622182-9092-46cb-9444-50d0745a7bea req-ad39bd4f-0a9b-4cb7-9356-8398f02a35fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.522 186548 DEBUG oslo_concurrency.lockutils [req-82622182-9092-46cb-9444-50d0745a7bea req-ad39bd4f-0a9b-4cb7-9356-8398f02a35fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.523 186548 DEBUG oslo_concurrency.lockutils [req-82622182-9092-46cb-9444-50d0745a7bea req-ad39bd4f-0a9b-4cb7-9356-8398f02a35fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.523 186548 DEBUG oslo_concurrency.lockutils [req-82622182-9092-46cb-9444-50d0745a7bea req-ad39bd4f-0a9b-4cb7-9356-8398f02a35fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.523 186548 DEBUG nova.compute.manager [req-82622182-9092-46cb-9444-50d0745a7bea req-ad39bd4f-0a9b-4cb7-9356-8398f02a35fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.523 186548 WARNING nova.compute.manager [req-82622182-9092-46cb-9444-50d0745a7bea req-ad39bd4f-0a9b-4cb7-9356-8398f02a35fc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state stopped and task_state None.
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.592 186548 DEBUG nova.objects.instance [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'flavor' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.610 186548 DEBUG nova.objects.instance [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'info_cache' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.629 186548 DEBUG oslo_concurrency.lockutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.629 186548 DEBUG oslo_concurrency.lockutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:07:02 compute-0 nova_compute[186544]: 2025-11-22 08:07:02.630 186548 DEBUG nova.network.neutron [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:07:03 compute-0 ovn_controller[94843]: 2025-11-22T08:07:03Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:07:30:39 10.100.0.11
Nov 22 08:07:03 compute-0 ovn_controller[94843]: 2025-11-22T08:07:03Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:07:30:39 10.100.0.11
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.527 186548 DEBUG nova.network.neutron [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updating instance_info_cache with network_info: [{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.541 186548 DEBUG oslo_concurrency.lockutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.561 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance destroyed successfully.
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.561 186548 DEBUG nova.objects.instance [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.573 186548 DEBUG nova.objects.instance [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.584 186548 DEBUG nova.virt.libvirt.vif [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1640453793',display_name='tempest-ServerActionsTestJSON-server-1640453793',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1640453793',id=106,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:06:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-tvm9f4y6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=1fe5c7b8-8a9f-4b64-bb69-822a770565f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.584 186548 DEBUG nova.network.os_vif_util [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.585 186548 DEBUG nova.network.os_vif_util [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.585 186548 DEBUG os_vif [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.587 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.588 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap480abb3e-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.589 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.590 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.592 186548 INFO os_vif [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15')
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.597 186548 DEBUG nova.virt.libvirt.driver [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Start _get_guest_xml network_info=[{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.601 186548 WARNING nova.virt.libvirt.driver [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.608 186548 DEBUG nova.virt.libvirt.host [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.609 186548 DEBUG nova.virt.libvirt.host [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.613 186548 DEBUG nova.virt.libvirt.host [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.614 186548 DEBUG nova.virt.libvirt.host [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.615 186548 DEBUG nova.virt.libvirt.driver [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.615 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.616 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.616 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.616 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.616 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.617 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.617 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.617 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.617 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.618 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.618 186548 DEBUG nova.virt.hardware [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.618 186548 DEBUG nova.objects.instance [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.635 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.700 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.701 186548 DEBUG oslo_concurrency.lockutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.701 186548 DEBUG oslo_concurrency.lockutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.702 186548 DEBUG oslo_concurrency.lockutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.703 186548 DEBUG nova.virt.libvirt.vif [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1640453793',display_name='tempest-ServerActionsTestJSON-server-1640453793',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1640453793',id=106,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:06:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-tvm9f4y6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=1fe5c7b8-8a9f-4b64-bb69-822a770565f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.704 186548 DEBUG nova.network.os_vif_util [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.705 186548 DEBUG nova.network.os_vif_util [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.706 186548 DEBUG nova.objects.instance [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.718 186548 DEBUG nova.virt.libvirt.driver [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <uuid>1fe5c7b8-8a9f-4b64-bb69-822a770565f6</uuid>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <name>instance-0000006a</name>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerActionsTestJSON-server-1640453793</nova:name>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:07:04</nova:creationTime>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:07:04 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:07:04 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:07:04 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:07:04 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:07:04 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:07:04 compute-0 nova_compute[186544]:         <nova:user uuid="b6cc24df1e344e369f2aff864f278268">tempest-ServerActionsTestJSON-1104477664-project-member</nova:user>
Nov 22 08:07:04 compute-0 nova_compute[186544]:         <nova:project uuid="ac6b78572b7d4aedaf745e5e6ba1d360">tempest-ServerActionsTestJSON-1104477664</nova:project>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:07:04 compute-0 nova_compute[186544]:         <nova:port uuid="480abb3e-15cf-4910-b471-7667ee6c50c8">
Nov 22 08:07:04 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <system>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <entry name="serial">1fe5c7b8-8a9f-4b64-bb69-822a770565f6</entry>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <entry name="uuid">1fe5c7b8-8a9f-4b64-bb69-822a770565f6</entry>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </system>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <os>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   </os>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <features>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   </features>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk.config"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d0:87:85"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <target dev="tap480abb3e-15"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/console.log" append="off"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <video>
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </video>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:07:04 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:07:04 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:07:04 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:07:04 compute-0 nova_compute[186544]: </domain>
Nov 22 08:07:04 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.720 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.777 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.778 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.835 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.837 186548 DEBUG nova.objects.instance [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.849 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.907 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.908 186548 DEBUG nova.virt.disk.api [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Checking if we can resize image /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.909 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.968 186548 DEBUG oslo_concurrency.processutils [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.969 186548 DEBUG nova.virt.disk.api [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Cannot resize image /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.970 186548 DEBUG nova.objects.instance [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'migration_context' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.992 186548 DEBUG nova.virt.libvirt.vif [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1640453793',display_name='tempest-ServerActionsTestJSON-server-1640453793',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1640453793',id=106,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:06:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-tvm9f4y6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=1fe5c7b8-8a9f-4b64-bb69-822a770565f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.993 186548 DEBUG nova.network.os_vif_util [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.994 186548 DEBUG nova.network.os_vif_util [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.994 186548 DEBUG os_vif [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.995 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.996 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.998 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.998 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap480abb3e-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:04 compute-0 nova_compute[186544]: 2025-11-22 08:07:04.998 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap480abb3e-15, col_values=(('external_ids', {'iface-id': '480abb3e-15cf-4910-b471-7667ee6c50c8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:87:85', 'vm-uuid': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.000 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 NetworkManager[55036]: <info>  [1763798825.0009] manager: (tap480abb3e-15): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.003 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.006 186548 INFO os_vif [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15')
Nov 22 08:07:05 compute-0 kernel: tap480abb3e-15: entered promiscuous mode
Nov 22 08:07:05 compute-0 NetworkManager[55036]: <info>  [1763798825.0921] manager: (tap480abb3e-15): new Tun device (/org/freedesktop/NetworkManager/Devices/235)
Nov 22 08:07:05 compute-0 ovn_controller[94843]: 2025-11-22T08:07:05Z|00491|binding|INFO|Claiming lport 480abb3e-15cf-4910-b471-7667ee6c50c8 for this chassis.
Nov 22 08:07:05 compute-0 ovn_controller[94843]: 2025-11-22T08:07:05Z|00492|binding|INFO|480abb3e-15cf-4910-b471-7667ee6c50c8: Claiming fa:16:3e:d0:87:85 10.100.0.11
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.093 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.100 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:87:85 10.100.0.11'], port_security=['fa:16:3e:d0:87:85 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=480abb3e-15cf-4910-b471-7667ee6c50c8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.101 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 480abb3e-15cf-4910-b471-7667ee6c50c8 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.103 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:07:05 compute-0 ovn_controller[94843]: 2025-11-22T08:07:05Z|00493|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 ovn-installed in OVS
Nov 22 08:07:05 compute-0 ovn_controller[94843]: 2025-11-22T08:07:05Z|00494|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 up in Southbound
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.120 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[79c70529-dd06-4da8-8854-8871e1c7d374]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.122 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:07:05 compute-0 podman[233678]: 2025-11-22 08:07:05.12501443 +0000 UTC m=+0.080479167 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 08:07:05 compute-0 podman[233679]: 2025-11-22 08:07:05.12588376 +0000 UTC m=+0.080756672 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.125 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.125 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[76c9440f-ce6f-4ef5-b42d-0272cd56e0fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 systemd-udevd[233732]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.127 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[086609c6-3b9c-434f-8764-bbd7c30a5a9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 systemd-machined[152872]: New machine qemu-64-instance-0000006a.
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.140 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6d4207-8f74-4db5-a541-8489831b1a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 NetworkManager[55036]: <info>  [1763798825.1456] device (tap480abb3e-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:07:05 compute-0 NetworkManager[55036]: <info>  [1763798825.1466] device (tap480abb3e-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:07:05 compute-0 systemd[1]: Started Virtual Machine qemu-64-instance-0000006a.
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.166 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[82c70c08-acb3-4c08-9224-6375e531a561]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.200 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[51374504-5834-47f0-89d4-cb88e693e5b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.207 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ae692ea2-7bec-4ddd-a485-e6b46d68e250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 NetworkManager[55036]: <info>  [1763798825.2088] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/236)
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.240 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc1544f-a90e-48b7-b7c5-6736756a876b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.244 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9a6ef21f-fe40-41a2-8d5a-c555d42259a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 NetworkManager[55036]: <info>  [1763798825.2677] device (tap165f7f23-d0): carrier: link connected
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.271 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f19b1bde-430d-45e0-ba1a-f56fb9b20498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.290 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[48b62502-e453-4713-91bd-62debe7cd9ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556590, 'reachable_time': 17476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233766, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.305 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[184ca03e-52b8-4ac8-9f75-42d22999e08c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556590, 'tstamp': 556590}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233767, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.322 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cdbff4c5-178a-4bed-9e0b-332f3ec0c3dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556590, 'reachable_time': 17476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233768, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.340 186548 DEBUG nova.compute.manager [req-d79076bf-c688-4030-a58a-f6204d92f5f2 req-09015597-b6ae-48fd-a4d6-de59afd30c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.341 186548 DEBUG oslo_concurrency.lockutils [req-d79076bf-c688-4030-a58a-f6204d92f5f2 req-09015597-b6ae-48fd-a4d6-de59afd30c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.341 186548 DEBUG oslo_concurrency.lockutils [req-d79076bf-c688-4030-a58a-f6204d92f5f2 req-09015597-b6ae-48fd-a4d6-de59afd30c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.341 186548 DEBUG oslo_concurrency.lockutils [req-d79076bf-c688-4030-a58a-f6204d92f5f2 req-09015597-b6ae-48fd-a4d6-de59afd30c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.342 186548 DEBUG nova.compute.manager [req-d79076bf-c688-4030-a58a-f6204d92f5f2 req-09015597-b6ae-48fd-a4d6-de59afd30c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.342 186548 WARNING nova.compute.manager [req-d79076bf-c688-4030-a58a-f6204d92f5f2 req-09015597-b6ae-48fd-a4d6-de59afd30c41 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state stopped and task_state powering-on.
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.360 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[10c391c9-35ab-41f5-b273-3f3745f6aa8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.431 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ce7a8f-7154-4ab2-b332-73cb24982e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.432 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.432 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.433 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:05 compute-0 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.434 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 NetworkManager[55036]: <info>  [1763798825.4379] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.441 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.444 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.446 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 ovn_controller[94843]: 2025-11-22T08:07:05Z|00495|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.450 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.451 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[67d3768a-3288-4fc2-b9df-0814b353b6e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.452 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:07:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:05.453 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.460 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.796 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:05 compute-0 podman[233800]: 2025-11-22 08:07:05.82488611 +0000 UTC m=+0.022100216 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:07:05 compute-0 podman[233800]: 2025-11-22 08:07:05.972621923 +0000 UTC m=+0.169836009 container create a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.990 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.991 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798825.9905024, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.991 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Resumed (Lifecycle Event)
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.994 186548 DEBUG nova.compute.manager [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.997 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance rebooted successfully.
Nov 22 08:07:05 compute-0 nova_compute[186544]: 2025-11-22 08:07:05.997 186548 DEBUG nova.compute.manager [None req-f1a5a799-bb66-424e-9890-962259ce7146 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:06 compute-0 systemd[1]: Started libpod-conmon-a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097.scope.
Nov 22 08:07:06 compute-0 nova_compute[186544]: 2025-11-22 08:07:06.025 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:06 compute-0 nova_compute[186544]: 2025-11-22 08:07:06.029 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:07:06 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:07:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6611f5bc4142f5cdddba6ab02f485c8d52bb16ae8a203af476b1bdf354eaa98/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:07:06 compute-0 nova_compute[186544]: 2025-11-22 08:07:06.055 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] During sync_power_state the instance has a pending task (powering-on). Skip.
Nov 22 08:07:06 compute-0 nova_compute[186544]: 2025-11-22 08:07:06.056 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798825.9914012, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:06 compute-0 nova_compute[186544]: 2025-11-22 08:07:06.056 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Started (Lifecycle Event)
Nov 22 08:07:06 compute-0 podman[233800]: 2025-11-22 08:07:06.07102106 +0000 UTC m=+0.268235166 container init a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:07:06 compute-0 podman[233800]: 2025-11-22 08:07:06.077309525 +0000 UTC m=+0.274523611 container start a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:07:06 compute-0 nova_compute[186544]: 2025-11-22 08:07:06.084 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:06 compute-0 nova_compute[186544]: 2025-11-22 08:07:06.089 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:07:06 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233822]: [NOTICE]   (233826) : New worker (233828) forked
Nov 22 08:07:06 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233822]: [NOTICE]   (233826) : Loading success.
Nov 22 08:07:07 compute-0 nova_compute[186544]: 2025-11-22 08:07:07.409 186548 DEBUG nova.compute.manager [req-374cd231-7486-4edd-a893-35b03b335d19 req-790af02a-4cb3-4298-866b-c0da0729b5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:07 compute-0 nova_compute[186544]: 2025-11-22 08:07:07.409 186548 DEBUG oslo_concurrency.lockutils [req-374cd231-7486-4edd-a893-35b03b335d19 req-790af02a-4cb3-4298-866b-c0da0729b5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:07 compute-0 nova_compute[186544]: 2025-11-22 08:07:07.410 186548 DEBUG oslo_concurrency.lockutils [req-374cd231-7486-4edd-a893-35b03b335d19 req-790af02a-4cb3-4298-866b-c0da0729b5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:07 compute-0 nova_compute[186544]: 2025-11-22 08:07:07.410 186548 DEBUG oslo_concurrency.lockutils [req-374cd231-7486-4edd-a893-35b03b335d19 req-790af02a-4cb3-4298-866b-c0da0729b5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:07 compute-0 nova_compute[186544]: 2025-11-22 08:07:07.410 186548 DEBUG nova.compute.manager [req-374cd231-7486-4edd-a893-35b03b335d19 req-790af02a-4cb3-4298-866b-c0da0729b5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:07 compute-0 nova_compute[186544]: 2025-11-22 08:07:07.410 186548 WARNING nova.compute.manager [req-374cd231-7486-4edd-a893-35b03b335d19 req-790af02a-4cb3-4298-866b-c0da0729b5c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state active and task_state None.
Nov 22 08:07:09 compute-0 nova_compute[186544]: 2025-11-22 08:07:09.676 186548 DEBUG nova.objects.instance [None req-995dc96a-106b-4884-b065-36cf1aa02f3e b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:09 compute-0 nova_compute[186544]: 2025-11-22 08:07:09.694 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798829.6939378, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:09 compute-0 nova_compute[186544]: 2025-11-22 08:07:09.694 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Paused (Lifecycle Event)
Nov 22 08:07:09 compute-0 nova_compute[186544]: 2025-11-22 08:07:09.708 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:09 compute-0 nova_compute[186544]: 2025-11-22 08:07:09.711 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:07:09 compute-0 nova_compute[186544]: 2025-11-22 08:07:09.732 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.001 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:10 compute-0 kernel: tap480abb3e-15 (unregistering): left promiscuous mode
Nov 22 08:07:10 compute-0 NetworkManager[55036]: <info>  [1763798830.3839] device (tap480abb3e-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.392 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:10 compute-0 ovn_controller[94843]: 2025-11-22T08:07:10Z|00496|binding|INFO|Releasing lport 480abb3e-15cf-4910-b471-7667ee6c50c8 from this chassis (sb_readonly=0)
Nov 22 08:07:10 compute-0 ovn_controller[94843]: 2025-11-22T08:07:10Z|00497|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 down in Southbound
Nov 22 08:07:10 compute-0 ovn_controller[94843]: 2025-11-22T08:07:10Z|00498|binding|INFO|Removing iface tap480abb3e-15 ovn-installed in OVS
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.396 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.403 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:87:85 10.100.0.11'], port_security=['fa:16:3e:d0:87:85 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=480abb3e-15cf-4910-b471-7667ee6c50c8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.405 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 480abb3e-15cf-4910-b471-7667ee6c50c8 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.407 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.408 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c65224f2-7195-4bd9-ad65-3b28613c2bd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.408 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.409 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore
Nov 22 08:07:10 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Nov 22 08:07:10 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000006a.scope: Consumed 4.756s CPU time.
Nov 22 08:07:10 compute-0 systemd-machined[152872]: Machine qemu-64-instance-0000006a terminated.
Nov 22 08:07:10 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233822]: [NOTICE]   (233826) : haproxy version is 2.8.14-c23fe91
Nov 22 08:07:10 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233822]: [NOTICE]   (233826) : path to executable is /usr/sbin/haproxy
Nov 22 08:07:10 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233822]: [WARNING]  (233826) : Exiting Master process...
Nov 22 08:07:10 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233822]: [ALERT]    (233826) : Current worker (233828) exited with code 143 (Terminated)
Nov 22 08:07:10 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[233822]: [WARNING]  (233826) : All workers exited. Exiting... (0)
Nov 22 08:07:10 compute-0 systemd[1]: libpod-a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097.scope: Deactivated successfully.
Nov 22 08:07:10 compute-0 podman[233864]: 2025-11-22 08:07:10.54748279 +0000 UTC m=+0.055474449 container died a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.585 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.588 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097-userdata-shm.mount: Deactivated successfully.
Nov 22 08:07:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-c6611f5bc4142f5cdddba6ab02f485c8d52bb16ae8a203af476b1bdf354eaa98-merged.mount: Deactivated successfully.
Nov 22 08:07:10 compute-0 podman[233864]: 2025-11-22 08:07:10.618275176 +0000 UTC m=+0.126266825 container cleanup a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:07:10 compute-0 systemd[1]: libpod-conmon-a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097.scope: Deactivated successfully.
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.629 186548 DEBUG nova.compute.manager [None req-995dc96a-106b-4884-b065-36cf1aa02f3e b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:10 compute-0 podman[233907]: 2025-11-22 08:07:10.697733936 +0000 UTC m=+0.055989842 container remove a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.703 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[612c0c7e-3718-4fbd-995c-10ab86a4e4ea]: (4, ('Sat Nov 22 08:07:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097)\na18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097\nSat Nov 22 08:07:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (a18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097)\na18acd9be5c5f864ebc5a57d2b5a46c5cf2c2d50017dee5c6bd295ec4bc33097\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.705 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc05e66-8010-4e5e-b908-493f30cee8f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.706 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.709 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:10 compute-0 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.727 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.727 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6e057e94-6ef9-4832-8aaa-0d17a1762868]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.747 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3b98e80a-25db-4747-a5fb-09569de5e581]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.748 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e1467256-52f0-4514-bf1c-eb3f47d4b868]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.764 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d28662-0c05-447a-8a70-c0002ebdd108]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556583, 'reachable_time': 28711, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233925, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.767 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:07:10 compute-0 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 08:07:10 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:10.767 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc565f0-c8f9-4f05-881d-2608c0ba1fd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:10 compute-0 nova_compute[186544]: 2025-11-22 08:07:10.799 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:11 compute-0 nova_compute[186544]: 2025-11-22 08:07:11.713 186548 INFO nova.compute.manager [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Resuming
Nov 22 08:07:11 compute-0 nova_compute[186544]: 2025-11-22 08:07:11.714 186548 DEBUG nova.objects.instance [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'flavor' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:11 compute-0 nova_compute[186544]: 2025-11-22 08:07:11.757 186548 DEBUG oslo_concurrency.lockutils [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:07:11 compute-0 nova_compute[186544]: 2025-11-22 08:07:11.757 186548 DEBUG oslo_concurrency.lockutils [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquired lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:07:11 compute-0 nova_compute[186544]: 2025-11-22 08:07:11.758 186548 DEBUG nova.network.neutron [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.091 186548 DEBUG nova.compute.manager [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.092 186548 DEBUG oslo_concurrency.lockutils [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.092 186548 DEBUG oslo_concurrency.lockutils [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.093 186548 DEBUG oslo_concurrency.lockutils [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.093 186548 DEBUG nova.compute.manager [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.093 186548 WARNING nova.compute.manager [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state suspended and task_state resuming.
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.093 186548 DEBUG nova.compute.manager [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.093 186548 DEBUG oslo_concurrency.lockutils [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.094 186548 DEBUG oslo_concurrency.lockutils [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.094 186548 DEBUG oslo_concurrency.lockutils [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.094 186548 DEBUG nova.compute.manager [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:12 compute-0 nova_compute[186544]: 2025-11-22 08:07:12.094 186548 WARNING nova.compute.manager [req-b553ccc5-8db2-412b-8937-235d362eaeff req-13afc51c-40c5-45a3-af60-0a9f01a37e7e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state suspended and task_state resuming.
Nov 22 08:07:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:13.389 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:07:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:13.390 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:07:13 compute-0 nova_compute[186544]: 2025-11-22 08:07:13.392 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:13 compute-0 podman[233926]: 2025-11-22 08:07:13.415284917 +0000 UTC m=+0.061990589 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.052 186548 DEBUG nova.network.neutron [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updating instance_info_cache with network_info: [{"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.069 186548 DEBUG oslo_concurrency.lockutils [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Releasing lock "refresh_cache-1fe5c7b8-8a9f-4b64-bb69-822a770565f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.076 186548 DEBUG nova.virt.libvirt.vif [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1640453793',display_name='tempest-ServerActionsTestJSON-server-1640453793',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1640453793',id=106,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:06:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-tvm9f4y6',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=1fe5c7b8-8a9f-4b64-bb69-822a770565f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.077 186548 DEBUG nova.network.os_vif_util [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.078 186548 DEBUG nova.network.os_vif_util [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.078 186548 DEBUG os_vif [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.079 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.079 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.079 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.083 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.084 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap480abb3e-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.084 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap480abb3e-15, col_values=(('external_ids', {'iface-id': '480abb3e-15cf-4910-b471-7667ee6c50c8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:87:85', 'vm-uuid': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.085 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.085 186548 INFO os_vif [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15')
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.109 186548 DEBUG nova.objects.instance [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:14 compute-0 kernel: tap480abb3e-15: entered promiscuous mode
Nov 22 08:07:14 compute-0 NetworkManager[55036]: <info>  [1763798834.1739] manager: (tap480abb3e-15): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.175 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:14 compute-0 ovn_controller[94843]: 2025-11-22T08:07:14Z|00499|binding|INFO|Claiming lport 480abb3e-15cf-4910-b471-7667ee6c50c8 for this chassis.
Nov 22 08:07:14 compute-0 ovn_controller[94843]: 2025-11-22T08:07:14Z|00500|binding|INFO|480abb3e-15cf-4910-b471-7667ee6c50c8: Claiming fa:16:3e:d0:87:85 10.100.0.11
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.184 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:87:85 10.100.0.11'], port_security=['fa:16:3e:d0:87:85 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=480abb3e-15cf-4910-b471-7667ee6c50c8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.186 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 480abb3e-15cf-4910-b471-7667ee6c50c8 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 bound to our chassis
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.188 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:07:14 compute-0 ovn_controller[94843]: 2025-11-22T08:07:14Z|00501|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 ovn-installed in OVS
Nov 22 08:07:14 compute-0 ovn_controller[94843]: 2025-11-22T08:07:14Z|00502|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 up in Southbound
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.198 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cfeb2c0e-e30b-4b58-bfe0-f944d417a4bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.198 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap165f7f23-d1 in ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.200 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap165f7f23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.200 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[50aed30a-39e9-4f50-a7ff-7dafd6fe2b7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.200 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5f7d9003-d2ff-4c7f-97a1-46cdcc5b9596]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 systemd-udevd[233962]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.211 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[b96fa4b9-59c2-4732-acc7-caae2e460fb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 systemd-machined[152872]: New machine qemu-65-instance-0000006a.
Nov 22 08:07:14 compute-0 NetworkManager[55036]: <info>  [1763798834.2162] device (tap480abb3e-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:07:14 compute-0 NetworkManager[55036]: <info>  [1763798834.2173] device (tap480abb3e-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:07:14 compute-0 systemd[1]: Started Virtual Machine qemu-65-instance-0000006a.
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.225 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[82cfda4b-6b06-4857-95c7-eb09167a806e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.254 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d88ff13f-f982-4c19-91f4-817588a4ae43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.260 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8edde2-b362-416c-9607-73fb74dde730]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 NetworkManager[55036]: <info>  [1763798834.2617] manager: (tap165f7f23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/239)
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.294 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ea643e95-95cf-41a8-9430-b4f5ee8274e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.297 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ed943fff-e14d-47b5-bd9b-1cf5a7876e18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 NetworkManager[55036]: <info>  [1763798834.3163] device (tap165f7f23-d0): carrier: link connected
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.320 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c08be7b0-4161-4cce-99c1-d3db3d4099fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.334 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9248e53d-f105-4060-8e86-d553d6b8e295]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557495, 'reachable_time': 44292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233995, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.350 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9991d768-1edc-45c7-a9e5-ad82e7627d6f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cc98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557495, 'tstamp': 557495}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233996, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.364 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a00ae0-5274-4490-bc2a-145a72e2bae6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap165f7f23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cc:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557495, 'reachable_time': 44292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233997, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.393 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[85dc4bde-ee2b-40b1-8b6e-c92a739133cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.444 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb2cf65-8ce5-4a32-9127-e8ba718f7bde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.445 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.445 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.446 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap165f7f23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.448 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:14 compute-0 kernel: tap165f7f23-d0: entered promiscuous mode
Nov 22 08:07:14 compute-0 NetworkManager[55036]: <info>  [1763798834.4491] manager: (tap165f7f23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.451 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap165f7f23-d0, col_values=(('external_ids', {'iface-id': '966efbe2-6c09-40dc-9351-4f58f2542b31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:14 compute-0 ovn_controller[94843]: 2025-11-22T08:07:14Z|00503|binding|INFO|Releasing lport 966efbe2-6c09-40dc-9351-4f58f2542b31 from this chassis (sb_readonly=0)
Nov 22 08:07:14 compute-0 nova_compute[186544]: 2025-11-22 08:07:14.465 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.466 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.467 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9e74699f-f110-4ffa-b5e7-42d81cfc7f6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.467 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/165f7f23-d3c9-4f49-8a34-4fd7222ad518.pid.haproxy
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 165f7f23-d3c9-4f49-8a34-4fd7222ad518
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:07:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:14.468 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'env', 'PROCESS_TAG=haproxy-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/165f7f23-d3c9-4f49-8a34-4fd7222ad518.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:07:14 compute-0 podman[234029]: 2025-11-22 08:07:14.830260823 +0000 UTC m=+0.053851428 container create 5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 08:07:14 compute-0 systemd[1]: Started libpod-conmon-5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc.scope.
Nov 22 08:07:14 compute-0 podman[234029]: 2025-11-22 08:07:14.798339367 +0000 UTC m=+0.021929992 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:07:14 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:07:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bcb864acf86e4259b6e3e1123ce39ea7f2e6418c4904e2d90351ee85582ad69b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:07:14 compute-0 podman[234029]: 2025-11-22 08:07:14.926456986 +0000 UTC m=+0.150047631 container init 5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:07:14 compute-0 podman[234029]: 2025-11-22 08:07:14.932621239 +0000 UTC m=+0.156211844 container start 5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 08:07:14 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[234044]: [NOTICE]   (234048) : New worker (234050) forked
Nov 22 08:07:14 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[234044]: [NOTICE]   (234048) : Loading success.
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.446 186548 DEBUG nova.compute.manager [req-b2c59059-e759-4bb0-8e4e-5a3b8397316d req-72d3b579-0d06-4553-b7d2-c28ef6dc3eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.446 186548 DEBUG oslo_concurrency.lockutils [req-b2c59059-e759-4bb0-8e4e-5a3b8397316d req-72d3b579-0d06-4553-b7d2-c28ef6dc3eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.447 186548 DEBUG oslo_concurrency.lockutils [req-b2c59059-e759-4bb0-8e4e-5a3b8397316d req-72d3b579-0d06-4553-b7d2-c28ef6dc3eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.447 186548 DEBUG oslo_concurrency.lockutils [req-b2c59059-e759-4bb0-8e4e-5a3b8397316d req-72d3b579-0d06-4553-b7d2-c28ef6dc3eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.447 186548 DEBUG nova.compute.manager [req-b2c59059-e759-4bb0-8e4e-5a3b8397316d req-72d3b579-0d06-4553-b7d2-c28ef6dc3eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.448 186548 WARNING nova.compute.manager [req-b2c59059-e759-4bb0-8e4e-5a3b8397316d req-72d3b579-0d06-4553-b7d2-c28ef6dc3eef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state suspended and task_state resuming.
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.800 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.822 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.823 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798835.8223567, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.823 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Started (Lifecycle Event)
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.845 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.853 186548 DEBUG nova.compute.manager [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.853 186548 DEBUG nova.objects.instance [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.858 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.893 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.893 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798835.8389683, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.894 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Resumed (Lifecycle Event)
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.897 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance running successfully.
Nov 22 08:07:15 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.899 186548 DEBUG nova.virt.libvirt.guest [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.899 186548 DEBUG nova.compute.manager [None req-f68a05ad-3df0-4741-8338-67b5e246ba30 b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.960 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:15 compute-0 nova_compute[186544]: 2025-11-22 08:07:15.964 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:07:16 compute-0 nova_compute[186544]: 2025-11-22 08:07:16.003 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 22 08:07:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:16.392 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:17 compute-0 podman[234066]: 2025-11-22 08:07:17.418988117 +0000 UTC m=+0.051039909 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:07:17 compute-0 podman[234067]: 2025-11-22 08:07:17.425457907 +0000 UTC m=+0.059008646 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41)
Nov 22 08:07:17 compute-0 nova_compute[186544]: 2025-11-22 08:07:17.552 186548 DEBUG nova.compute.manager [req-8a12f48d-3e64-4069-b0fb-ec450c1a3b90 req-4475066e-3d1b-4b03-b865-af3b373990cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:17 compute-0 nova_compute[186544]: 2025-11-22 08:07:17.553 186548 DEBUG oslo_concurrency.lockutils [req-8a12f48d-3e64-4069-b0fb-ec450c1a3b90 req-4475066e-3d1b-4b03-b865-af3b373990cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:17 compute-0 nova_compute[186544]: 2025-11-22 08:07:17.553 186548 DEBUG oslo_concurrency.lockutils [req-8a12f48d-3e64-4069-b0fb-ec450c1a3b90 req-4475066e-3d1b-4b03-b865-af3b373990cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:17 compute-0 nova_compute[186544]: 2025-11-22 08:07:17.553 186548 DEBUG oslo_concurrency.lockutils [req-8a12f48d-3e64-4069-b0fb-ec450c1a3b90 req-4475066e-3d1b-4b03-b865-af3b373990cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:17 compute-0 nova_compute[186544]: 2025-11-22 08:07:17.553 186548 DEBUG nova.compute.manager [req-8a12f48d-3e64-4069-b0fb-ec450c1a3b90 req-4475066e-3d1b-4b03-b865-af3b373990cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:17 compute-0 nova_compute[186544]: 2025-11-22 08:07:17.554 186548 WARNING nova.compute.manager [req-8a12f48d-3e64-4069-b0fb-ec450c1a3b90 req-4475066e-3d1b-4b03-b865-af3b373990cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state active and task_state None.
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.296 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.297 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.297 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.297 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.298 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.304 186548 INFO nova.compute.manager [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Terminating instance
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.309 186548 DEBUG nova.compute.manager [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:07:18 compute-0 kernel: tap480abb3e-15 (unregistering): left promiscuous mode
Nov 22 08:07:18 compute-0 NetworkManager[55036]: <info>  [1763798838.3363] device (tap480abb3e-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.348 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:18 compute-0 ovn_controller[94843]: 2025-11-22T08:07:18Z|00504|binding|INFO|Releasing lport 480abb3e-15cf-4910-b471-7667ee6c50c8 from this chassis (sb_readonly=0)
Nov 22 08:07:18 compute-0 ovn_controller[94843]: 2025-11-22T08:07:18Z|00505|binding|INFO|Setting lport 480abb3e-15cf-4910-b471-7667ee6c50c8 down in Southbound
Nov 22 08:07:18 compute-0 ovn_controller[94843]: 2025-11-22T08:07:18Z|00506|binding|INFO|Removing iface tap480abb3e-15 ovn-installed in OVS
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.351 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:18.359 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:87:85 10.100.0.11'], port_security=['fa:16:3e:d0:87:85 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fe5c7b8-8a9f-4b64-bb69-822a770565f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac6b78572b7d4aedaf745e5e6ba1d360', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f75b5f45-3232-42aa-a8f2-594f0428a6f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.204', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46c39a3-69e8-4fb9-a300-4c114a0c0910, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=480abb3e-15cf-4910-b471-7667ee6c50c8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:07:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:18.360 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 480abb3e-15cf-4910-b471-7667ee6c50c8 in datapath 165f7f23-d3c9-4f49-8a34-4fd7222ad518 unbound from our chassis
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.361 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:18.362 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 165f7f23-d3c9-4f49-8a34-4fd7222ad518, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:07:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:18.363 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6bce6f-d3b3-4bfd-af78-d3148a44bfdd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:18.363 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 namespace which is not needed anymore
Nov 22 08:07:18 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Nov 22 08:07:18 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000006a.scope: Consumed 4.028s CPU time.
Nov 22 08:07:18 compute-0 systemd-machined[152872]: Machine qemu-65-instance-0000006a terminated.
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.530 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.535 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.566 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Instance destroyed successfully.
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.569 186548 DEBUG nova.objects.instance [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lazy-loading 'resources' on Instance uuid 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:18 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[234044]: [NOTICE]   (234048) : haproxy version is 2.8.14-c23fe91
Nov 22 08:07:18 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[234044]: [NOTICE]   (234048) : path to executable is /usr/sbin/haproxy
Nov 22 08:07:18 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[234044]: [WARNING]  (234048) : Exiting Master process...
Nov 22 08:07:18 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[234044]: [ALERT]    (234048) : Current worker (234050) exited with code 143 (Terminated)
Nov 22 08:07:18 compute-0 neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518[234044]: [WARNING]  (234048) : All workers exited. Exiting... (0)
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.585 186548 DEBUG nova.virt.libvirt.vif [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1640453793',display_name='tempest-ServerActionsTestJSON-server-1640453793',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1640453793',id=106,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDyh5RRpb7qDHgAc9H+oNwOI/lxx0x2a7uhOXIX+Er9GoVqnK9B1X3kTc/PIYUbBPjQjhoPfQeu2jPU9pyeFHD6mBTSbq1gvJNECPvummRKdXnVokvmyleOZmFdoGP/ZQ==',key_name='tempest-keypair-1877507320',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:06:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac6b78572b7d4aedaf745e5e6ba1d360',ramdisk_id='',reservation_id='r-tvm9f4y6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1104477664',owner_user_name='tempest-ServerActionsTestJSON-1104477664-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b6cc24df1e344e369f2aff864f278268',uuid=1fe5c7b8-8a9f-4b64-bb69-822a770565f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.586 186548 DEBUG nova.network.os_vif_util [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converting VIF {"id": "480abb3e-15cf-4910-b471-7667ee6c50c8", "address": "fa:16:3e:d0:87:85", "network": {"id": "165f7f23-d3c9-4f49-8a34-4fd7222ad518", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1436581558-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac6b78572b7d4aedaf745e5e6ba1d360", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap480abb3e-15", "ovs_interfaceid": "480abb3e-15cf-4910-b471-7667ee6c50c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.586 186548 DEBUG nova.network.os_vif_util [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:07:18 compute-0 systemd[1]: libpod-5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc.scope: Deactivated successfully.
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.586 186548 DEBUG os_vif [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.589 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.589 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap480abb3e-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.592 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:18 compute-0 podman[234131]: 2025-11-22 08:07:18.594260093 +0000 UTC m=+0.137766379 container died 5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.595 186548 INFO os_vif [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:87:85,bridge_name='br-int',has_traffic_filtering=True,id=480abb3e-15cf-4910-b471-7667ee6c50c8,network=Network(165f7f23-d3c9-4f49-8a34-4fd7222ad518),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap480abb3e-15')
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.595 186548 INFO nova.virt.libvirt.driver [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Deleting instance files /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6_del
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.596 186548 INFO nova.virt.libvirt.driver [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Deletion of /var/lib/nova/instances/1fe5c7b8-8a9f-4b64-bb69-822a770565f6_del complete
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.662 186548 INFO nova.compute.manager [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.662 186548 DEBUG oslo.service.loopingcall [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.663 186548 DEBUG nova.compute.manager [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:07:18 compute-0 nova_compute[186544]: 2025-11-22 08:07:18.663 186548 DEBUG nova.network.neutron [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:07:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc-userdata-shm.mount: Deactivated successfully.
Nov 22 08:07:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-bcb864acf86e4259b6e3e1123ce39ea7f2e6418c4904e2d90351ee85582ad69b-merged.mount: Deactivated successfully.
Nov 22 08:07:19 compute-0 podman[234131]: 2025-11-22 08:07:19.410686398 +0000 UTC m=+0.954192674 container cleanup 5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:07:19 compute-0 systemd[1]: libpod-conmon-5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc.scope: Deactivated successfully.
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.725 186548 DEBUG nova.compute.manager [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.725 186548 DEBUG oslo_concurrency.lockutils [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.726 186548 DEBUG oslo_concurrency.lockutils [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.726 186548 DEBUG oslo_concurrency.lockutils [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.726 186548 DEBUG nova.compute.manager [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.726 186548 DEBUG nova.compute.manager [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-unplugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.733 186548 DEBUG nova.compute.manager [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.734 186548 DEBUG oslo_concurrency.lockutils [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.734 186548 DEBUG oslo_concurrency.lockutils [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.734 186548 DEBUG oslo_concurrency.lockutils [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.734 186548 DEBUG nova.compute.manager [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] No waiting events found dispatching network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.735 186548 WARNING nova.compute.manager [req-7e7ab762-71d4-4393-a126-ae99c231e61b req-42f6befb-f12e-4fbe-a495-b162cfcabb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received unexpected event network-vif-plugged-480abb3e-15cf-4910-b471-7667ee6c50c8 for instance with vm_state active and task_state deleting.
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.749 186548 DEBUG nova.network.neutron [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.770 186548 INFO nova.compute.manager [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Took 1.11 seconds to deallocate network for instance.
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.837 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.837 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:19 compute-0 podman[234174]: 2025-11-22 08:07:19.939416757 +0000 UTC m=+0.507250431 container remove 5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:07:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:19.947 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9262ef5d-ff03-477b-a6aa-d5d1819ff22f]: (4, ('Sat Nov 22 08:07:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc)\n5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc\nSat Nov 22 08:07:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 (5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc)\n5be691b9816fe4dac21c26a6ac63c1ea7b6ed21e19b805f24ca2aab50dfa11bc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:19.950 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a48f57e6-f529-4a74-a1c7-fe230a77f065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:19.951 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap165f7f23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.954 186548 DEBUG nova.compute.provider_tree [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:07:19 compute-0 kernel: tap165f7f23-d0: left promiscuous mode
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.956 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.966 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.967 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:19.969 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7a782a-4066-49bd-a7aa-bf1187d50ffc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.973 186548 DEBUG nova.scheduler.client.report [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:07:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:19.984 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[de18ccf8-b60d-4ee3-8f1f-4ef9fc6c250f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:19.985 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[13286a1d-bd2e-4f5d-be66-c63dd47fbcfd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:19 compute-0 nova_compute[186544]: 2025-11-22 08:07:19.996 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:20.000 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[09a21615-b7da-4349-a0dc-ef8028ed0915]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557488, 'reachable_time': 21967, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234190, 'error': None, 'target': 'ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:20.002 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-165f7f23-d3c9-4f49-8a34-4fd7222ad518 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:07:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:20.002 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[09e24773-24df-49a5-b9c6-db1d965ffa5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d165f7f23\x2dd3c9\x2d4f49\x2d8a34\x2d4fd7222ad518.mount: Deactivated successfully.
Nov 22 08:07:20 compute-0 nova_compute[186544]: 2025-11-22 08:07:20.031 186548 INFO nova.scheduler.client.report [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Deleted allocations for instance 1fe5c7b8-8a9f-4b64-bb69-822a770565f6
Nov 22 08:07:20 compute-0 nova_compute[186544]: 2025-11-22 08:07:20.056 186548 DEBUG nova.compute.manager [req-9976fd20-e005-4dd5-aff9-f6f5b12f1b73 req-1d083358-02da-4563-be97-ac5db050159b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Received event network-vif-deleted-480abb3e-15cf-4910-b471-7667ee6c50c8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:20 compute-0 nova_compute[186544]: 2025-11-22 08:07:20.089 186548 DEBUG oslo_concurrency.lockutils [None req-745e0059-9871-4a30-8485-457adb1e861d b6cc24df1e344e369f2aff864f278268 ac6b78572b7d4aedaf745e5e6ba1d360 - - default default] Lock "1fe5c7b8-8a9f-4b64-bb69-822a770565f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.792s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:20 compute-0 nova_compute[186544]: 2025-11-22 08:07:20.802 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:23 compute-0 nova_compute[186544]: 2025-11-22 08:07:23.592 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:24 compute-0 ovn_controller[94843]: 2025-11-22T08:07:24Z|00507|binding|INFO|Releasing lport 0abd56a4-3e9e-4d28-8383-eadcda41744d from this chassis (sb_readonly=0)
Nov 22 08:07:24 compute-0 nova_compute[186544]: 2025-11-22 08:07:24.399 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:25 compute-0 nova_compute[186544]: 2025-11-22 08:07:25.803 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:26 compute-0 ovn_controller[94843]: 2025-11-22T08:07:26Z|00508|binding|INFO|Releasing lport 0abd56a4-3e9e-4d28-8383-eadcda41744d from this chassis (sb_readonly=0)
Nov 22 08:07:26 compute-0 nova_compute[186544]: 2025-11-22 08:07:26.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:28 compute-0 nova_compute[186544]: 2025-11-22 08:07:28.596 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:30 compute-0 nova_compute[186544]: 2025-11-22 08:07:30.805 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:33 compute-0 podman[234192]: 2025-11-22 08:07:33.410219779 +0000 UTC m=+0.056222538 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:07:33 compute-0 podman[234193]: 2025-11-22 08:07:33.464443156 +0000 UTC m=+0.109852600 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 22 08:07:33 compute-0 nova_compute[186544]: 2025-11-22 08:07:33.565 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798838.5650225, 1fe5c7b8-8a9f-4b64-bb69-822a770565f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:33 compute-0 nova_compute[186544]: 2025-11-22 08:07:33.566 186548 INFO nova.compute.manager [-] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] VM Stopped (Lifecycle Event)
Nov 22 08:07:33 compute-0 nova_compute[186544]: 2025-11-22 08:07:33.583 186548 DEBUG nova.compute.manager [None req-0b10903c-2ebd-40d2-a140-c1736ea61d4c - - - - - -] [instance: 1fe5c7b8-8a9f-4b64-bb69-822a770565f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:33 compute-0 nova_compute[186544]: 2025-11-22 08:07:33.597 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:35 compute-0 podman[234238]: 2025-11-22 08:07:35.403019907 +0000 UTC m=+0.051577583 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:07:35 compute-0 podman[234239]: 2025-11-22 08:07:35.428103956 +0000 UTC m=+0.074131229 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:07:35 compute-0 nova_compute[186544]: 2025-11-22 08:07:35.807 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:35 compute-0 nova_compute[186544]: 2025-11-22 08:07:35.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:37 compute-0 nova_compute[186544]: 2025-11-22 08:07:37.208 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:37.336 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:37.337 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:37.337 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:38 compute-0 nova_compute[186544]: 2025-11-22 08:07:38.599 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:39 compute-0 nova_compute[186544]: 2025-11-22 08:07:39.661 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:40 compute-0 nova_compute[186544]: 2025-11-22 08:07:40.808 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:43 compute-0 nova_compute[186544]: 2025-11-22 08:07:43.602 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:44 compute-0 podman[234278]: 2025-11-22 08:07:44.416893589 +0000 UTC m=+0.063750383 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true)
Nov 22 08:07:45 compute-0 nova_compute[186544]: 2025-11-22 08:07:45.809 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:47 compute-0 nova_compute[186544]: 2025-11-22 08:07:47.057 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:48 compute-0 podman[234298]: 2025-11-22 08:07:48.403014297 +0000 UTC m=+0.048104008 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:07:48 compute-0 podman[234299]: 2025-11-22 08:07:48.414056868 +0000 UTC m=+0.055332755 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible)
Nov 22 08:07:48 compute-0 nova_compute[186544]: 2025-11-22 08:07:48.604 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:50 compute-0 nova_compute[186544]: 2025-11-22 08:07:50.811 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:51 compute-0 nova_compute[186544]: 2025-11-22 08:07:51.970 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "197a8113-f798-4088-a43c-06c87c60411d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:51 compute-0 nova_compute[186544]: 2025-11-22 08:07:51.971 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:51 compute-0 nova_compute[186544]: 2025-11-22 08:07:51.991 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.112 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.113 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.118 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.119 186548 INFO nova.compute.claims [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.272 186548 DEBUG nova.compute.provider_tree [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.284 186548 DEBUG nova.scheduler.client.report [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.308 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.309 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.419 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.419 186548 DEBUG nova.network.neutron [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.473 186548 INFO nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.505 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.637 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.638 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.639 186548 INFO nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Creating image(s)
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.639 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "/var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.640 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "/var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.640 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "/var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.652 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.711 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.712 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.713 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.728 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.784 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.785 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.985 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk 1073741824" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.987 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:52 compute-0 nova_compute[186544]: 2025-11-22 08:07:52.987 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.042 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.043 186548 DEBUG nova.virt.disk.api [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Checking if we can resize image /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.044 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.109 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.110 186548 DEBUG nova.virt.disk.api [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Cannot resize image /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.110 186548 DEBUG nova.objects.instance [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lazy-loading 'migration_context' on Instance uuid 197a8113-f798-4088-a43c-06c87c60411d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.123 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.124 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Ensure instance console log exists: /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.124 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.125 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.125 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.133 186548 DEBUG nova.policy [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c6751e9b0adc4800b9bc3c06303a22fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a38780a3686c492ea46910996253a08a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.183 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.183 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.183 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.183 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.256 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.323 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.324 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.385 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.578 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.579 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5555MB free_disk=73.17563247680664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.580 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.580 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.668 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.669 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 197a8113-f798-4088-a43c-06c87c60411d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.669 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.669 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.737 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.749 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.769 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:07:53 compute-0 nova_compute[186544]: 2025-11-22 08:07:53.769 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:54 compute-0 nova_compute[186544]: 2025-11-22 08:07:54.579 186548 DEBUG nova.network.neutron [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Successfully created port: b6e26a0a-e1d6-4b22-a957-9272225fdd13 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:07:54 compute-0 nova_compute[186544]: 2025-11-22 08:07:54.770 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:54 compute-0 nova_compute[186544]: 2025-11-22 08:07:54.771 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:07:55 compute-0 nova_compute[186544]: 2025-11-22 08:07:55.115 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:07:55 compute-0 nova_compute[186544]: 2025-11-22 08:07:55.116 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:07:55 compute-0 nova_compute[186544]: 2025-11-22 08:07:55.116 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:07:55 compute-0 nova_compute[186544]: 2025-11-22 08:07:55.812 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:56 compute-0 nova_compute[186544]: 2025-11-22 08:07:56.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.057 186548 DEBUG nova.network.neutron [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Successfully updated port: b6e26a0a-e1d6-4b22-a957-9272225fdd13 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.075 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "refresh_cache-197a8113-f798-4088-a43c-06c87c60411d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.075 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquired lock "refresh_cache-197a8113-f798-4088-a43c-06c87c60411d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.076 186548 DEBUG nova.network.neutron [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.420 186548 DEBUG nova.network.neutron [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.812 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updating instance_info_cache with network_info: [{"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.843 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.843 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.843 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.844 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:57 compute-0 nova_compute[186544]: 2025-11-22 08:07:57.844 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.418 186548 DEBUG nova.network.neutron [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Updating instance_info_cache with network_info: [{"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.437 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Releasing lock "refresh_cache-197a8113-f798-4088-a43c-06c87c60411d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.438 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Instance network_info: |[{"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.440 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Start _get_guest_xml network_info=[{"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.444 186548 WARNING nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.449 186548 DEBUG nova.virt.libvirt.host [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.449 186548 DEBUG nova.virt.libvirt.host [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.452 186548 DEBUG nova.virt.libvirt.host [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.453 186548 DEBUG nova.virt.libvirt.host [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.455 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.455 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.455 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.456 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.456 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.456 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.456 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.457 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.457 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.457 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.457 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.458 186548 DEBUG nova.virt.hardware [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.461 186548 DEBUG nova.virt.libvirt.vif [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:07:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-2082170142',display_name='tempest-ServerAddressesTestJSON-server-2082170142',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-2082170142',id=113,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a38780a3686c492ea46910996253a08a',ramdisk_id='',reservation_id='r-3i7z3rjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1745652147',owner_user_name='tempest-ServerAddress
esTestJSON-1745652147-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:52Z,user_data=None,user_id='c6751e9b0adc4800b9bc3c06303a22fa',uuid=197a8113-f798-4088-a43c-06c87c60411d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.461 186548 DEBUG nova.network.os_vif_util [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Converting VIF {"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.461 186548 DEBUG nova.network.os_vif_util [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:63,bridge_name='br-int',has_traffic_filtering=True,id=b6e26a0a-e1d6-4b22-a957-9272225fdd13,network=Network(370b86d5-40da-430f-aa64-a9124b2ca55b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6e26a0a-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.462 186548 DEBUG nova.objects.instance [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lazy-loading 'pci_devices' on Instance uuid 197a8113-f798-4088-a43c-06c87c60411d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.479 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <uuid>197a8113-f798-4088-a43c-06c87c60411d</uuid>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <name>instance-00000071</name>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerAddressesTestJSON-server-2082170142</nova:name>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:07:58</nova:creationTime>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:07:58 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:07:58 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:07:58 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:07:58 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:07:58 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:07:58 compute-0 nova_compute[186544]:         <nova:user uuid="c6751e9b0adc4800b9bc3c06303a22fa">tempest-ServerAddressesTestJSON-1745652147-project-member</nova:user>
Nov 22 08:07:58 compute-0 nova_compute[186544]:         <nova:project uuid="a38780a3686c492ea46910996253a08a">tempest-ServerAddressesTestJSON-1745652147</nova:project>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:07:58 compute-0 nova_compute[186544]:         <nova:port uuid="b6e26a0a-e1d6-4b22-a957-9272225fdd13">
Nov 22 08:07:58 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <system>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <entry name="serial">197a8113-f798-4088-a43c-06c87c60411d</entry>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <entry name="uuid">197a8113-f798-4088-a43c-06c87c60411d</entry>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </system>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <os>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   </os>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <features>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   </features>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk.config"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:54:7d:63"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <target dev="tapb6e26a0a-e1"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/console.log" append="off"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <video>
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </video>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:07:58 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:07:58 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:07:58 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:07:58 compute-0 nova_compute[186544]: </domain>
Nov 22 08:07:58 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.480 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Preparing to wait for external event network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.480 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "197a8113-f798-4088-a43c-06c87c60411d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.480 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.481 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.481 186548 DEBUG nova.virt.libvirt.vif [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:07:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-2082170142',display_name='tempest-ServerAddressesTestJSON-server-2082170142',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-2082170142',id=113,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a38780a3686c492ea46910996253a08a',ramdisk_id='',reservation_id='r-3i7z3rjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1745652147',owner_user_name='tempest-Ser
verAddressesTestJSON-1745652147-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:07:52Z,user_data=None,user_id='c6751e9b0adc4800b9bc3c06303a22fa',uuid=197a8113-f798-4088-a43c-06c87c60411d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.481 186548 DEBUG nova.network.os_vif_util [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Converting VIF {"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.482 186548 DEBUG nova.network.os_vif_util [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:63,bridge_name='br-int',has_traffic_filtering=True,id=b6e26a0a-e1d6-4b22-a957-9272225fdd13,network=Network(370b86d5-40da-430f-aa64-a9124b2ca55b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6e26a0a-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.482 186548 DEBUG os_vif [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:63,bridge_name='br-int',has_traffic_filtering=True,id=b6e26a0a-e1d6-4b22-a957-9272225fdd13,network=Network(370b86d5-40da-430f-aa64-a9124b2ca55b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6e26a0a-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.483 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.483 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.483 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.486 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.486 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb6e26a0a-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.486 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb6e26a0a-e1, col_values=(('external_ids', {'iface-id': 'b6e26a0a-e1d6-4b22-a957-9272225fdd13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:7d:63', 'vm-uuid': '197a8113-f798-4088-a43c-06c87c60411d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.487 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:58 compute-0 NetworkManager[55036]: <info>  [1763798878.4888] manager: (tapb6e26a0a-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.490 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.495 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.496 186548 INFO os_vif [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:63,bridge_name='br-int',has_traffic_filtering=True,id=b6e26a0a-e1d6-4b22-a957-9272225fdd13,network=Network(370b86d5-40da-430f-aa64-a9124b2ca55b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6e26a0a-e1')
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.520 186548 DEBUG nova.compute.manager [req-7a887c4f-5396-41dc-b7d0-6157b937295d req-e588904a-afa1-48ac-923c-75c4d048591e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received event network-changed-b6e26a0a-e1d6-4b22-a957-9272225fdd13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.520 186548 DEBUG nova.compute.manager [req-7a887c4f-5396-41dc-b7d0-6157b937295d req-e588904a-afa1-48ac-923c-75c4d048591e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Refreshing instance network info cache due to event network-changed-b6e26a0a-e1d6-4b22-a957-9272225fdd13. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.521 186548 DEBUG oslo_concurrency.lockutils [req-7a887c4f-5396-41dc-b7d0-6157b937295d req-e588904a-afa1-48ac-923c-75c4d048591e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-197a8113-f798-4088-a43c-06c87c60411d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.521 186548 DEBUG oslo_concurrency.lockutils [req-7a887c4f-5396-41dc-b7d0-6157b937295d req-e588904a-afa1-48ac-923c-75c4d048591e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-197a8113-f798-4088-a43c-06c87c60411d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.521 186548 DEBUG nova.network.neutron [req-7a887c4f-5396-41dc-b7d0-6157b937295d req-e588904a-afa1-48ac-923c-75c4d048591e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Refreshing network info cache for port b6e26a0a-e1d6-4b22-a957-9272225fdd13 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.544 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.544 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.544 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] No VIF found with MAC fa:16:3e:54:7d:63, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.545 186548 INFO nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Using config drive
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.900 186548 INFO nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Creating config drive at /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk.config
Nov 22 08:07:58 compute-0 nova_compute[186544]: 2025-11-22 08:07:58.905 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2g78y110 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.043 186548 DEBUG oslo_concurrency.processutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2g78y110" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:07:59 compute-0 kernel: tapb6e26a0a-e1: entered promiscuous mode
Nov 22 08:07:59 compute-0 NetworkManager[55036]: <info>  [1763798879.0993] manager: (tapb6e26a0a-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Nov 22 08:07:59 compute-0 ovn_controller[94843]: 2025-11-22T08:07:59Z|00509|binding|INFO|Claiming lport b6e26a0a-e1d6-4b22-a957-9272225fdd13 for this chassis.
Nov 22 08:07:59 compute-0 ovn_controller[94843]: 2025-11-22T08:07:59Z|00510|binding|INFO|b6e26a0a-e1d6-4b22-a957-9272225fdd13: Claiming fa:16:3e:54:7d:63 10.100.0.10
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.100 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.107 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:7d:63 10.100.0.10'], port_security=['fa:16:3e:54:7d:63 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '197a8113-f798-4088-a43c-06c87c60411d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-370b86d5-40da-430f-aa64-a9124b2ca55b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a38780a3686c492ea46910996253a08a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '89136c99-0a5a-4ed2-8c7f-2c8142459fb7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8ab5813-7e7d-4f9d-b2f4-efd563c10fa6, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b6e26a0a-e1d6-4b22-a957-9272225fdd13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.109 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b6e26a0a-e1d6-4b22-a957-9272225fdd13 in datapath 370b86d5-40da-430f-aa64-a9124b2ca55b bound to our chassis
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.110 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 370b86d5-40da-430f-aa64-a9124b2ca55b
Nov 22 08:07:59 compute-0 ovn_controller[94843]: 2025-11-22T08:07:59Z|00511|binding|INFO|Setting lport b6e26a0a-e1d6-4b22-a957-9272225fdd13 ovn-installed in OVS
Nov 22 08:07:59 compute-0 ovn_controller[94843]: 2025-11-22T08:07:59Z|00512|binding|INFO|Setting lport b6e26a0a-e1d6-4b22-a957-9272225fdd13 up in Southbound
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.123 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d34ed720-a36e-4268-81e2-187c13627a93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.124 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap370b86d5-41 in ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.129 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap370b86d5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.129 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd20c31-e369-457f-9909-b65a8cbdd7eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.130 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f20cb816-e284-418e-9b4b-b715adbf42f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 systemd-udevd[234381]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:07:59 compute-0 systemd-machined[152872]: New machine qemu-66-instance-00000071.
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.142 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[6089439b-4b71-4912-8427-c0ed928f997f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 NetworkManager[55036]: <info>  [1763798879.1445] device (tapb6e26a0a-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:07:59 compute-0 NetworkManager[55036]: <info>  [1763798879.1453] device (tapb6e26a0a-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.155 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[49661135-0e49-416c-9ce0-cd9f10c675c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 systemd[1]: Started Virtual Machine qemu-66-instance-00000071.
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.190 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6912421d-e4e7-4cca-9e4c-6584ff858dbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 systemd-udevd[234385]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:07:59 compute-0 NetworkManager[55036]: <info>  [1763798879.1983] manager: (tap370b86d5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.196 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5bac2dba-7df7-4c07-be3c-b6dcacd647d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.233 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd8d86b-702a-4ed9-951e-3df7df2d33e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.236 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b4b3c0b6-690b-4336-b1a8-aa7dca9f0833]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 NetworkManager[55036]: <info>  [1763798879.2587] device (tap370b86d5-40): carrier: link connected
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.262 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f9853c00-d382-480a-aba6-48a7c6464521]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.279 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7c904707-65c7-4bf3-a07a-597f36dfdf67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap370b86d5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:e9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561989, 'reachable_time': 38152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234414, 'error': None, 'target': 'ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.294 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[88c9374b-335e-4b13-9247-3cd6e2718660]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:e963'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561989, 'tstamp': 561989}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234415, 'error': None, 'target': 'ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.314 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8b0b9781-459d-4890-8090-7956ead28ed7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap370b86d5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:e9:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561989, 'reachable_time': 38152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234416, 'error': None, 'target': 'ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.350 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f02b3046-f8d6-4b63-8c29-19a4465e12ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.414 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9ec3bdfa-1430-4ae3-be2a-1ebd30a50d2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.415 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap370b86d5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.415 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.416 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap370b86d5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:59 compute-0 kernel: tap370b86d5-40: entered promiscuous mode
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.418 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:59 compute-0 NetworkManager[55036]: <info>  [1763798879.4187] manager: (tap370b86d5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.423 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap370b86d5-40, col_values=(('external_ids', {'iface-id': 'cd51c49d-ca1b-4eb1-876f-cf619c7d21ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:07:59 compute-0 ovn_controller[94843]: 2025-11-22T08:07:59Z|00513|binding|INFO|Releasing lport cd51c49d-ca1b-4eb1-876f-cf619c7d21ce from this chassis (sb_readonly=0)
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.424 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.427 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/370b86d5-40da-430f-aa64-a9124b2ca55b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/370b86d5-40da-430f-aa64-a9124b2ca55b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.435 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[48a02999-0a2a-4cbd-a7be-1c7c427f5c4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.438 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-370b86d5-40da-430f-aa64-a9124b2ca55b
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/370b86d5-40da-430f-aa64-a9124b2ca55b.pid.haproxy
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 370b86d5-40da-430f-aa64-a9124b2ca55b
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:07:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:07:59.439 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b', 'env', 'PROCESS_TAG=haproxy-370b86d5-40da-430f-aa64-a9124b2ca55b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/370b86d5-40da-430f-aa64-a9124b2ca55b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.449 186548 DEBUG nova.compute.manager [req-6b88b0dd-729d-47d9-b10b-8d9799be50b0 req-6471a3b6-1b91-41ee-93df-0df289a54cd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received event network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.450 186548 DEBUG oslo_concurrency.lockutils [req-6b88b0dd-729d-47d9-b10b-8d9799be50b0 req-6471a3b6-1b91-41ee-93df-0df289a54cd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "197a8113-f798-4088-a43c-06c87c60411d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.450 186548 DEBUG oslo_concurrency.lockutils [req-6b88b0dd-729d-47d9-b10b-8d9799be50b0 req-6471a3b6-1b91-41ee-93df-0df289a54cd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.451 186548 DEBUG oslo_concurrency.lockutils [req-6b88b0dd-729d-47d9-b10b-8d9799be50b0 req-6471a3b6-1b91-41ee-93df-0df289a54cd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.451 186548 DEBUG nova.compute.manager [req-6b88b0dd-729d-47d9-b10b-8d9799be50b0 req-6471a3b6-1b91-41ee-93df-0df289a54cd3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Processing event network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.796 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.797 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798879.7963028, 197a8113-f798-4088-a43c-06c87c60411d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.797 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] VM Started (Lifecycle Event)
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.804 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.809 186548 INFO nova.virt.libvirt.driver [-] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Instance spawned successfully.
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.809 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.824 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.826 186548 DEBUG nova.network.neutron [req-7a887c4f-5396-41dc-b7d0-6157b937295d req-e588904a-afa1-48ac-923c-75c4d048591e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Updated VIF entry in instance network info cache for port b6e26a0a-e1d6-4b22-a957-9272225fdd13. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.826 186548 DEBUG nova.network.neutron [req-7a887c4f-5396-41dc-b7d0-6157b937295d req-e588904a-afa1-48ac-923c-75c4d048591e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Updating instance_info_cache with network_info: [{"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:07:59 compute-0 podman[234455]: 2025-11-22 08:07:59.83071037 +0000 UTC m=+0.055626022 container create b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.832 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.836 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.837 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.837 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.838 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.838 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.838 186548 DEBUG nova.virt.libvirt.driver [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.847 186548 DEBUG oslo_concurrency.lockutils [req-7a887c4f-5396-41dc-b7d0-6157b937295d req-e588904a-afa1-48ac-923c-75c4d048591e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-197a8113-f798-4088-a43c-06c87c60411d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.860 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.861 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798879.7996876, 197a8113-f798-4088-a43c-06c87c60411d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.861 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] VM Paused (Lifecycle Event)
Nov 22 08:07:59 compute-0 systemd[1]: Started libpod-conmon-b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682.scope.
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.883 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.887 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798879.8018322, 197a8113-f798-4088-a43c-06c87c60411d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.887 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] VM Resumed (Lifecycle Event)
Nov 22 08:07:59 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:07:59 compute-0 podman[234455]: 2025-11-22 08:07:59.800405563 +0000 UTC m=+0.025321235 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a055bf326ae15e24effeac4ed39364cb4cb89a21785e934d26f4bd1e208d597/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.907 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:59 compute-0 podman[234455]: 2025-11-22 08:07:59.908396666 +0000 UTC m=+0.133312318 container init b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.910 186548 INFO nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Took 7.27 seconds to spawn the instance on the hypervisor.
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.910 186548 DEBUG nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.913 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:07:59 compute-0 podman[234455]: 2025-11-22 08:07:59.914247901 +0000 UTC m=+0.139163553 container start b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:07:59 compute-0 neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b[234470]: [NOTICE]   (234474) : New worker (234476) forked
Nov 22 08:07:59 compute-0 neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b[234470]: [NOTICE]   (234474) : Loading success.
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.951 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:07:59 compute-0 nova_compute[186544]: 2025-11-22 08:07:59.988 186548 INFO nova.compute.manager [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Took 7.92 seconds to build instance.
Nov 22 08:08:00 compute-0 nova_compute[186544]: 2025-11-22 08:08:00.005 186548 DEBUG oslo_concurrency.lockutils [None req-e76cf092-9dbd-47be-b50f-9958ff4e624d c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:00 compute-0 ovn_controller[94843]: 2025-11-22T08:08:00Z|00514|binding|INFO|Releasing lport cd51c49d-ca1b-4eb1-876f-cf619c7d21ce from this chassis (sb_readonly=0)
Nov 22 08:08:00 compute-0 ovn_controller[94843]: 2025-11-22T08:08:00Z|00515|binding|INFO|Releasing lport 0abd56a4-3e9e-4d28-8383-eadcda41744d from this chassis (sb_readonly=0)
Nov 22 08:08:00 compute-0 nova_compute[186544]: 2025-11-22 08:08:00.761 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:00 compute-0 nova_compute[186544]: 2025-11-22 08:08:00.814 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:01 compute-0 nova_compute[186544]: 2025-11-22 08:08:01.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:08:01 compute-0 nova_compute[186544]: 2025-11-22 08:08:01.572 186548 DEBUG nova.compute.manager [req-491c7493-0972-48e3-a1bb-bc6a7b1ced98 req-25b63c62-ce52-48d8-93b7-ff899e65cbf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received event network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:01 compute-0 nova_compute[186544]: 2025-11-22 08:08:01.572 186548 DEBUG oslo_concurrency.lockutils [req-491c7493-0972-48e3-a1bb-bc6a7b1ced98 req-25b63c62-ce52-48d8-93b7-ff899e65cbf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "197a8113-f798-4088-a43c-06c87c60411d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:01 compute-0 nova_compute[186544]: 2025-11-22 08:08:01.573 186548 DEBUG oslo_concurrency.lockutils [req-491c7493-0972-48e3-a1bb-bc6a7b1ced98 req-25b63c62-ce52-48d8-93b7-ff899e65cbf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:01 compute-0 nova_compute[186544]: 2025-11-22 08:08:01.573 186548 DEBUG oslo_concurrency.lockutils [req-491c7493-0972-48e3-a1bb-bc6a7b1ced98 req-25b63c62-ce52-48d8-93b7-ff899e65cbf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:01 compute-0 nova_compute[186544]: 2025-11-22 08:08:01.573 186548 DEBUG nova.compute.manager [req-491c7493-0972-48e3-a1bb-bc6a7b1ced98 req-25b63c62-ce52-48d8-93b7-ff899e65cbf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] No waiting events found dispatching network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:08:01 compute-0 nova_compute[186544]: 2025-11-22 08:08:01.573 186548 WARNING nova.compute.manager [req-491c7493-0972-48e3-a1bb-bc6a7b1ced98 req-25b63c62-ce52-48d8-93b7-ff899e65cbf8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received unexpected event network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 for instance with vm_state active and task_state None.
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.029 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "197a8113-f798-4088-a43c-06c87c60411d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.030 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.030 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "197a8113-f798-4088-a43c-06c87c60411d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.030 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.030 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.038 186548 INFO nova.compute.manager [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Terminating instance
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.044 186548 DEBUG nova.compute.manager [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:08:02 compute-0 kernel: tapb6e26a0a-e1 (unregistering): left promiscuous mode
Nov 22 08:08:02 compute-0 NetworkManager[55036]: <info>  [1763798882.0631] device (tapb6e26a0a-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:08:02 compute-0 ovn_controller[94843]: 2025-11-22T08:08:02Z|00516|binding|INFO|Releasing lport b6e26a0a-e1d6-4b22-a957-9272225fdd13 from this chassis (sb_readonly=0)
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.071 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:02 compute-0 ovn_controller[94843]: 2025-11-22T08:08:02Z|00517|binding|INFO|Setting lport b6e26a0a-e1d6-4b22-a957-9272225fdd13 down in Southbound
Nov 22 08:08:02 compute-0 ovn_controller[94843]: 2025-11-22T08:08:02Z|00518|binding|INFO|Removing iface tapb6e26a0a-e1 ovn-installed in OVS
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.073 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.080 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:7d:63 10.100.0.10'], port_security=['fa:16:3e:54:7d:63 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '197a8113-f798-4088-a43c-06c87c60411d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-370b86d5-40da-430f-aa64-a9124b2ca55b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a38780a3686c492ea46910996253a08a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89136c99-0a5a-4ed2-8c7f-2c8142459fb7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8ab5813-7e7d-4f9d-b2f4-efd563c10fa6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b6e26a0a-e1d6-4b22-a957-9272225fdd13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.081 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b6e26a0a-e1d6-4b22-a957-9272225fdd13 in datapath 370b86d5-40da-430f-aa64-a9124b2ca55b unbound from our chassis
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.083 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 370b86d5-40da-430f-aa64-a9124b2ca55b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.084 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[37a7729c-2cb6-4ca0-93d1-de39911b92c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.084 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.084 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b namespace which is not needed anymore
Nov 22 08:08:02 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000071.scope: Deactivated successfully.
Nov 22 08:08:02 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000071.scope: Consumed 2.862s CPU time.
Nov 22 08:08:02 compute-0 systemd-machined[152872]: Machine qemu-66-instance-00000071 terminated.
Nov 22 08:08:02 compute-0 neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b[234470]: [NOTICE]   (234474) : haproxy version is 2.8.14-c23fe91
Nov 22 08:08:02 compute-0 neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b[234470]: [NOTICE]   (234474) : path to executable is /usr/sbin/haproxy
Nov 22 08:08:02 compute-0 neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b[234470]: [WARNING]  (234474) : Exiting Master process...
Nov 22 08:08:02 compute-0 neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b[234470]: [ALERT]    (234474) : Current worker (234476) exited with code 143 (Terminated)
Nov 22 08:08:02 compute-0 neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b[234470]: [WARNING]  (234474) : All workers exited. Exiting... (0)
Nov 22 08:08:02 compute-0 systemd[1]: libpod-b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682.scope: Deactivated successfully.
Nov 22 08:08:02 compute-0 podman[234507]: 2025-11-22 08:08:02.209321672 +0000 UTC m=+0.046668631 container died b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:08:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682-userdata-shm.mount: Deactivated successfully.
Nov 22 08:08:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-7a055bf326ae15e24effeac4ed39364cb4cb89a21785e934d26f4bd1e208d597-merged.mount: Deactivated successfully.
Nov 22 08:08:02 compute-0 podman[234507]: 2025-11-22 08:08:02.283134673 +0000 UTC m=+0.120481652 container cleanup b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:08:02 compute-0 systemd[1]: libpod-conmon-b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682.scope: Deactivated successfully.
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.311 186548 INFO nova.virt.libvirt.driver [-] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Instance destroyed successfully.
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.313 186548 DEBUG nova.objects.instance [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lazy-loading 'resources' on Instance uuid 197a8113-f798-4088-a43c-06c87c60411d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.323 186548 DEBUG nova.virt.libvirt.vif [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:07:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-2082170142',display_name='tempest-ServerAddressesTestJSON-server-2082170142',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-2082170142',id=113,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:07:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a38780a3686c492ea46910996253a08a',ramdisk_id='',reservation_id='r-3i7z3rjz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-1745652147',owner_user_name='tempest-ServerAddressesTestJSON-1745652147-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:07:59Z,user_data=None,user_id='c6751e9b0adc4800b9bc3c06303a22fa',uuid=197a8113-f798-4088-a43c-06c87c60411d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.323 186548 DEBUG nova.network.os_vif_util [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Converting VIF {"id": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "address": "fa:16:3e:54:7d:63", "network": {"id": "370b86d5-40da-430f-aa64-a9124b2ca55b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2055379623-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a38780a3686c492ea46910996253a08a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6e26a0a-e1", "ovs_interfaceid": "b6e26a0a-e1d6-4b22-a957-9272225fdd13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.324 186548 DEBUG nova.network.os_vif_util [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:63,bridge_name='br-int',has_traffic_filtering=True,id=b6e26a0a-e1d6-4b22-a957-9272225fdd13,network=Network(370b86d5-40da-430f-aa64-a9124b2ca55b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6e26a0a-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.324 186548 DEBUG os_vif [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:63,bridge_name='br-int',has_traffic_filtering=True,id=b6e26a0a-e1d6-4b22-a957-9272225fdd13,network=Network(370b86d5-40da-430f-aa64-a9124b2ca55b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6e26a0a-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.326 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.326 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6e26a0a-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.330 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.331 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.334 186548 INFO os_vif [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:7d:63,bridge_name='br-int',has_traffic_filtering=True,id=b6e26a0a-e1d6-4b22-a957-9272225fdd13,network=Network(370b86d5-40da-430f-aa64-a9124b2ca55b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6e26a0a-e1')
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.335 186548 INFO nova.virt.libvirt.driver [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Deleting instance files /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d_del
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.336 186548 INFO nova.virt.libvirt.driver [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Deletion of /var/lib/nova/instances/197a8113-f798-4088-a43c-06c87c60411d_del complete
Nov 22 08:08:02 compute-0 podman[234552]: 2025-11-22 08:08:02.353184251 +0000 UTC m=+0.047465232 container remove b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.359 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b5851163-3e81-4210-809e-c29a0d56eec2]: (4, ('Sat Nov 22 08:08:02 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b (b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682)\nb2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682\nSat Nov 22 08:08:02 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b (b2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682)\nb2a8d06dbf6a1fc5f045f3e366ae8283c5cff24a22f43bc04f168a52f4289682\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.360 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[118fd4f6-4a27-403e-83c7-e77b86fc2a7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.361 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap370b86d5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.363 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:02 compute-0 kernel: tap370b86d5-40: left promiscuous mode
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.367 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[774c7032-2688-4225-9d56-6bebe4951827]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.376 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.389 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[61797558-b10f-456a-9ed0-b35a36638dc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.391 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[289c2285-407f-447b-8b73-abc751055de6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.394 186548 INFO nova.compute.manager [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.394 186548 DEBUG oslo.service.loopingcall [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.395 186548 DEBUG nova.compute.manager [-] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.395 186548 DEBUG nova.network.neutron [-] [instance: 197a8113-f798-4088-a43c-06c87c60411d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.406 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[58ba4336-5bbd-4f1a-8688-e388579acf2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561982, 'reachable_time': 32419, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234577, 'error': None, 'target': 'ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d370b86d5\x2d40da\x2d430f\x2daa64\x2da9124b2ca55b.mount: Deactivated successfully.
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.409 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-370b86d5-40da-430f-aa64-a9124b2ca55b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:08:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:02.409 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[db6b73d7-c122-4fb4-8277-785c6e778265]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.948 186548 DEBUG nova.network.neutron [-] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:08:02 compute-0 nova_compute[186544]: 2025-11-22 08:08:02.966 186548 INFO nova.compute.manager [-] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Took 0.57 seconds to deallocate network for instance.
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.035 186548 DEBUG nova.compute.manager [req-24ba2118-1cb6-4125-a3b7-859051289dd9 req-56bfd0b2-3bcf-40f9-9dc5-05bc43ec86a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received event network-vif-deleted-b6e26a0a-e1d6-4b22-a957-9272225fdd13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.039 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.039 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.111 186548 DEBUG nova.compute.provider_tree [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.121 186548 DEBUG nova.scheduler.client.report [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.141 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.193 186548 INFO nova.scheduler.client.report [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Deleted allocations for instance 197a8113-f798-4088-a43c-06c87c60411d
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.277 186548 DEBUG oslo_concurrency.lockutils [None req-bb85204e-6484-4db3-82e1-8329f8d8602e c6751e9b0adc4800b9bc3c06303a22fa a38780a3686c492ea46910996253a08a - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.739 186548 DEBUG nova.compute.manager [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received event network-vif-unplugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.739 186548 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "197a8113-f798-4088-a43c-06c87c60411d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.740 186548 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.740 186548 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.740 186548 DEBUG nova.compute.manager [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] No waiting events found dispatching network-vif-unplugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.740 186548 WARNING nova.compute.manager [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received unexpected event network-vif-unplugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 for instance with vm_state deleted and task_state None.
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.740 186548 DEBUG nova.compute.manager [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received event network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.741 186548 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "197a8113-f798-4088-a43c-06c87c60411d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.741 186548 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.741 186548 DEBUG oslo_concurrency.lockutils [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "197a8113-f798-4088-a43c-06c87c60411d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.741 186548 DEBUG nova.compute.manager [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] No waiting events found dispatching network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:08:03 compute-0 nova_compute[186544]: 2025-11-22 08:08:03.742 186548 WARNING nova.compute.manager [req-ab262930-2df4-4f63-b8c9-78ec20812a9c req-8513a192-3096-4331-901c-eaa5e8048d5d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Received unexpected event network-vif-plugged-b6e26a0a-e1d6-4b22-a957-9272225fdd13 for instance with vm_state deleted and task_state None.
Nov 22 08:08:04 compute-0 podman[234578]: 2025-11-22 08:08:04.408222253 +0000 UTC m=+0.052020894 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 08:08:04 compute-0 podman[234579]: 2025-11-22 08:08:04.432545933 +0000 UTC m=+0.075166985 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:08:05 compute-0 nova_compute[186544]: 2025-11-22 08:08:05.817 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.206 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.206 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.206 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.207 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.207 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.214 186548 INFO nova.compute.manager [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Terminating instance
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.220 186548 DEBUG nova.compute.manager [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:08:06 compute-0 kernel: tapf66ac761-5b (unregistering): left promiscuous mode
Nov 22 08:08:06 compute-0 NetworkManager[55036]: <info>  [1763798886.2453] device (tapf66ac761-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.253 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 ovn_controller[94843]: 2025-11-22T08:08:06Z|00519|binding|INFO|Releasing lport f66ac761-5b67-4bf3-98de-982febfd27c1 from this chassis (sb_readonly=0)
Nov 22 08:08:06 compute-0 ovn_controller[94843]: 2025-11-22T08:08:06Z|00520|binding|INFO|Setting lport f66ac761-5b67-4bf3-98de-982febfd27c1 down in Southbound
Nov 22 08:08:06 compute-0 ovn_controller[94843]: 2025-11-22T08:08:06Z|00521|binding|INFO|Removing iface tapf66ac761-5b ovn-installed in OVS
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.256 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.262 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:30:39 10.100.0.11 2001:db8::f816:3eff:fe07:3039'], port_security=['fa:16:3e:07:30:39 10.100.0.11 2001:db8::f816:3eff:fe07:3039'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8::f816:3eff:fe07:3039/64', 'neutron:device_id': 'c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90da6fca-65d1-4012-9602-d88842a0ad0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '573b06fa-1b11-4261-bfd0-ca50fa18731b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9c41f1e-b11e-4868-a3a0-70214f7435c4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=f66ac761-5b67-4bf3-98de-982febfd27c1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.263 103805 INFO neutron.agent.ovn.metadata.agent [-] Port f66ac761-5b67-4bf3-98de-982febfd27c1 in datapath 90da6fca-65d1-4012-9602-d88842a0ad0e unbound from our chassis
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.265 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90da6fca-65d1-4012-9602-d88842a0ad0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.266 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[83789df2-5fbb-4461-b508-1019e6d30c45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.267 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e namespace which is not needed anymore
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.272 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Nov 22 08:08:06 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000006d.scope: Consumed 17.723s CPU time.
Nov 22 08:08:06 compute-0 systemd-machined[152872]: Machine qemu-63-instance-0000006d terminated.
Nov 22 08:08:06 compute-0 podman[234628]: 2025-11-22 08:08:06.325512618 +0000 UTC m=+0.056268859 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:08:06 compute-0 podman[234625]: 2025-11-22 08:08:06.342256691 +0000 UTC m=+0.074774405 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:08:06 compute-0 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[233479]: [NOTICE]   (233483) : haproxy version is 2.8.14-c23fe91
Nov 22 08:08:06 compute-0 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[233479]: [NOTICE]   (233483) : path to executable is /usr/sbin/haproxy
Nov 22 08:08:06 compute-0 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[233479]: [WARNING]  (233483) : Exiting Master process...
Nov 22 08:08:06 compute-0 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[233479]: [WARNING]  (233483) : Exiting Master process...
Nov 22 08:08:06 compute-0 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[233479]: [ALERT]    (233483) : Current worker (233485) exited with code 143 (Terminated)
Nov 22 08:08:06 compute-0 neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e[233479]: [WARNING]  (233483) : All workers exited. Exiting... (0)
Nov 22 08:08:06 compute-0 systemd[1]: libpod-50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80.scope: Deactivated successfully.
Nov 22 08:08:06 compute-0 podman[234689]: 2025-11-22 08:08:06.419100606 +0000 UTC m=+0.061838836 container died 50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:08:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80-userdata-shm.mount: Deactivated successfully.
Nov 22 08:08:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-88275896954d780476515c3897676c6f85015ac1fcb973bd62e076b55af84898-merged.mount: Deactivated successfully.
Nov 22 08:08:06 compute-0 podman[234689]: 2025-11-22 08:08:06.491994594 +0000 UTC m=+0.134732824 container cleanup 50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.492 186548 INFO nova.virt.libvirt.driver [-] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Instance destroyed successfully.
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.494 186548 DEBUG nova.objects.instance [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:06 compute-0 systemd[1]: libpod-conmon-50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80.scope: Deactivated successfully.
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.519 186548 DEBUG nova.virt.libvirt.vif [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:06:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-695468086',display_name='tempest-TestGettingAddress-server-695468086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-695468086',id=109,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPnaERasa+izdcdfyTuC1NZxdKIV3QYAmiEXJjMkASn0E1tv7r6vCMDrq3+5wI/5DgRhzrsGj9ouyKzyqBuAz+X8ag3n7AcCuRnJpHSdd9YGkwB1w6Z6YQ+SkW/64cPWQ==',key_name='tempest-TestGettingAddress-1450732548',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:06:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-niopkunw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:06:47Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.519 186548 DEBUG nova.network.os_vif_util [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.520 186548 DEBUG nova.network.os_vif_util [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:07:30:39,bridge_name='br-int',has_traffic_filtering=True,id=f66ac761-5b67-4bf3-98de-982febfd27c1,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66ac761-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.520 186548 DEBUG os_vif [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:07:30:39,bridge_name='br-int',has_traffic_filtering=True,id=f66ac761-5b67-4bf3-98de-982febfd27c1,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66ac761-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.522 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.522 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf66ac761-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.523 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.525 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.525 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.527 186548 INFO os_vif [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:07:30:39,bridge_name='br-int',has_traffic_filtering=True,id=f66ac761-5b67-4bf3-98de-982febfd27c1,network=Network(90da6fca-65d1-4012-9602-d88842a0ad0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf66ac761-5b')
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.528 186548 INFO nova.virt.libvirt.driver [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Deleting instance files /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762_del
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.529 186548 INFO nova.virt.libvirt.driver [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Deletion of /var/lib/nova/instances/c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762_del complete
Nov 22 08:08:06 compute-0 podman[234734]: 2025-11-22 08:08:06.577090632 +0000 UTC m=+0.059531549 container remove 50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.582 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2f4ea92e-6ade-405d-8881-c539aac6a661]: (4, ('Sat Nov 22 08:08:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e (50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80)\n50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80\nSat Nov 22 08:08:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e (50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80)\n50a82291d99f6caf69b7711a7344be2ef721adaa442b9e61e9d345808f4efe80\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.584 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5e244157-327b-4a85-b9a6-058deafaf5e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.585 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90da6fca-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.587 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 kernel: tap90da6fca-60: left promiscuous mode
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.601 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.603 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[39e6fcb8-1e5e-4bc3-986a-4c9335cae00e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.613 186548 INFO nova.compute.manager [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.613 186548 DEBUG oslo.service.loopingcall [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.614 186548 DEBUG nova.compute.manager [-] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.614 186548 DEBUG nova.network.neutron [-] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.616 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[753c1806-8ef2-4eda-8a07-81560a51baf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.617 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c771df-aa65-4bc0-a6ff-0b3dd6480130]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.636 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d87d1a-7ee0-4535-a789-49b0d5668a3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554752, 'reachable_time': 34200, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234749, 'error': None, 'target': 'ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d90da6fca\x2d65d1\x2d4012\x2d9602\x2dd88842a0ad0e.mount: Deactivated successfully.
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.639 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90da6fca-65d1-4012-9602-d88842a0ad0e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:08:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:06.640 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ad19c8bd-bb7a-406f-bd1d-94c40e74e941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.695 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:06 compute-0 nova_compute[186544]: 2025-11-22 08:08:06.888 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:07 compute-0 nova_compute[186544]: 2025-11-22 08:08:07.203 186548 DEBUG nova.compute.manager [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-changed-f66ac761-5b67-4bf3-98de-982febfd27c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:07 compute-0 nova_compute[186544]: 2025-11-22 08:08:07.203 186548 DEBUG nova.compute.manager [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Refreshing instance network info cache due to event network-changed-f66ac761-5b67-4bf3-98de-982febfd27c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:08:07 compute-0 nova_compute[186544]: 2025-11-22 08:08:07.204 186548 DEBUG oslo_concurrency.lockutils [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:08:07 compute-0 nova_compute[186544]: 2025-11-22 08:08:07.204 186548 DEBUG oslo_concurrency.lockutils [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:08:07 compute-0 nova_compute[186544]: 2025-11-22 08:08:07.204 186548 DEBUG nova.network.neutron [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Refreshing network info cache for port f66ac761-5b67-4bf3-98de-982febfd27c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:08:08 compute-0 nova_compute[186544]: 2025-11-22 08:08:08.652 186548 DEBUG nova.network.neutron [-] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:08:08 compute-0 nova_compute[186544]: 2025-11-22 08:08:08.687 186548 INFO nova.compute.manager [-] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Took 2.07 seconds to deallocate network for instance.
Nov 22 08:08:08 compute-0 nova_compute[186544]: 2025-11-22 08:08:08.767 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:08 compute-0 nova_compute[186544]: 2025-11-22 08:08:08.767 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:08 compute-0 nova_compute[186544]: 2025-11-22 08:08:08.817 186548 DEBUG nova.compute.provider_tree [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:08:08 compute-0 nova_compute[186544]: 2025-11-22 08:08:08.830 186548 DEBUG nova.scheduler.client.report [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:08:08 compute-0 nova_compute[186544]: 2025-11-22 08:08:08.850 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:08 compute-0 nova_compute[186544]: 2025-11-22 08:08:08.879 186548 INFO nova.scheduler.client.report [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.019 186548 DEBUG oslo_concurrency.lockutils [None req-97eb24a2-e184-491e-ba35-f8dc613ae5cc 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.692 186548 DEBUG nova.compute.manager [req-10bf5be1-e876-4929-a5b0-e1997527aac1 req-b1451033-930d-4a25-afca-343a7064d2ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.693 186548 DEBUG oslo_concurrency.lockutils [req-10bf5be1-e876-4929-a5b0-e1997527aac1 req-b1451033-930d-4a25-afca-343a7064d2ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.693 186548 DEBUG oslo_concurrency.lockutils [req-10bf5be1-e876-4929-a5b0-e1997527aac1 req-b1451033-930d-4a25-afca-343a7064d2ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.693 186548 DEBUG oslo_concurrency.lockutils [req-10bf5be1-e876-4929-a5b0-e1997527aac1 req-b1451033-930d-4a25-afca-343a7064d2ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.694 186548 DEBUG nova.compute.manager [req-10bf5be1-e876-4929-a5b0-e1997527aac1 req-b1451033-930d-4a25-afca-343a7064d2ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] No waiting events found dispatching network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.694 186548 WARNING nova.compute.manager [req-10bf5be1-e876-4929-a5b0-e1997527aac1 req-b1451033-930d-4a25-afca-343a7064d2ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received unexpected event network-vif-plugged-f66ac761-5b67-4bf3-98de-982febfd27c1 for instance with vm_state deleted and task_state None.
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.694 186548 DEBUG nova.compute.manager [req-10bf5be1-e876-4929-a5b0-e1997527aac1 req-b1451033-930d-4a25-afca-343a7064d2ae 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-vif-deleted-f66ac761-5b67-4bf3-98de-982febfd27c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.852 186548 DEBUG nova.network.neutron [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updated VIF entry in instance network info cache for port f66ac761-5b67-4bf3-98de-982febfd27c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.852 186548 DEBUG nova.network.neutron [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Updating instance_info_cache with network_info: [{"id": "f66ac761-5b67-4bf3-98de-982febfd27c1", "address": "fa:16:3e:07:30:39", "network": {"id": "90da6fca-65d1-4012-9602-d88842a0ad0e", "bridge": "br-int", "label": "tempest-network-smoke--1184291441", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe07:3039", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf66ac761-5b", "ovs_interfaceid": "f66ac761-5b67-4bf3-98de-982febfd27c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.878 186548 DEBUG oslo_concurrency.lockutils [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.879 186548 DEBUG nova.compute.manager [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-vif-unplugged-f66ac761-5b67-4bf3-98de-982febfd27c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.879 186548 DEBUG oslo_concurrency.lockutils [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.879 186548 DEBUG oslo_concurrency.lockutils [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.879 186548 DEBUG oslo_concurrency.lockutils [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.880 186548 DEBUG nova.compute.manager [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] No waiting events found dispatching network-vif-unplugged-f66ac761-5b67-4bf3-98de-982febfd27c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:08:09 compute-0 nova_compute[186544]: 2025-11-22 08:08:09.880 186548 DEBUG nova.compute.manager [req-39aa4225-7329-4047-8d65-e0bf3f295e58 req-97c55bdc-f131-4649-aa7a-17453e81eb17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Received event network-vif-unplugged-f66ac761-5b67-4bf3-98de-982febfd27c1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:08:10 compute-0 nova_compute[186544]: 2025-11-22 08:08:10.820 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:11 compute-0 nova_compute[186544]: 2025-11-22 08:08:11.526 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:14 compute-0 nova_compute[186544]: 2025-11-22 08:08:14.363 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:14.365 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:08:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:14.365 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:08:15 compute-0 podman[234750]: 2025-11-22 08:08:15.438789543 +0000 UTC m=+0.087381997 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:08:15 compute-0 nova_compute[186544]: 2025-11-22 08:08:15.822 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:16 compute-0 nova_compute[186544]: 2025-11-22 08:08:16.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:08:16 compute-0 nova_compute[186544]: 2025-11-22 08:08:16.527 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:17 compute-0 nova_compute[186544]: 2025-11-22 08:08:17.307 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798882.3061926, 197a8113-f798-4088-a43c-06c87c60411d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:08:17 compute-0 nova_compute[186544]: 2025-11-22 08:08:17.308 186548 INFO nova.compute.manager [-] [instance: 197a8113-f798-4088-a43c-06c87c60411d] VM Stopped (Lifecycle Event)
Nov 22 08:08:17 compute-0 nova_compute[186544]: 2025-11-22 08:08:17.323 186548 DEBUG nova.compute.manager [None req-f67595d0-1fa0-45be-bf13-2193b3a3fd04 - - - - - -] [instance: 197a8113-f798-4088-a43c-06c87c60411d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:19.368 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:08:19 compute-0 podman[234771]: 2025-11-22 08:08:19.411192621 +0000 UTC m=+0.057453458 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:08:19 compute-0 podman[234772]: 2025-11-22 08:08:19.417301482 +0000 UTC m=+0.061698503 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 08:08:20 compute-0 nova_compute[186544]: 2025-11-22 08:08:20.822 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:21 compute-0 nova_compute[186544]: 2025-11-22 08:08:21.489 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798886.4882681, c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:08:21 compute-0 nova_compute[186544]: 2025-11-22 08:08:21.490 186548 INFO nova.compute.manager [-] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] VM Stopped (Lifecycle Event)
Nov 22 08:08:21 compute-0 nova_compute[186544]: 2025-11-22 08:08:21.523 186548 DEBUG nova.compute.manager [None req-4a52bc78-d41d-45f7-a1b2-0a9536f3d75c - - - - - -] [instance: c3f5ba3b-8607-4aae-a0ca-3a5f6ed39762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:21 compute-0 nova_compute[186544]: 2025-11-22 08:08:21.529 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:25 compute-0 nova_compute[186544]: 2025-11-22 08:08:25.825 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:26 compute-0 nova_compute[186544]: 2025-11-22 08:08:26.530 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.146 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.147 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.187 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.376 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.377 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.389 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.390 186548 INFO nova.compute.claims [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.527 186548 DEBUG nova.compute.provider_tree [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.560 186548 DEBUG nova.scheduler.client.report [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.734 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.735 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.807 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.807 186548 DEBUG nova.network.neutron [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.830 186548 INFO nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:08:28 compute-0 nova_compute[186544]: 2025-11-22 08:08:28.846 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.001 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.002 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.003 186548 INFO nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Creating image(s)
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.004 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.004 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.004 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.017 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.080 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.081 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.082 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.100 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.157 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.159 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.286 186548 DEBUG nova.policy [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '867dbb7f34964c339e824aadd897d3f9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '347404e1ff614e68bf6621e027c9212f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.552 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk 1073741824" returned: 0 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.553 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.553 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.606 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.607 186548 DEBUG nova.virt.disk.api [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Checking if we can resize image /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.607 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.664 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.665 186548 DEBUG nova.virt.disk.api [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Cannot resize image /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.665 186548 DEBUG nova.objects.instance [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'migration_context' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.678 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.679 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Ensure instance console log exists: /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.679 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.680 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:29 compute-0 nova_compute[186544]: 2025-11-22 08:08:29.680 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:30 compute-0 nova_compute[186544]: 2025-11-22 08:08:30.827 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:30 compute-0 nova_compute[186544]: 2025-11-22 08:08:30.958 186548 DEBUG nova.network.neutron [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Successfully created port: 843ca841-6500-4cad-970d-0e512ba71d41 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:08:31 compute-0 nova_compute[186544]: 2025-11-22 08:08:31.533 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:34 compute-0 nova_compute[186544]: 2025-11-22 08:08:34.767 186548 DEBUG nova.network.neutron [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Successfully updated port: 843ca841-6500-4cad-970d-0e512ba71d41 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:08:34 compute-0 nova_compute[186544]: 2025-11-22 08:08:34.810 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:08:34 compute-0 nova_compute[186544]: 2025-11-22 08:08:34.811 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquired lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:08:34 compute-0 nova_compute[186544]: 2025-11-22 08:08:34.811 186548 DEBUG nova.network.neutron [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:08:34 compute-0 nova_compute[186544]: 2025-11-22 08:08:34.877 186548 DEBUG nova.compute.manager [req-8998a580-0deb-43f3-b128-b74db65db164 req-efd4a671-50dc-4715-bc0d-9b15e31a31d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received event network-changed-843ca841-6500-4cad-970d-0e512ba71d41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:34 compute-0 nova_compute[186544]: 2025-11-22 08:08:34.877 186548 DEBUG nova.compute.manager [req-8998a580-0deb-43f3-b128-b74db65db164 req-efd4a671-50dc-4715-bc0d-9b15e31a31d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Refreshing instance network info cache due to event network-changed-843ca841-6500-4cad-970d-0e512ba71d41. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:08:34 compute-0 nova_compute[186544]: 2025-11-22 08:08:34.877 186548 DEBUG oslo_concurrency.lockutils [req-8998a580-0deb-43f3-b128-b74db65db164 req-efd4a671-50dc-4715-bc0d-9b15e31a31d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:08:35 compute-0 nova_compute[186544]: 2025-11-22 08:08:35.406 186548 DEBUG nova.network.neutron [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:08:35 compute-0 podman[234830]: 2025-11-22 08:08:35.410498222 +0000 UTC m=+0.061648137 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 08:08:35 compute-0 podman[234831]: 2025-11-22 08:08:35.437204063 +0000 UTC m=+0.085799384 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 08:08:35 compute-0 nova_compute[186544]: 2025-11-22 08:08:35.828 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.534 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:08:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.816 186548 DEBUG nova.network.neutron [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Updating instance_info_cache with network_info: [{"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.859 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Releasing lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.860 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance network_info: |[{"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.860 186548 DEBUG oslo_concurrency.lockutils [req-8998a580-0deb-43f3-b128-b74db65db164 req-efd4a671-50dc-4715-bc0d-9b15e31a31d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.861 186548 DEBUG nova.network.neutron [req-8998a580-0deb-43f3-b128-b74db65db164 req-efd4a671-50dc-4715-bc0d-9b15e31a31d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Refreshing network info cache for port 843ca841-6500-4cad-970d-0e512ba71d41 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.864 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Start _get_guest_xml network_info=[{"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.869 186548 WARNING nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.874 186548 DEBUG nova.virt.libvirt.host [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.874 186548 DEBUG nova.virt.libvirt.host [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.883 186548 DEBUG nova.virt.libvirt.host [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.884 186548 DEBUG nova.virt.libvirt.host [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.885 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.885 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.886 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.886 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.886 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.886 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.887 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.887 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.887 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.888 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.888 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.888 186548 DEBUG nova.virt.hardware [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.892 186548 DEBUG nova.virt.libvirt.vif [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:08:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1964315236',display_name='tempest-ServerRescueTestJSON-server-1964315236',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1964315236',id=115,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='347404e1ff614e68bf6621e027c9212f',ramdisk_id='',reservation_id='r-suk6riwi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1650311982',owner_user_name='tempest-ServerRescueTestJSON-1650
311982-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:28Z,user_data=None,user_id='867dbb7f34964c339e824aadd897d3f9',uuid=267a26ae-5092-423d-80ce-b122467635a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.893 186548 DEBUG nova.network.os_vif_util [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converting VIF {"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.893 186548 DEBUG nova.network.os_vif_util [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a2:84,bridge_name='br-int',has_traffic_filtering=True,id=843ca841-6500-4cad-970d-0e512ba71d41,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap843ca841-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.894 186548 DEBUG nova.objects.instance [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'pci_devices' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.908 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <uuid>267a26ae-5092-423d-80ce-b122467635a3</uuid>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <name>instance-00000073</name>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerRescueTestJSON-server-1964315236</nova:name>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:08:36</nova:creationTime>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:08:36 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:08:36 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:08:36 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:08:36 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:08:36 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:08:36 compute-0 nova_compute[186544]:         <nova:user uuid="867dbb7f34964c339e824aadd897d3f9">tempest-ServerRescueTestJSON-1650311982-project-member</nova:user>
Nov 22 08:08:36 compute-0 nova_compute[186544]:         <nova:project uuid="347404e1ff614e68bf6621e027c9212f">tempest-ServerRescueTestJSON-1650311982</nova:project>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:08:36 compute-0 nova_compute[186544]:         <nova:port uuid="843ca841-6500-4cad-970d-0e512ba71d41">
Nov 22 08:08:36 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <system>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <entry name="serial">267a26ae-5092-423d-80ce-b122467635a3</entry>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <entry name="uuid">267a26ae-5092-423d-80ce-b122467635a3</entry>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </system>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <os>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   </os>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <features>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   </features>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.config"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d5:a2:84"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <target dev="tap843ca841-65"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/console.log" append="off"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <video>
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </video>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:08:36 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:08:36 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:08:36 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:08:36 compute-0 nova_compute[186544]: </domain>
Nov 22 08:08:36 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.909 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Preparing to wait for external event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.909 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.909 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.909 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.910 186548 DEBUG nova.virt.libvirt.vif [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:08:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1964315236',display_name='tempest-ServerRescueTestJSON-server-1964315236',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1964315236',id=115,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='347404e1ff614e68bf6621e027c9212f',ramdisk_id='',reservation_id='r-suk6riwi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1650311982',owner_user_name='tempest-ServerRescueTestJSON-1650311982-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:28Z,user_data=None,user_id='867dbb7f34964c339e824aadd897d3f9',uuid=267a26ae-5092-423d-80ce-b122467635a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.910 186548 DEBUG nova.network.os_vif_util [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converting VIF {"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.911 186548 DEBUG nova.network.os_vif_util [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a2:84,bridge_name='br-int',has_traffic_filtering=True,id=843ca841-6500-4cad-970d-0e512ba71d41,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap843ca841-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.911 186548 DEBUG os_vif [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a2:84,bridge_name='br-int',has_traffic_filtering=True,id=843ca841-6500-4cad-970d-0e512ba71d41,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap843ca841-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.912 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.912 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.913 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.916 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.917 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap843ca841-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.917 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap843ca841-65, col_values=(('external_ids', {'iface-id': '843ca841-6500-4cad-970d-0e512ba71d41', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:a2:84', 'vm-uuid': '267a26ae-5092-423d-80ce-b122467635a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.918 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:36 compute-0 NetworkManager[55036]: <info>  [1763798916.9198] manager: (tap843ca841-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.920 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.924 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:36 compute-0 nova_compute[186544]: 2025-11-22 08:08:36.927 186548 INFO os_vif [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a2:84,bridge_name='br-int',has_traffic_filtering=True,id=843ca841-6500-4cad-970d-0e512ba71d41,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap843ca841-65')
Nov 22 08:08:37 compute-0 podman[234878]: 2025-11-22 08:08:37.015539465 +0000 UTC m=+0.054875720 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:08:37 compute-0 podman[234879]: 2025-11-22 08:08:37.015993517 +0000 UTC m=+0.050599885 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.303 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.303 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.304 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No VIF found with MAC fa:16:3e:d5:a2:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.304 186548 INFO nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Using config drive
Nov 22 08:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:37.337 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:37.337 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:37.337 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.687 186548 INFO nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Creating config drive at /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.config
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.691 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_h93zu_2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.816 186548 DEBUG oslo_concurrency.processutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_h93zu_2" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:37 compute-0 kernel: tap843ca841-65: entered promiscuous mode
Nov 22 08:08:37 compute-0 ovn_controller[94843]: 2025-11-22T08:08:37Z|00522|binding|INFO|Claiming lport 843ca841-6500-4cad-970d-0e512ba71d41 for this chassis.
Nov 22 08:08:37 compute-0 NetworkManager[55036]: <info>  [1763798917.8821] manager: (tap843ca841-65): new Tun device (/org/freedesktop/NetworkManager/Devices/246)
Nov 22 08:08:37 compute-0 ovn_controller[94843]: 2025-11-22T08:08:37Z|00523|binding|INFO|843ca841-6500-4cad-970d-0e512ba71d41: Claiming fa:16:3e:d5:a2:84 10.100.0.5
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.881 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.886 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:37 compute-0 systemd-udevd[234938]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:08:37 compute-0 NetworkManager[55036]: <info>  [1763798917.9264] device (tap843ca841-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:08:37 compute-0 NetworkManager[55036]: <info>  [1763798917.9276] device (tap843ca841-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:08:37 compute-0 systemd-machined[152872]: New machine qemu-67-instance-00000073.
Nov 22 08:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:37.936 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a2:84 10.100.0.5'], port_security=['fa:16:3e:d5:a2:84 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '267a26ae-5092-423d-80ce-b122467635a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=843ca841-6500-4cad-970d-0e512ba71d41) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:37.938 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 843ca841-6500-4cad-970d-0e512ba71d41 in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 bound to our chassis
Nov 22 08:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:37.939 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:37.941 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e838cf9e-2085-4639-9064-05bb7ee4b6c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.945 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:37 compute-0 systemd[1]: Started Virtual Machine qemu-67-instance-00000073.
Nov 22 08:08:37 compute-0 ovn_controller[94843]: 2025-11-22T08:08:37Z|00524|binding|INFO|Setting lport 843ca841-6500-4cad-970d-0e512ba71d41 ovn-installed in OVS
Nov 22 08:08:37 compute-0 ovn_controller[94843]: 2025-11-22T08:08:37Z|00525|binding|INFO|Setting lport 843ca841-6500-4cad-970d-0e512ba71d41 up in Southbound
Nov 22 08:08:37 compute-0 nova_compute[186544]: 2025-11-22 08:08:37.954 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.225 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798918.2248468, 267a26ae-5092-423d-80ce-b122467635a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.226 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] VM Started (Lifecycle Event)
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.245 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.248 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798918.225803, 267a26ae-5092-423d-80ce-b122467635a3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.249 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] VM Paused (Lifecycle Event)
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.265 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.268 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.285 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.321 186548 DEBUG nova.compute.manager [req-0928f550-5afb-4a94-a7e1-97964c6f9031 req-9aeab293-4748-435b-8bc0-d810072e382e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.321 186548 DEBUG oslo_concurrency.lockutils [req-0928f550-5afb-4a94-a7e1-97964c6f9031 req-9aeab293-4748-435b-8bc0-d810072e382e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.321 186548 DEBUG oslo_concurrency.lockutils [req-0928f550-5afb-4a94-a7e1-97964c6f9031 req-9aeab293-4748-435b-8bc0-d810072e382e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.322 186548 DEBUG oslo_concurrency.lockutils [req-0928f550-5afb-4a94-a7e1-97964c6f9031 req-9aeab293-4748-435b-8bc0-d810072e382e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.322 186548 DEBUG nova.compute.manager [req-0928f550-5afb-4a94-a7e1-97964c6f9031 req-9aeab293-4748-435b-8bc0-d810072e382e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Processing event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.323 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.326 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798918.3255897, 267a26ae-5092-423d-80ce-b122467635a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.326 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] VM Resumed (Lifecycle Event)
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.328 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.331 186548 INFO nova.virt.libvirt.driver [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance spawned successfully.
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.331 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.347 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.354 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.354 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.355 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.355 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.355 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.356 186548 DEBUG nova.virt.libvirt.driver [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.359 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.391 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.420 186548 DEBUG nova.network.neutron [req-8998a580-0deb-43f3-b128-b74db65db164 req-efd4a671-50dc-4715-bc0d-9b15e31a31d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Updated VIF entry in instance network info cache for port 843ca841-6500-4cad-970d-0e512ba71d41. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.420 186548 DEBUG nova.network.neutron [req-8998a580-0deb-43f3-b128-b74db65db164 req-efd4a671-50dc-4715-bc0d-9b15e31a31d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Updating instance_info_cache with network_info: [{"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.439 186548 DEBUG oslo_concurrency.lockutils [req-8998a580-0deb-43f3-b128-b74db65db164 req-efd4a671-50dc-4715-bc0d-9b15e31a31d4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.608 186548 INFO nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Took 9.61 seconds to spawn the instance on the hypervisor.
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.609 186548 DEBUG nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.674 186548 INFO nova.compute.manager [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Took 10.37 seconds to build instance.
Nov 22 08:08:38 compute-0 nova_compute[186544]: 2025-11-22 08:08:38.705 186548 DEBUG oslo_concurrency.lockutils [None req-80f92a0f-9369-43cc-aeb6-90e92c99d055 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.397 186548 DEBUG nova.compute.manager [req-6cc7c6b7-b0e8-4ac1-810b-e1b436d029da req-ed5092f0-7143-41fd-8f24-aff9e02c6242 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.397 186548 DEBUG oslo_concurrency.lockutils [req-6cc7c6b7-b0e8-4ac1-810b-e1b436d029da req-ed5092f0-7143-41fd-8f24-aff9e02c6242 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.398 186548 DEBUG oslo_concurrency.lockutils [req-6cc7c6b7-b0e8-4ac1-810b-e1b436d029da req-ed5092f0-7143-41fd-8f24-aff9e02c6242 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.398 186548 DEBUG oslo_concurrency.lockutils [req-6cc7c6b7-b0e8-4ac1-810b-e1b436d029da req-ed5092f0-7143-41fd-8f24-aff9e02c6242 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.398 186548 DEBUG nova.compute.manager [req-6cc7c6b7-b0e8-4ac1-810b-e1b436d029da req-ed5092f0-7143-41fd-8f24-aff9e02c6242 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] No waiting events found dispatching network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.398 186548 WARNING nova.compute.manager [req-6cc7c6b7-b0e8-4ac1-810b-e1b436d029da req-ed5092f0-7143-41fd-8f24-aff9e02c6242 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received unexpected event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 for instance with vm_state active and task_state None.
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.831 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.919 186548 INFO nova.compute.manager [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Rescuing
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.920 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.921 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquired lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:08:40 compute-0 nova_compute[186544]: 2025-11-22 08:08:40.921 186548 DEBUG nova.network.neutron [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:08:41 compute-0 nova_compute[186544]: 2025-11-22 08:08:41.919 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:42 compute-0 nova_compute[186544]: 2025-11-22 08:08:42.112 186548 DEBUG nova.network.neutron [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Updating instance_info_cache with network_info: [{"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:08:42 compute-0 nova_compute[186544]: 2025-11-22 08:08:42.286 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Releasing lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:08:42 compute-0 nova_compute[186544]: 2025-11-22 08:08:42.697 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:08:45 compute-0 nova_compute[186544]: 2025-11-22 08:08:45.832 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:46 compute-0 podman[234955]: 2025-11-22 08:08:46.418180506 +0000 UTC m=+0.059713299 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 22 08:08:46 compute-0 nova_compute[186544]: 2025-11-22 08:08:46.921 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:50 compute-0 podman[234975]: 2025-11-22 08:08:50.42722363 +0000 UTC m=+0.075944451 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:08:50 compute-0 podman[234976]: 2025-11-22 08:08:50.430315796 +0000 UTC m=+0.078511105 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 22 08:08:50 compute-0 nova_compute[186544]: 2025-11-22 08:08:50.834 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:51 compute-0 nova_compute[186544]: 2025-11-22 08:08:51.925 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:52 compute-0 nova_compute[186544]: 2025-11-22 08:08:52.737 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.199 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.200 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.200 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.200 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.299 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.361 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.362 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.420 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.590 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.592 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5573MB free_disk=73.17559814453125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.592 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.592 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.863 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 267a26ae-5092-423d-80ce-b122467635a3 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.863 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:08:53 compute-0 nova_compute[186544]: 2025-11-22 08:08:53.863 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:08:54 compute-0 nova_compute[186544]: 2025-11-22 08:08:54.051 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:08:54 compute-0 nova_compute[186544]: 2025-11-22 08:08:54.064 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:08:54 compute-0 nova_compute[186544]: 2025-11-22 08:08:54.103 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:08:54 compute-0 nova_compute[186544]: 2025-11-22 08:08:54.103 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.099 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:08:55 compute-0 kernel: tap843ca841-65 (unregistering): left promiscuous mode
Nov 22 08:08:55 compute-0 NetworkManager[55036]: <info>  [1763798935.7083] device (tap843ca841-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:08:55 compute-0 ovn_controller[94843]: 2025-11-22T08:08:55Z|00526|binding|INFO|Releasing lport 843ca841-6500-4cad-970d-0e512ba71d41 from this chassis (sb_readonly=0)
Nov 22 08:08:55 compute-0 ovn_controller[94843]: 2025-11-22T08:08:55Z|00527|binding|INFO|Setting lport 843ca841-6500-4cad-970d-0e512ba71d41 down in Southbound
Nov 22 08:08:55 compute-0 ovn_controller[94843]: 2025-11-22T08:08:55Z|00528|binding|INFO|Removing iface tap843ca841-65 ovn-installed in OVS
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.718 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:55.728 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a2:84 10.100.0.5'], port_security=['fa:16:3e:d5:a2:84 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '267a26ae-5092-423d-80ce-b122467635a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=843ca841-6500-4cad-970d-0e512ba71d41) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:08:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:55.729 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 843ca841-6500-4cad-970d-0e512ba71d41 in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 unbound from our chassis
Nov 22 08:08:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:55.730 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:08:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:55.731 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[10bdc907-ff1c-4eb2-8e54-c34260fbd51d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.731 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:55 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000073.scope: Deactivated successfully.
Nov 22 08:08:55 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000073.scope: Consumed 14.181s CPU time.
Nov 22 08:08:55 compute-0 systemd-machined[152872]: Machine qemu-67-instance-00000073 terminated.
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.836 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.939 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.943 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.984 186548 INFO nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance shutdown successfully after 13 seconds.
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.990 186548 INFO nova.virt.libvirt.driver [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance destroyed successfully.
Nov 22 08:08:55 compute-0 nova_compute[186544]: 2025-11-22 08:08:55.990 186548 DEBUG nova.objects.instance [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'numa_topology' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.014 186548 INFO nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Attempting rescue
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.014 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.018 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.018 186548 INFO nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Creating image(s)
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.019 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.019 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.020 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.020 186548 DEBUG nova.objects.instance [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.050 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.050 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.062 186548 DEBUG oslo_concurrency.processutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.117 186548 DEBUG oslo_concurrency.processutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.118 186548 DEBUG oslo_concurrency.processutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.158 186548 DEBUG oslo_concurrency.processutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.rescue" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.159 186548 DEBUG oslo_concurrency.lockutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.159 186548 DEBUG nova.objects.instance [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'migration_context' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.189 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.190 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Start _get_guest_xml network_info=[{"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2017524329-network", "vif_mac": "fa:16:3e:d5:a2:84"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.190 186548 DEBUG nova.objects.instance [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'resources' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.193 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.193 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.225 186548 WARNING nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.231 186548 DEBUG nova.virt.libvirt.host [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.232 186548 DEBUG nova.virt.libvirt.host [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.242 186548 DEBUG nova.virt.libvirt.host [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.243 186548 DEBUG nova.virt.libvirt.host [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.244 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.244 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.244 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.245 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.245 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.245 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.245 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.245 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.246 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.246 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.246 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.246 186548 DEBUG nova.virt.hardware [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.247 186548 DEBUG nova.objects.instance [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.263 186548 DEBUG nova.virt.libvirt.vif [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:08:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1964315236',display_name='tempest-ServerRescueTestJSON-server-1964315236',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1964315236',id=115,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:08:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='347404e1ff614e68bf6621e027c9212f',ramdisk_id='',reservation_id='r-suk6riwi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1650311982',owner_user_name='tempest-ServerRescueTestJSON-1650311982-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:08:38Z,user_data=None,user_id='867dbb7f34964c339e824aadd897d3f9',uuid=267a26ae-5092-423d-80ce-b122467635a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2017524329-network", "vif_mac": "fa:16:3e:d5:a2:84"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.264 186548 DEBUG nova.network.os_vif_util [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converting VIF {"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-2017524329-network", "vif_mac": "fa:16:3e:d5:a2:84"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.265 186548 DEBUG nova.network.os_vif_util [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a2:84,bridge_name='br-int',has_traffic_filtering=True,id=843ca841-6500-4cad-970d-0e512ba71d41,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap843ca841-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.265 186548 DEBUG nova.objects.instance [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'pci_devices' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.276 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <uuid>267a26ae-5092-423d-80ce-b122467635a3</uuid>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <name>instance-00000073</name>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerRescueTestJSON-server-1964315236</nova:name>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:08:56</nova:creationTime>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:08:56 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:08:56 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:08:56 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:08:56 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:08:56 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:08:56 compute-0 nova_compute[186544]:         <nova:user uuid="867dbb7f34964c339e824aadd897d3f9">tempest-ServerRescueTestJSON-1650311982-project-member</nova:user>
Nov 22 08:08:56 compute-0 nova_compute[186544]:         <nova:project uuid="347404e1ff614e68bf6621e027c9212f">tempest-ServerRescueTestJSON-1650311982</nova:project>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:08:56 compute-0 nova_compute[186544]:         <nova:port uuid="843ca841-6500-4cad-970d-0e512ba71d41">
Nov 22 08:08:56 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <system>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <entry name="serial">267a26ae-5092-423d-80ce-b122467635a3</entry>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <entry name="uuid">267a26ae-5092-423d-80ce-b122467635a3</entry>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </system>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <os>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   </os>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <features>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   </features>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.rescue"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <target dev="vdb" bus="virtio"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.config.rescue"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d5:a2:84"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <target dev="tap843ca841-65"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/console.log" append="off"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <video>
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </video>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:08:56 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:08:56 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:08:56 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:08:56 compute-0 nova_compute[186544]: </domain>
Nov 22 08:08:56 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.283 186548 INFO nova.virt.libvirt.driver [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance destroyed successfully.
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.339 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.340 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.340 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.340 186548 DEBUG nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] No VIF found with MAC fa:16:3e:d5:a2:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.341 186548 INFO nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Using config drive
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.355 186548 DEBUG nova.objects.instance [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'ec2_ids' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.417 186548 DEBUG nova.objects.instance [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'keypairs' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:08:56 compute-0 nova_compute[186544]: 2025-11-22 08:08:56.927 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:57 compute-0 nova_compute[186544]: 2025-11-22 08:08:57.262 186548 INFO nova.virt.libvirt.driver [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Creating config drive at /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.config.rescue
Nov 22 08:08:57 compute-0 nova_compute[186544]: 2025-11-22 08:08:57.268 186548 DEBUG oslo_concurrency.processutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoyksyi1l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:08:57 compute-0 nova_compute[186544]: 2025-11-22 08:08:57.394 186548 DEBUG oslo_concurrency.processutils [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoyksyi1l" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:08:57 compute-0 kernel: tap843ca841-65: entered promiscuous mode
Nov 22 08:08:57 compute-0 systemd-udevd[235046]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:08:57 compute-0 NetworkManager[55036]: <info>  [1763798937.4633] manager: (tap843ca841-65): new Tun device (/org/freedesktop/NetworkManager/Devices/247)
Nov 22 08:08:57 compute-0 ovn_controller[94843]: 2025-11-22T08:08:57Z|00529|binding|INFO|Claiming lport 843ca841-6500-4cad-970d-0e512ba71d41 for this chassis.
Nov 22 08:08:57 compute-0 ovn_controller[94843]: 2025-11-22T08:08:57Z|00530|binding|INFO|843ca841-6500-4cad-970d-0e512ba71d41: Claiming fa:16:3e:d5:a2:84 10.100.0.5
Nov 22 08:08:57 compute-0 nova_compute[186544]: 2025-11-22 08:08:57.465 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:57 compute-0 NetworkManager[55036]: <info>  [1763798937.4728] device (tap843ca841-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:08:57 compute-0 NetworkManager[55036]: <info>  [1763798937.4736] device (tap843ca841-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:08:57 compute-0 ovn_controller[94843]: 2025-11-22T08:08:57Z|00531|binding|INFO|Setting lport 843ca841-6500-4cad-970d-0e512ba71d41 ovn-installed in OVS
Nov 22 08:08:57 compute-0 nova_compute[186544]: 2025-11-22 08:08:57.478 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:57 compute-0 nova_compute[186544]: 2025-11-22 08:08:57.480 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:08:57 compute-0 systemd-machined[152872]: New machine qemu-68-instance-00000073.
Nov 22 08:08:57 compute-0 systemd[1]: Started Virtual Machine qemu-68-instance-00000073.
Nov 22 08:08:57 compute-0 ovn_controller[94843]: 2025-11-22T08:08:57Z|00532|binding|INFO|Setting lport 843ca841-6500-4cad-970d-0e512ba71d41 up in Southbound
Nov 22 08:08:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:57.551 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a2:84 10.100.0.5'], port_security=['fa:16:3e:d5:a2:84 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '267a26ae-5092-423d-80ce-b122467635a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=843ca841-6500-4cad-970d-0e512ba71d41) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:08:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:57.552 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 843ca841-6500-4cad-970d-0e512ba71d41 in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 bound to our chassis
Nov 22 08:08:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:57.553 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:08:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:08:57.554 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6891de9e-a187-4e57-a52d-7873cedb91fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.035 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 267a26ae-5092-423d-80ce-b122467635a3 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.036 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798938.034524, 267a26ae-5092-423d-80ce-b122467635a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.036 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] VM Resumed (Lifecycle Event)
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.048 186548 DEBUG nova.compute.manager [None req-57b5e437-b0ed-4a38-9711-fee742f1ce75 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.057 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.060 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.089 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] During sync_power_state the instance has a pending task (rescuing). Skip.
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.089 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798938.0386615, 267a26ae-5092-423d-80ce-b122467635a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.089 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] VM Started (Lifecycle Event)
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.121 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.124 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.694 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Updating instance_info_cache with network_info: [{"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.715 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-267a26ae-5092-423d-80ce-b122467635a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.715 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.716 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.716 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:08:58 compute-0 nova_compute[186544]: 2025-11-22 08:08:58.716 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:08:59 compute-0 nova_compute[186544]: 2025-11-22 08:08:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:00 compute-0 nova_compute[186544]: 2025-11-22 08:09:00.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:00 compute-0 nova_compute[186544]: 2025-11-22 08:09:00.523 186548 DEBUG nova.compute.manager [req-87044dea-d383-49ba-9042-b1f31c49078f req-9a9d1adf-c182-451d-9bc8-261846b9aa5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received event network-vif-unplugged-843ca841-6500-4cad-970d-0e512ba71d41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:00 compute-0 nova_compute[186544]: 2025-11-22 08:09:00.523 186548 DEBUG oslo_concurrency.lockutils [req-87044dea-d383-49ba-9042-b1f31c49078f req-9a9d1adf-c182-451d-9bc8-261846b9aa5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:00 compute-0 nova_compute[186544]: 2025-11-22 08:09:00.524 186548 DEBUG oslo_concurrency.lockutils [req-87044dea-d383-49ba-9042-b1f31c49078f req-9a9d1adf-c182-451d-9bc8-261846b9aa5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:00 compute-0 nova_compute[186544]: 2025-11-22 08:09:00.524 186548 DEBUG oslo_concurrency.lockutils [req-87044dea-d383-49ba-9042-b1f31c49078f req-9a9d1adf-c182-451d-9bc8-261846b9aa5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:00 compute-0 nova_compute[186544]: 2025-11-22 08:09:00.524 186548 DEBUG nova.compute.manager [req-87044dea-d383-49ba-9042-b1f31c49078f req-9a9d1adf-c182-451d-9bc8-261846b9aa5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] No waiting events found dispatching network-vif-unplugged-843ca841-6500-4cad-970d-0e512ba71d41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:09:00 compute-0 nova_compute[186544]: 2025-11-22 08:09:00.525 186548 WARNING nova.compute.manager [req-87044dea-d383-49ba-9042-b1f31c49078f req-9a9d1adf-c182-451d-9bc8-261846b9aa5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received unexpected event network-vif-unplugged-843ca841-6500-4cad-970d-0e512ba71d41 for instance with vm_state rescued and task_state None.
Nov 22 08:09:00 compute-0 nova_compute[186544]: 2025-11-22 08:09:00.837 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:01 compute-0 nova_compute[186544]: 2025-11-22 08:09:01.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:01 compute-0 nova_compute[186544]: 2025-11-22 08:09:01.929 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:03 compute-0 nova_compute[186544]: 2025-11-22 08:09:03.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:03 compute-0 nova_compute[186544]: 2025-11-22 08:09:03.389 186548 DEBUG nova.compute.manager [req-52707a07-266e-482f-9a95-eb8f8994cb2a req-582e4c66-d325-40d2-85f9-86cfebcbb003 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:03 compute-0 nova_compute[186544]: 2025-11-22 08:09:03.389 186548 DEBUG oslo_concurrency.lockutils [req-52707a07-266e-482f-9a95-eb8f8994cb2a req-582e4c66-d325-40d2-85f9-86cfebcbb003 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:03 compute-0 nova_compute[186544]: 2025-11-22 08:09:03.390 186548 DEBUG oslo_concurrency.lockutils [req-52707a07-266e-482f-9a95-eb8f8994cb2a req-582e4c66-d325-40d2-85f9-86cfebcbb003 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:03 compute-0 nova_compute[186544]: 2025-11-22 08:09:03.390 186548 DEBUG oslo_concurrency.lockutils [req-52707a07-266e-482f-9a95-eb8f8994cb2a req-582e4c66-d325-40d2-85f9-86cfebcbb003 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:03 compute-0 nova_compute[186544]: 2025-11-22 08:09:03.390 186548 DEBUG nova.compute.manager [req-52707a07-266e-482f-9a95-eb8f8994cb2a req-582e4c66-d325-40d2-85f9-86cfebcbb003 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] No waiting events found dispatching network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:09:03 compute-0 nova_compute[186544]: 2025-11-22 08:09:03.390 186548 WARNING nova.compute.manager [req-52707a07-266e-482f-9a95-eb8f8994cb2a req-582e4c66-d325-40d2-85f9-86cfebcbb003 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received unexpected event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 for instance with vm_state rescued and task_state None.
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.523 186548 DEBUG nova.compute.manager [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.523 186548 DEBUG oslo_concurrency.lockutils [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.523 186548 DEBUG oslo_concurrency.lockutils [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.524 186548 DEBUG oslo_concurrency.lockutils [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.524 186548 DEBUG nova.compute.manager [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] No waiting events found dispatching network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.524 186548 WARNING nova.compute.manager [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received unexpected event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 for instance with vm_state rescued and task_state None.
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.524 186548 DEBUG nova.compute.manager [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.525 186548 DEBUG oslo_concurrency.lockutils [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.525 186548 DEBUG oslo_concurrency.lockutils [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.525 186548 DEBUG oslo_concurrency.lockutils [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.525 186548 DEBUG nova.compute.manager [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] No waiting events found dispatching network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.525 186548 WARNING nova.compute.manager [req-1dfa8c51-99d7-480f-924c-0a06aba66385 req-a878b90a-6b47-44e5-b3a0-69e400eb0aac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received unexpected event network-vif-plugged-843ca841-6500-4cad-970d-0e512ba71d41 for instance with vm_state rescued and task_state None.
Nov 22 08:09:05 compute-0 nova_compute[186544]: 2025-11-22 08:09:05.840 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:06 compute-0 podman[235108]: 2025-11-22 08:09:06.434569446 +0000 UTC m=+0.065472352 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 08:09:06 compute-0 podman[235109]: 2025-11-22 08:09:06.455452073 +0000 UTC m=+0.086513932 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:09:06 compute-0 nova_compute[186544]: 2025-11-22 08:09:06.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:07 compute-0 podman[235152]: 2025-11-22 08:09:07.405146453 +0000 UTC m=+0.050146213 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:09:07 compute-0 podman[235151]: 2025-11-22 08:09:07.424992834 +0000 UTC m=+0.073761067 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:09:10 compute-0 nova_compute[186544]: 2025-11-22 08:09:10.171 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:10 compute-0 nova_compute[186544]: 2025-11-22 08:09:10.171 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:09:10 compute-0 nova_compute[186544]: 2025-11-22 08:09:10.189 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:09:10 compute-0 nova_compute[186544]: 2025-11-22 08:09:10.842 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:11 compute-0 nova_compute[186544]: 2025-11-22 08:09:11.933 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:15 compute-0 nova_compute[186544]: 2025-11-22 08:09:15.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:15 compute-0 nova_compute[186544]: 2025-11-22 08:09:15.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:09:15 compute-0 nova_compute[186544]: 2025-11-22 08:09:15.844 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:16 compute-0 nova_compute[186544]: 2025-11-22 08:09:16.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:16.352 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:09:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:16.353 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:09:16 compute-0 nova_compute[186544]: 2025-11-22 08:09:16.353 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:16 compute-0 nova_compute[186544]: 2025-11-22 08:09:16.936 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:17 compute-0 podman[235201]: 2025-11-22 08:09:17.415467206 +0000 UTC m=+0.063789809 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:09:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:18.355 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:20 compute-0 nova_compute[186544]: 2025-11-22 08:09:20.847 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:21 compute-0 podman[235223]: 2025-11-22 08:09:21.411372864 +0000 UTC m=+0.053419404 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, version=9.6, io.buildah.version=1.33.7)
Nov 22 08:09:21 compute-0 podman[235222]: 2025-11-22 08:09:21.434342973 +0000 UTC m=+0.079554651 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:09:21 compute-0 nova_compute[186544]: 2025-11-22 08:09:21.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:25 compute-0 nova_compute[186544]: 2025-11-22 08:09:25.850 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:26 compute-0 nova_compute[186544]: 2025-11-22 08:09:26.938 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:30 compute-0 nova_compute[186544]: 2025-11-22 08:09:30.852 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:31 compute-0 nova_compute[186544]: 2025-11-22 08:09:31.940 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:35 compute-0 nova_compute[186544]: 2025-11-22 08:09:35.855 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:36 compute-0 nova_compute[186544]: 2025-11-22 08:09:36.943 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:37.337 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:37.338 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:37.338 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:37 compute-0 podman[235266]: 2025-11-22 08:09:37.412065497 +0000 UTC m=+0.059164625 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:09:37 compute-0 podman[235267]: 2025-11-22 08:09:37.436550594 +0000 UTC m=+0.081259053 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:09:37 compute-0 podman[235315]: 2025-11-22 08:09:37.499142914 +0000 UTC m=+0.047687483 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:09:37 compute-0 podman[235312]: 2025-11-22 08:09:37.514938605 +0000 UTC m=+0.073267186 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.857 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.857 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.858 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "267a26ae-5092-423d-80ce-b122467635a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.858 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.858 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.866 186548 INFO nova.compute.manager [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Terminating instance
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.871 186548 DEBUG nova.compute.manager [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:09:39 compute-0 kernel: tap843ca841-65 (unregistering): left promiscuous mode
Nov 22 08:09:39 compute-0 NetworkManager[55036]: <info>  [1763798979.9105] device (tap843ca841-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:09:39 compute-0 ovn_controller[94843]: 2025-11-22T08:09:39Z|00533|binding|INFO|Releasing lport 843ca841-6500-4cad-970d-0e512ba71d41 from this chassis (sb_readonly=0)
Nov 22 08:09:39 compute-0 ovn_controller[94843]: 2025-11-22T08:09:39Z|00534|binding|INFO|Setting lport 843ca841-6500-4cad-970d-0e512ba71d41 down in Southbound
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.920 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:39 compute-0 ovn_controller[94843]: 2025-11-22T08:09:39Z|00535|binding|INFO|Removing iface tap843ca841-65 ovn-installed in OVS
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.923 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:39.932 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a2:84 10.100.0.5'], port_security=['fa:16:3e:d5:a2:84 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '267a26ae-5092-423d-80ce-b122467635a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b6c329c-02bc-419e-9c44-e313eaa92343', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '347404e1ff614e68bf6621e027c9212f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b4ddbfb9-1873-4998-b083-20067bac9400', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bf81230-5a70-4bce-ad1c-95ab3d0988b0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=843ca841-6500-4cad-970d-0e512ba71d41) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:09:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:39.934 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 843ca841-6500-4cad-970d-0e512ba71d41 in datapath 4b6c329c-02bc-419e-9c44-e313eaa92343 unbound from our chassis
Nov 22 08:09:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:39.936 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4b6c329c-02bc-419e-9c44-e313eaa92343 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:09:39 compute-0 nova_compute[186544]: 2025-11-22 08:09:39.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:39.937 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e1607cb2-3aa2-480d-874f-e9f2f9fedc3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:39 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000073.scope: Deactivated successfully.
Nov 22 08:09:39 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000073.scope: Consumed 15.244s CPU time.
Nov 22 08:09:39 compute-0 systemd-machined[152872]: Machine qemu-68-instance-00000073 terminated.
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.151 186548 INFO nova.virt.libvirt.driver [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Instance destroyed successfully.
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.152 186548 DEBUG nova.objects.instance [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lazy-loading 'resources' on Instance uuid 267a26ae-5092-423d-80ce-b122467635a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.162 186548 DEBUG nova.virt.libvirt.vif [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:08:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1964315236',display_name='tempest-ServerRescueTestJSON-server-1964315236',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1964315236',id=115,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:08:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='347404e1ff614e68bf6621e027c9212f',ramdisk_id='',reservation_id='r-suk6riwi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1650311982',owner_user_name='tempest-ServerRescueTestJSON-1650311982-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:08:58Z,user_data=None,user_id='867dbb7f34964c339e824aadd897d3f9',uuid=267a26ae-5092-423d-80ce-b122467635a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.162 186548 DEBUG nova.network.os_vif_util [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converting VIF {"id": "843ca841-6500-4cad-970d-0e512ba71d41", "address": "fa:16:3e:d5:a2:84", "network": {"id": "4b6c329c-02bc-419e-9c44-e313eaa92343", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-2017524329-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "347404e1ff614e68bf6621e027c9212f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap843ca841-65", "ovs_interfaceid": "843ca841-6500-4cad-970d-0e512ba71d41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.163 186548 DEBUG nova.network.os_vif_util [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a2:84,bridge_name='br-int',has_traffic_filtering=True,id=843ca841-6500-4cad-970d-0e512ba71d41,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap843ca841-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.163 186548 DEBUG os_vif [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a2:84,bridge_name='br-int',has_traffic_filtering=True,id=843ca841-6500-4cad-970d-0e512ba71d41,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap843ca841-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.165 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.165 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap843ca841-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.166 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.169 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.171 186548 INFO os_vif [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a2:84,bridge_name='br-int',has_traffic_filtering=True,id=843ca841-6500-4cad-970d-0e512ba71d41,network=Network(4b6c329c-02bc-419e-9c44-e313eaa92343),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap843ca841-65')
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.172 186548 INFO nova.virt.libvirt.driver [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Deleting instance files /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3_del
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.173 186548 INFO nova.virt.libvirt.driver [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Deletion of /var/lib/nova/instances/267a26ae-5092-423d-80ce-b122467635a3_del complete
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.238 186548 INFO nova.compute.manager [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.239 186548 DEBUG oslo.service.loopingcall [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.239 186548 DEBUG nova.compute.manager [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.240 186548 DEBUG nova.network.neutron [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.841 186548 DEBUG nova.network.neutron [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.857 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.863 186548 INFO nova.compute.manager [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Took 0.62 seconds to deallocate network for instance.
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.971 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:40 compute-0 nova_compute[186544]: 2025-11-22 08:09:40.972 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:41 compute-0 nova_compute[186544]: 2025-11-22 08:09:41.032 186548 DEBUG nova.compute.provider_tree [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:09:41 compute-0 nova_compute[186544]: 2025-11-22 08:09:41.047 186548 DEBUG nova.scheduler.client.report [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:09:41 compute-0 nova_compute[186544]: 2025-11-22 08:09:41.070 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:41 compute-0 nova_compute[186544]: 2025-11-22 08:09:41.101 186548 INFO nova.scheduler.client.report [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Deleted allocations for instance 267a26ae-5092-423d-80ce-b122467635a3
Nov 22 08:09:41 compute-0 nova_compute[186544]: 2025-11-22 08:09:41.192 186548 DEBUG oslo_concurrency.lockutils [None req-faf1fdbd-6760-4cc0-98c8-c528776acc1f 867dbb7f34964c339e824aadd897d3f9 347404e1ff614e68bf6621e027c9212f - - default default] Lock "267a26ae-5092-423d-80ce-b122467635a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:41 compute-0 nova_compute[186544]: 2025-11-22 08:09:41.527 186548 DEBUG nova.compute.manager [req-1dd256ea-63f5-481b-85d5-3f629bdcbc5d req-5480a8d0-0b19-465d-836a-eaaa4ab0c9f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Received event network-vif-deleted-843ca841-6500-4cad-970d-0e512ba71d41 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.196 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "63bce965-e211-4f6a-8700-64667594e4d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.197 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.234 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.331 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.331 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.336 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.336 186548 INFO nova.compute.claims [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.504 186548 DEBUG nova.compute.provider_tree [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.529 186548 DEBUG nova.scheduler.client.report [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.572 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.573 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.644 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.644 186548 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.667 186548 INFO nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.686 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.789 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.790 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.790 186548 INFO nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Creating image(s)
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.791 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "/var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.792 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "/var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.792 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "/var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.806 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.866 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.867 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.868 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.879 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.898 186548 DEBUG nova.policy [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c867ad823e59410b995507d3e85b3465', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c564dfb60114407b72d22a9c49ed513', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.938 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.939 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.977 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.978 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:42 compute-0 nova_compute[186544]: 2025-11-22 08:09:42.978 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.034 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.035 186548 DEBUG nova.virt.disk.api [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Checking if we can resize image /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.036 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.093 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.094 186548 DEBUG nova.virt.disk.api [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Cannot resize image /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.095 186548 DEBUG nova.objects.instance [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'migration_context' on Instance uuid 63bce965-e211-4f6a-8700-64667594e4d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.208 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.210 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Ensure instance console log exists: /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.211 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.211 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.211 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:43 compute-0 nova_compute[186544]: 2025-11-22 08:09:43.948 186548 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Successfully created port: 1b2e31ef-a7fe-455e-b7e0-a50675608829 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.627 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "36bedc95-cfbe-490a-b6d2-b148301708dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.627 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.644 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.743 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.743 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.754 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.755 186548 INFO nova.compute.claims [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.921 186548 DEBUG nova.compute.provider_tree [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.932 186548 DEBUG nova.scheduler.client.report [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.953 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:44 compute-0 nova_compute[186544]: 2025-11-22 08:09:44.954 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.011 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.011 186548 DEBUG nova.network.neutron [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.031 186548 INFO nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.048 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.156 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.157 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.157 186548 INFO nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Creating image(s)
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.158 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "/var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.158 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "/var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.158 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "/var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.170 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.171 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.226 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.227 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.228 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.238 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.264 186548 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Successfully updated port: 1b2e31ef-a7fe-455e-b7e0-a50675608829 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.277 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "refresh_cache-63bce965-e211-4f6a-8700-64667594e4d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.277 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquired lock "refresh_cache-63bce965-e211-4f6a-8700-64667594e4d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.278 186548 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.294 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.294 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.385 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk 1073741824" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.386 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.386 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.440 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.443 186548 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.445 186548 DEBUG nova.virt.disk.api [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Checking if we can resize image /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.446 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.500 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.501 186548 DEBUG nova.virt.disk.api [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Cannot resize image /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.501 186548 DEBUG nova.objects.instance [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'migration_context' on Instance uuid 36bedc95-cfbe-490a-b6d2-b148301708dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.527 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.528 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Ensure instance console log exists: /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.529 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.529 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.529 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.800 186548 DEBUG nova.network.neutron [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Successfully created port: a05e871f-1fed-40ac-8e55-6161c2b1f91f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:09:45 compute-0 nova_compute[186544]: 2025-11-22 08:09:45.859 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.341 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.580 186548 DEBUG nova.network.neutron [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Updating instance_info_cache with network_info: [{"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.619 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Releasing lock "refresh_cache-63bce965-e211-4f6a-8700-64667594e4d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.619 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Instance network_info: |[{"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.620 186548 DEBUG nova.network.neutron [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Successfully updated port: a05e871f-1fed-40ac-8e55-6161c2b1f91f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.624 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Start _get_guest_xml network_info=[{"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.627 186548 WARNING nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.632 186548 DEBUG nova.virt.libvirt.host [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.633 186548 DEBUG nova.virt.libvirt.host [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.637 186548 DEBUG nova.virt.libvirt.host [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.637 186548 DEBUG nova.virt.libvirt.host [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.638 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.638 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.639 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.639 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.639 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.640 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.640 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.640 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.640 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.640 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.641 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.641 186548 DEBUG nova.virt.hardware [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.644 186548 DEBUG nova.virt.libvirt.vif [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-705204345',display_name='tempest-tempest.common.compute-instance-705204345-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-705204345-2',id=120,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-hgk57t70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCr
eateTestJSON-1558462004-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:42Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=63bce965-e211-4f6a-8700-64667594e4d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.645 186548 DEBUG nova.network.os_vif_util [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.646 186548 DEBUG nova.network.os_vif_util [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:96:9f,bridge_name='br-int',has_traffic_filtering=True,id=1b2e31ef-a7fe-455e-b7e0-a50675608829,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b2e31ef-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.646 186548 DEBUG nova.objects.instance [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63bce965-e211-4f6a-8700-64667594e4d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.648 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.648 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquired lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.649 186548 DEBUG nova.network.neutron [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.663 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <uuid>63bce965-e211-4f6a-8700-64667594e4d0</uuid>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <name>instance-00000078</name>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <nova:name>tempest-tempest.common.compute-instance-705204345-2</nova:name>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:09:46</nova:creationTime>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:09:46 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:09:46 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:09:46 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:09:46 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:09:46 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:09:46 compute-0 nova_compute[186544]:         <nova:user uuid="c867ad823e59410b995507d3e85b3465">tempest-MultipleCreateTestJSON-1558462004-project-member</nova:user>
Nov 22 08:09:46 compute-0 nova_compute[186544]:         <nova:project uuid="9c564dfb60114407b72d22a9c49ed513">tempest-MultipleCreateTestJSON-1558462004</nova:project>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:09:46 compute-0 nova_compute[186544]:         <nova:port uuid="1b2e31ef-a7fe-455e-b7e0-a50675608829">
Nov 22 08:09:46 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <system>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <entry name="serial">63bce965-e211-4f6a-8700-64667594e4d0</entry>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <entry name="uuid">63bce965-e211-4f6a-8700-64667594e4d0</entry>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </system>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <os>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   </os>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <features>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   </features>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk.config"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:46:96:9f"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <target dev="tap1b2e31ef-a7"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/console.log" append="off"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <video>
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </video>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:09:46 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:09:46 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:09:46 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:09:46 compute-0 nova_compute[186544]: </domain>
Nov 22 08:09:46 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.664 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Preparing to wait for external event network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.664 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "63bce965-e211-4f6a-8700-64667594e4d0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.664 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.664 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.665 186548 DEBUG nova.virt.libvirt.vif [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-705204345',display_name='tempest-tempest.common.compute-instance-705204345-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-705204345-2',id=120,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-hgk57t70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-
MultipleCreateTestJSON-1558462004-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:42Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=63bce965-e211-4f6a-8700-64667594e4d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.665 186548 DEBUG nova.network.os_vif_util [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.666 186548 DEBUG nova.network.os_vif_util [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:96:9f,bridge_name='br-int',has_traffic_filtering=True,id=1b2e31ef-a7fe-455e-b7e0-a50675608829,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b2e31ef-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.666 186548 DEBUG os_vif [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:96:9f,bridge_name='br-int',has_traffic_filtering=True,id=1b2e31ef-a7fe-455e-b7e0-a50675608829,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b2e31ef-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.667 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.667 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.667 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.672 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.672 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b2e31ef-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.672 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1b2e31ef-a7, col_values=(('external_ids', {'iface-id': '1b2e31ef-a7fe-455e-b7e0-a50675608829', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:96:9f', 'vm-uuid': '63bce965-e211-4f6a-8700-64667594e4d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.674 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:46 compute-0 NetworkManager[55036]: <info>  [1763798986.6750] manager: (tap1b2e31ef-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.676 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.678 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.679 186548 INFO os_vif [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:96:9f,bridge_name='br-int',has_traffic_filtering=True,id=1b2e31ef-a7fe-455e-b7e0-a50675608829,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b2e31ef-a7')
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.713 186548 DEBUG nova.compute.manager [req-8fd4a0cf-8220-447f-b2f2-fe836f7b474a req-e63df9cf-c7f8-4d92-bb2f-7ecbf02f5daf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received event network-changed-a05e871f-1fed-40ac-8e55-6161c2b1f91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.714 186548 DEBUG nova.compute.manager [req-8fd4a0cf-8220-447f-b2f2-fe836f7b474a req-e63df9cf-c7f8-4d92-bb2f-7ecbf02f5daf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Refreshing instance network info cache due to event network-changed-a05e871f-1fed-40ac-8e55-6161c2b1f91f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.714 186548 DEBUG oslo_concurrency.lockutils [req-8fd4a0cf-8220-447f-b2f2-fe836f7b474a req-e63df9cf-c7f8-4d92-bb2f-7ecbf02f5daf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.755 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.755 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.756 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No VIF found with MAC fa:16:3e:46:96:9f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.756 186548 INFO nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Using config drive
Nov 22 08:09:46 compute-0 nova_compute[186544]: 2025-11-22 08:09:46.877 186548 DEBUG nova.network.neutron [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.196 186548 DEBUG nova.compute.manager [req-0e702213-9c8d-4455-83f5-6b7beacdc207 req-75a2f37d-93b4-4307-8d98-291baedaaadb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received event network-changed-1b2e31ef-a7fe-455e-b7e0-a50675608829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.196 186548 DEBUG nova.compute.manager [req-0e702213-9c8d-4455-83f5-6b7beacdc207 req-75a2f37d-93b4-4307-8d98-291baedaaadb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Refreshing instance network info cache due to event network-changed-1b2e31ef-a7fe-455e-b7e0-a50675608829. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.196 186548 DEBUG oslo_concurrency.lockutils [req-0e702213-9c8d-4455-83f5-6b7beacdc207 req-75a2f37d-93b4-4307-8d98-291baedaaadb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-63bce965-e211-4f6a-8700-64667594e4d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.196 186548 DEBUG oslo_concurrency.lockutils [req-0e702213-9c8d-4455-83f5-6b7beacdc207 req-75a2f37d-93b4-4307-8d98-291baedaaadb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-63bce965-e211-4f6a-8700-64667594e4d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.197 186548 DEBUG nova.network.neutron [req-0e702213-9c8d-4455-83f5-6b7beacdc207 req-75a2f37d-93b4-4307-8d98-291baedaaadb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Refreshing network info cache for port 1b2e31ef-a7fe-455e-b7e0-a50675608829 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.224 186548 INFO nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Creating config drive at /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk.config
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.229 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_efnxaa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.354 186548 DEBUG oslo_concurrency.processutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_efnxaa" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:47 compute-0 kernel: tap1b2e31ef-a7: entered promiscuous mode
Nov 22 08:09:47 compute-0 ovn_controller[94843]: 2025-11-22T08:09:47Z|00536|binding|INFO|Claiming lport 1b2e31ef-a7fe-455e-b7e0-a50675608829 for this chassis.
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.408 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 ovn_controller[94843]: 2025-11-22T08:09:47Z|00537|binding|INFO|1b2e31ef-a7fe-455e-b7e0-a50675608829: Claiming fa:16:3e:46:96:9f 10.100.0.3
Nov 22 08:09:47 compute-0 NetworkManager[55036]: <info>  [1763798987.4104] manager: (tap1b2e31ef-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.413 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.416 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.428 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:96:9f 10.100.0.3'], port_security=['fa:16:3e:46:96:9f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '63bce965-e211-4f6a-8700-64667594e4d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1b2e31ef-a7fe-455e-b7e0-a50675608829) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.429 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1b2e31ef-a7fe-455e-b7e0-a50675608829 in datapath c75f33da-8305-4145-97ef-eef656e4f067 bound to our chassis
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.430 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.440 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd2c477-6326-4cbe-982b-71f5f3866243]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.441 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc75f33da-81 in ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.443 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc75f33da-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.443 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5a3847-4cbd-4d34-9ded-5727eb58fd28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.443 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a263e3-6bd5-40aa-9489-518baa7b53a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.455 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[01932294-6cf3-4abe-965c-b8e55b869a3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.470 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.470 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ce27562a-9ac8-4b2e-b9e2-bdd3fa9d7d59]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_controller[94843]: 2025-11-22T08:09:47Z|00538|binding|INFO|Setting lport 1b2e31ef-a7fe-455e-b7e0-a50675608829 ovn-installed in OVS
Nov 22 08:09:47 compute-0 ovn_controller[94843]: 2025-11-22T08:09:47Z|00539|binding|INFO|Setting lport 1b2e31ef-a7fe-455e-b7e0-a50675608829 up in Southbound
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.481 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 systemd-machined[152872]: New machine qemu-69-instance-00000078.
Nov 22 08:09:47 compute-0 systemd-udevd[235438]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:09:47 compute-0 systemd[1]: Started Virtual Machine qemu-69-instance-00000078.
Nov 22 08:09:47 compute-0 NetworkManager[55036]: <info>  [1763798987.5038] device (tap1b2e31ef-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:09:47 compute-0 NetworkManager[55036]: <info>  [1763798987.5047] device (tap1b2e31ef-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.500 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a4bfe841-0033-44f0-afb3-3d1fbbe451e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 systemd-udevd[235449]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.507 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[eca50f94-a223-44a0-9654-fcba0dc366dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 NetworkManager[55036]: <info>  [1763798987.5110] manager: (tapc75f33da-80): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.537 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[71f74b4e-975a-4a74-a85a-c21d4e32399e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.541 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[323d220a-1d81-4508-a961-01f2f31c0dd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 podman[235431]: 2025-11-22 08:09:47.556679825 +0000 UTC m=+0.067723698 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 08:09:47 compute-0 NetworkManager[55036]: <info>  [1763798987.5630] device (tapc75f33da-80): carrier: link connected
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.568 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5665f2-6aa1-4f6a-b71a-ebbca5f8f00c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.585 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7135c6fd-0cea-4d75-adf0-63621aabc7c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc75f33da-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c8:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572820, 'reachable_time': 27609, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235482, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.601 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[21d693fd-be40-4fa1-9990-c67030839c32]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:c898'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 572820, 'tstamp': 572820}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235484, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.616 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[db1688e1-a651-4370-92e1-f91a3612b6e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc75f33da-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c8:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572820, 'reachable_time': 27609, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235486, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.647 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8e97436a-f2ca-4525-8fcd-8294e43d3d91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.702 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a4193f3d-50ed-4fb6-b7c8-0bb9d68da990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.704 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc75f33da-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.705 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.705 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc75f33da-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.707 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 kernel: tapc75f33da-80: entered promiscuous mode
Nov 22 08:09:47 compute-0 NetworkManager[55036]: <info>  [1763798987.7083] manager: (tapc75f33da-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.709 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.710 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc75f33da-80, col_values=(('external_ids', {'iface-id': 'd2b1e9d2-8364-40b7-8c31-edbcc237653b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:47 compute-0 ovn_controller[94843]: 2025-11-22T08:09:47Z|00540|binding|INFO|Releasing lport d2b1e9d2-8364-40b7-8c31-edbcc237653b from this chassis (sb_readonly=0)
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.712 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.713 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.713 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.714 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[20e65576-a56c-4afc-acb5-54e07e4ecf8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.715 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:09:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:47.717 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'env', 'PROCESS_TAG=haproxy-c75f33da-8305-4145-97ef-eef656e4f067', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c75f33da-8305-4145-97ef-eef656e4f067.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.724 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.818 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798987.8169737, 63bce965-e211-4f6a-8700-64667594e4d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.818 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] VM Started (Lifecycle Event)
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.842 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.845 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798987.8172104, 63bce965-e211-4f6a-8700-64667594e4d0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.845 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] VM Paused (Lifecycle Event)
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.875 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.883 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:09:47 compute-0 nova_compute[186544]: 2025-11-22 08:09:47.924 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:09:48 compute-0 podman[235525]: 2025-11-22 08:09:48.053422072 +0000 UTC m=+0.023100473 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:09:48 compute-0 podman[235525]: 2025-11-22 08:09:48.234193937 +0000 UTC m=+0.203872318 container create f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:09:48 compute-0 systemd[1]: Started libpod-conmon-f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e.scope.
Nov 22 08:09:48 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:09:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a43891dc5ab8c10876dc013c9af5894155da97fbf9423db9e64bf42bfa246ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:09:48 compute-0 podman[235525]: 2025-11-22 08:09:48.317915739 +0000 UTC m=+0.287594130 container init f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 08:09:48 compute-0 podman[235525]: 2025-11-22 08:09:48.323426366 +0000 UTC m=+0.293104737 container start f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:09:48 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235542]: [NOTICE]   (235546) : New worker (235548) forked
Nov 22 08:09:48 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235542]: [NOTICE]   (235546) : Loading success.
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.549 186548 DEBUG nova.network.neutron [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Updating instance_info_cache with network_info: [{"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.570 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Releasing lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.571 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Instance network_info: |[{"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.571 186548 DEBUG oslo_concurrency.lockutils [req-8fd4a0cf-8220-447f-b2f2-fe836f7b474a req-e63df9cf-c7f8-4d92-bb2f-7ecbf02f5daf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.571 186548 DEBUG nova.network.neutron [req-8fd4a0cf-8220-447f-b2f2-fe836f7b474a req-e63df9cf-c7f8-4d92-bb2f-7ecbf02f5daf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Refreshing network info cache for port a05e871f-1fed-40ac-8e55-6161c2b1f91f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.574 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Start _get_guest_xml network_info=[{"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.579 186548 WARNING nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.583 186548 DEBUG nova.virt.libvirt.host [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.584 186548 DEBUG nova.virt.libvirt.host [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.591 186548 DEBUG nova.virt.libvirt.host [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.591 186548 DEBUG nova.virt.libvirt.host [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.592 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.593 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.593 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.593 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.594 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.594 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.594 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.594 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.595 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.595 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.595 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.595 186548 DEBUG nova.virt.hardware [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.599 186548 DEBUG nova.virt.libvirt.vif [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1687759960',display_name='tempest-TestServerMultinode-server-1687759960',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1687759960',id=121,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-xdz10hmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServerMultinode-1734646453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:45Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=36bedc95-cfbe-490a-b6d2-b148301708dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.599 186548 DEBUG nova.network.os_vif_util [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.600 186548 DEBUG nova.network.os_vif_util [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:a3:0e,bridge_name='br-int',has_traffic_filtering=True,id=a05e871f-1fed-40ac-8e55-6161c2b1f91f,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa05e871f-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.601 186548 DEBUG nova.objects.instance [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'pci_devices' on Instance uuid 36bedc95-cfbe-490a-b6d2-b148301708dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.636 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <uuid>36bedc95-cfbe-490a-b6d2-b148301708dc</uuid>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <name>instance-00000079</name>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <nova:name>tempest-TestServerMultinode-server-1687759960</nova:name>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:09:48</nova:creationTime>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:09:48 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:09:48 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:09:48 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:09:48 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:09:48 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:09:48 compute-0 nova_compute[186544]:         <nova:user uuid="1bc17d213e01420ebb2a0bf75f44e357">tempest-TestServerMultinode-1734646453-project-admin</nova:user>
Nov 22 08:09:48 compute-0 nova_compute[186544]:         <nova:project uuid="b67388009f754931a62cbdd391fb4f53">tempest-TestServerMultinode-1734646453</nova:project>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:09:48 compute-0 nova_compute[186544]:         <nova:port uuid="a05e871f-1fed-40ac-8e55-6161c2b1f91f">
Nov 22 08:09:48 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <system>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <entry name="serial">36bedc95-cfbe-490a-b6d2-b148301708dc</entry>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <entry name="uuid">36bedc95-cfbe-490a-b6d2-b148301708dc</entry>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </system>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <os>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   </os>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <features>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   </features>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk.config"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:96:a3:0e"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <target dev="tapa05e871f-1f"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/console.log" append="off"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <video>
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </video>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:09:48 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:09:48 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:09:48 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:09:48 compute-0 nova_compute[186544]: </domain>
Nov 22 08:09:48 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.637 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Preparing to wait for external event network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.638 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.638 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.638 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.639 186548 DEBUG nova.virt.libvirt.vif [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1687759960',display_name='tempest-TestServerMultinode-server-1687759960',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1687759960',id=121,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-xdz10hmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServerMul
tinode-1734646453-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:45Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=36bedc95-cfbe-490a-b6d2-b148301708dc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.639 186548 DEBUG nova.network.os_vif_util [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.640 186548 DEBUG nova.network.os_vif_util [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:a3:0e,bridge_name='br-int',has_traffic_filtering=True,id=a05e871f-1fed-40ac-8e55-6161c2b1f91f,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa05e871f-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.640 186548 DEBUG os_vif [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:a3:0e,bridge_name='br-int',has_traffic_filtering=True,id=a05e871f-1fed-40ac-8e55-6161c2b1f91f,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa05e871f-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.641 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.641 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.642 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.644 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.644 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa05e871f-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.644 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa05e871f-1f, col_values=(('external_ids', {'iface-id': 'a05e871f-1fed-40ac-8e55-6161c2b1f91f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:a3:0e', 'vm-uuid': '36bedc95-cfbe-490a-b6d2-b148301708dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:48 compute-0 NetworkManager[55036]: <info>  [1763798988.6466] manager: (tapa05e871f-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.647 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.653 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.655 186548 INFO os_vif [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:a3:0e,bridge_name='br-int',has_traffic_filtering=True,id=a05e871f-1fed-40ac-8e55-6161c2b1f91f,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa05e871f-1f')
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.713 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.713 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.714 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] No VIF found with MAC fa:16:3e:96:a3:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:09:48 compute-0 nova_compute[186544]: 2025-11-22 08:09:48.714 186548 INFO nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Using config drive
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.300 186548 INFO nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Creating config drive at /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk.config
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.304 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxrqjr4bb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.336 186548 DEBUG nova.network.neutron [req-0e702213-9c8d-4455-83f5-6b7beacdc207 req-75a2f37d-93b4-4307-8d98-291baedaaadb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Updated VIF entry in instance network info cache for port 1b2e31ef-a7fe-455e-b7e0-a50675608829. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.337 186548 DEBUG nova.network.neutron [req-0e702213-9c8d-4455-83f5-6b7beacdc207 req-75a2f37d-93b4-4307-8d98-291baedaaadb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Updating instance_info_cache with network_info: [{"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.363 186548 DEBUG oslo_concurrency.lockutils [req-0e702213-9c8d-4455-83f5-6b7beacdc207 req-75a2f37d-93b4-4307-8d98-291baedaaadb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-63bce965-e211-4f6a-8700-64667594e4d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.428 186548 DEBUG oslo_concurrency.processutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxrqjr4bb" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:49 compute-0 kernel: tapa05e871f-1f: entered promiscuous mode
Nov 22 08:09:49 compute-0 NetworkManager[55036]: <info>  [1763798989.4875] manager: (tapa05e871f-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Nov 22 08:09:49 compute-0 systemd-udevd[235456]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:09:49 compute-0 ovn_controller[94843]: 2025-11-22T08:09:49Z|00541|binding|INFO|Claiming lport a05e871f-1fed-40ac-8e55-6161c2b1f91f for this chassis.
Nov 22 08:09:49 compute-0 ovn_controller[94843]: 2025-11-22T08:09:49Z|00542|binding|INFO|a05e871f-1fed-40ac-8e55-6161c2b1f91f: Claiming fa:16:3e:96:a3:0e 10.100.0.7
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.489 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.493 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:49 compute-0 NetworkManager[55036]: <info>  [1763798989.5019] device (tapa05e871f-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:09:49 compute-0 NetworkManager[55036]: <info>  [1763798989.5030] device (tapa05e871f-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.505 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:a3:0e 10.100.0.7'], port_security=['fa:16:3e:96:a3:0e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '36bedc95-cfbe-490a-b6d2-b148301708dc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b67388009f754931a62cbdd391fb4f53', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e23cfd74-a57b-4610-ab28-51062b779dc9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b005592-2b67-4b5e-87ed-f6d87ca37498, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a05e871f-1fed-40ac-8e55-6161c2b1f91f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.506 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a05e871f-1fed-40ac-8e55-6161c2b1f91f in datapath 390460fe-fb7f-40ce-abb7-9e99dea93a54 bound to our chassis
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.508 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 390460fe-fb7f-40ce-abb7-9e99dea93a54
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.519 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd3d302-6b44-4916-8a24-76b6bcb3dbc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.521 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap390460fe-f1 in ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.524 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap390460fe-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.524 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3b63d9de-63b3-4915-8759-3546c0d2e6bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.526 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[887480f7-feae-4e10-b151-9554035f9ca0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 systemd-machined[152872]: New machine qemu-70-instance-00000079.
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.538 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[945e096c-b860-4528-9b54-e00790f567c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.547 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:49 compute-0 ovn_controller[94843]: 2025-11-22T08:09:49Z|00543|binding|INFO|Setting lport a05e871f-1fed-40ac-8e55-6161c2b1f91f ovn-installed in OVS
Nov 22 08:09:49 compute-0 ovn_controller[94843]: 2025-11-22T08:09:49Z|00544|binding|INFO|Setting lport a05e871f-1fed-40ac-8e55-6161c2b1f91f up in Southbound
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.552 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6114a423-4e0b-4bd3-852e-ec65c97dfac7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.553 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:49 compute-0 systemd[1]: Started Virtual Machine qemu-70-instance-00000079.
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.601 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[105d32c0-3d0e-4489-a6cf-182f8d0c2bf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.608 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5b255f16-5180-4e79-b78f-59c47192f511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 NetworkManager[55036]: <info>  [1763798989.6101] manager: (tap390460fe-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/254)
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.641 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4d264e5b-1486-427a-85cc-b512c62a8eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.645 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5e5ccf-57c2-4ae7-aeae-9492fc34d5d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 NetworkManager[55036]: <info>  [1763798989.6857] device (tap390460fe-f0): carrier: link connected
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.691 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c6514e8d-3546-410f-bed3-c3dd17781c5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.710 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5918fc-685f-4829-ba6e-5bffae441eb1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap390460fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:0a:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573032, 'reachable_time': 33083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235593, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.726 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[89493171-67ee-4209-bf0b-bba9fea1c81f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe02:a50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 573032, 'tstamp': 573032}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235595, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.745 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b97e9aea-9158-41a6-8f61-c4f9b3f0f812]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap390460fe-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:02:0a:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 171], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573032, 'reachable_time': 33083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235596, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.776 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0286a946-f187-4279-aa60-029ba445b2c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.838 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[273cb7f7-321e-4cf0-9c7f-5dad9ef36282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.839 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap390460fe-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.840 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.841 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap390460fe-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:49 compute-0 NetworkManager[55036]: <info>  [1763798989.8435] manager: (tap390460fe-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.843 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:49 compute-0 kernel: tap390460fe-f0: entered promiscuous mode
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.845 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap390460fe-f0, col_values=(('external_ids', {'iface-id': '71a8d1b1-af34-4bcb-98ae-9fcab10d0f3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:49 compute-0 ovn_controller[94843]: 2025-11-22T08:09:49Z|00545|binding|INFO|Releasing lport 71a8d1b1-af34-4bcb-98ae-9fcab10d0f3b from this chassis (sb_readonly=0)
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.856 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798989.8561373, 36bedc95-cfbe-490a-b6d2-b148301708dc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.857 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] VM Started (Lifecycle Event)
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.859 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.860 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.860 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ab30b657-b8ae-46ae-9789-f124ef5fc382]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.862 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-390460fe-fb7f-40ce-abb7-9e99dea93a54
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/390460fe-fb7f-40ce-abb7-9e99dea93a54.pid.haproxy
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 390460fe-fb7f-40ce-abb7-9e99dea93a54
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:09:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:49.864 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'env', 'PROCESS_TAG=haproxy-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/390460fe-fb7f-40ce-abb7-9e99dea93a54.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.885 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.891 186548 DEBUG nova.compute.manager [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received event network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.891 186548 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "63bce965-e211-4f6a-8700-64667594e4d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.892 186548 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.892 186548 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.892 186548 DEBUG nova.compute.manager [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Processing event network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.892 186548 DEBUG nova.compute.manager [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received event network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.893 186548 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "63bce965-e211-4f6a-8700-64667594e4d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.893 186548 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.893 186548 DEBUG oslo_concurrency.lockutils [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.893 186548 DEBUG nova.compute.manager [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] No waiting events found dispatching network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.893 186548 WARNING nova.compute.manager [req-4242c353-222b-4e2c-806a-0755e5052241 req-a2d6a8a6-9bd4-42f7-a3d4-8a76db1c0602 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received unexpected event network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 for instance with vm_state building and task_state spawning.
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.894 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.898 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798989.85637, 36bedc95-cfbe-490a-b6d2-b148301708dc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.898 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] VM Paused (Lifecycle Event)
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.900 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.904 186548 INFO nova.virt.libvirt.driver [-] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Instance spawned successfully.
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.905 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.917 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.920 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.927 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.927 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.928 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.928 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.929 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.929 186548 DEBUG nova.virt.libvirt.driver [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.953 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.954 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798989.905543, 63bce965-e211-4f6a-8700-64667594e4d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.954 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] VM Resumed (Lifecycle Event)
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.988 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.991 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.994 186548 DEBUG nova.compute.manager [req-a7d4022a-2a41-4e7a-ac16-be10d85ec3f0 req-cf83d36a-2068-43f7-b67b-37070b95a5d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received event network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.995 186548 DEBUG oslo_concurrency.lockutils [req-a7d4022a-2a41-4e7a-ac16-be10d85ec3f0 req-cf83d36a-2068-43f7-b67b-37070b95a5d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.995 186548 DEBUG oslo_concurrency.lockutils [req-a7d4022a-2a41-4e7a-ac16-be10d85ec3f0 req-cf83d36a-2068-43f7-b67b-37070b95a5d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.995 186548 DEBUG oslo_concurrency.lockutils [req-a7d4022a-2a41-4e7a-ac16-be10d85ec3f0 req-cf83d36a-2068-43f7-b67b-37070b95a5d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.995 186548 DEBUG nova.compute.manager [req-a7d4022a-2a41-4e7a-ac16-be10d85ec3f0 req-cf83d36a-2068-43f7-b67b-37070b95a5d6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Processing event network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:09:49 compute-0 nova_compute[186544]: 2025-11-22 08:09:49.996 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.000 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.002 186548 INFO nova.virt.libvirt.driver [-] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Instance spawned successfully.
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.004 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.009 186548 INFO nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Took 7.22 seconds to spawn the instance on the hypervisor.
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.010 186548 DEBUG nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.012 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.012 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763798990.0015147, 36bedc95-cfbe-490a-b6d2-b148301708dc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.012 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] VM Resumed (Lifecycle Event)
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.025 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.025 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.026 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.026 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.026 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.027 186548 DEBUG nova.virt.libvirt.driver [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.034 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.038 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.081 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.112 186548 INFO nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Took 4.96 seconds to spawn the instance on the hypervisor.
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.112 186548 DEBUG nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.113 186548 INFO nova.compute.manager [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Took 7.80 seconds to build instance.
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.148 186548 DEBUG oslo_concurrency.lockutils [None req-793a1ed2-c163-4766-b57d-58ea17c37181 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.182 186548 INFO nova.compute.manager [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Took 5.47 seconds to build instance.
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.201 186548 DEBUG oslo_concurrency.lockutils [None req-e73b095c-e0e1-4913-8786-ac63be79915d 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:50 compute-0 podman[235634]: 2025-11-22 08:09:50.261331038 +0000 UTC m=+0.061950204 container create 6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:09:50 compute-0 systemd[1]: Started libpod-conmon-6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e.scope.
Nov 22 08:09:50 compute-0 podman[235634]: 2025-11-22 08:09:50.229716196 +0000 UTC m=+0.030335152 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:09:50 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:09:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ad0c10590aa981080733f8fa48af7d6c03d283c49f861df1599b5727bd47f9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:09:50 compute-0 podman[235634]: 2025-11-22 08:09:50.339999326 +0000 UTC m=+0.140618282 container init 6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 08:09:50 compute-0 podman[235634]: 2025-11-22 08:09:50.348872765 +0000 UTC m=+0.149491691 container start 6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:09:50 compute-0 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[235649]: [NOTICE]   (235653) : New worker (235655) forked
Nov 22 08:09:50 compute-0 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[235649]: [NOTICE]   (235653) : Loading success.
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.861 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.900 186548 DEBUG nova.network.neutron [req-8fd4a0cf-8220-447f-b2f2-fe836f7b474a req-e63df9cf-c7f8-4d92-bb2f-7ecbf02f5daf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Updated VIF entry in instance network info cache for port a05e871f-1fed-40ac-8e55-6161c2b1f91f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.901 186548 DEBUG nova.network.neutron [req-8fd4a0cf-8220-447f-b2f2-fe836f7b474a req-e63df9cf-c7f8-4d92-bb2f-7ecbf02f5daf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Updating instance_info_cache with network_info: [{"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:09:50 compute-0 nova_compute[186544]: 2025-11-22 08:09:50.913 186548 DEBUG oslo_concurrency.lockutils [req-8fd4a0cf-8220-447f-b2f2-fe836f7b474a req-e63df9cf-c7f8-4d92-bb2f-7ecbf02f5daf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.099 186548 DEBUG nova.compute.manager [req-312e12af-5694-4ff0-b4ed-0df5190ed8d4 req-ab62119a-807a-407b-8947-1392fdf02e97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received event network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.100 186548 DEBUG oslo_concurrency.lockutils [req-312e12af-5694-4ff0-b4ed-0df5190ed8d4 req-ab62119a-807a-407b-8947-1392fdf02e97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.100 186548 DEBUG oslo_concurrency.lockutils [req-312e12af-5694-4ff0-b4ed-0df5190ed8d4 req-ab62119a-807a-407b-8947-1392fdf02e97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.100 186548 DEBUG oslo_concurrency.lockutils [req-312e12af-5694-4ff0-b4ed-0df5190ed8d4 req-ab62119a-807a-407b-8947-1392fdf02e97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.101 186548 DEBUG nova.compute.manager [req-312e12af-5694-4ff0-b4ed-0df5190ed8d4 req-ab62119a-807a-407b-8947-1392fdf02e97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] No waiting events found dispatching network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.101 186548 WARNING nova.compute.manager [req-312e12af-5694-4ff0-b4ed-0df5190ed8d4 req-ab62119a-807a-407b-8947-1392fdf02e97 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received unexpected event network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f for instance with vm_state active and task_state None.
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.301 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "63bce965-e211-4f6a-8700-64667594e4d0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.301 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.302 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "63bce965-e211-4f6a-8700-64667594e4d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.302 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.303 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.311 186548 INFO nova.compute.manager [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Terminating instance
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.318 186548 DEBUG nova.compute.manager [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:09:52 compute-0 kernel: tap1b2e31ef-a7 (unregistering): left promiscuous mode
Nov 22 08:09:52 compute-0 NetworkManager[55036]: <info>  [1763798992.3562] device (tap1b2e31ef-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.359 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 ovn_controller[94843]: 2025-11-22T08:09:52Z|00546|binding|INFO|Releasing lport 1b2e31ef-a7fe-455e-b7e0-a50675608829 from this chassis (sb_readonly=0)
Nov 22 08:09:52 compute-0 ovn_controller[94843]: 2025-11-22T08:09:52Z|00547|binding|INFO|Setting lport 1b2e31ef-a7fe-455e-b7e0-a50675608829 down in Southbound
Nov 22 08:09:52 compute-0 ovn_controller[94843]: 2025-11-22T08:09:52Z|00548|binding|INFO|Removing iface tap1b2e31ef-a7 ovn-installed in OVS
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.363 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.370 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:96:9f 10.100.0.3'], port_security=['fa:16:3e:46:96:9f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '63bce965-e211-4f6a-8700-64667594e4d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1b2e31ef-a7fe-455e-b7e0-a50675608829) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.371 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1b2e31ef-a7fe-455e-b7e0-a50675608829 in datapath c75f33da-8305-4145-97ef-eef656e4f067 unbound from our chassis
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.373 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c75f33da-8305-4145-97ef-eef656e4f067, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.375 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bfcdc70c-e388-49bf-87f8-a7c73607f2c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.375 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 namespace which is not needed anymore
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.379 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000078.scope: Deactivated successfully.
Nov 22 08:09:52 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000078.scope: Consumed 2.723s CPU time.
Nov 22 08:09:52 compute-0 systemd-machined[152872]: Machine qemu-69-instance-00000078 terminated.
Nov 22 08:09:52 compute-0 podman[235665]: 2025-11-22 08:09:52.43348633 +0000 UTC m=+0.075256644 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 08:09:52 compute-0 podman[235664]: 2025-11-22 08:09:52.455251558 +0000 UTC m=+0.101467012 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:09:52 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235542]: [NOTICE]   (235546) : haproxy version is 2.8.14-c23fe91
Nov 22 08:09:52 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235542]: [NOTICE]   (235546) : path to executable is /usr/sbin/haproxy
Nov 22 08:09:52 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235542]: [WARNING]  (235546) : Exiting Master process...
Nov 22 08:09:52 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235542]: [WARNING]  (235546) : Exiting Master process...
Nov 22 08:09:52 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235542]: [ALERT]    (235546) : Current worker (235548) exited with code 143 (Terminated)
Nov 22 08:09:52 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235542]: [WARNING]  (235546) : All workers exited. Exiting... (0)
Nov 22 08:09:52 compute-0 systemd[1]: libpod-f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e.scope: Deactivated successfully.
Nov 22 08:09:52 compute-0 podman[235732]: 2025-11-22 08:09:52.52195563 +0000 UTC m=+0.052938412 container died f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.543 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.548 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e-userdata-shm.mount: Deactivated successfully.
Nov 22 08:09:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-8a43891dc5ab8c10876dc013c9af5894155da97fbf9423db9e64bf42bfa246ea-merged.mount: Deactivated successfully.
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.591 186548 INFO nova.virt.libvirt.driver [-] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Instance destroyed successfully.
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.592 186548 DEBUG nova.objects.instance [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'resources' on Instance uuid 63bce965-e211-4f6a-8700-64667594e4d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:09:52 compute-0 podman[235732]: 2025-11-22 08:09:52.59868698 +0000 UTC m=+0.129669732 container cleanup f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.603 186548 DEBUG nova.virt.libvirt.vif [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-705204345',display_name='tempest-tempest.common.compute-instance-705204345-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-705204345-2',id=120,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-11-22T08:09:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-hgk57t70',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:50Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=63bce965-e211-4f6a-8700-64667594e4d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.603 186548 DEBUG nova.network.os_vif_util [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "address": "fa:16:3e:46:96:9f", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b2e31ef-a7", "ovs_interfaceid": "1b2e31ef-a7fe-455e-b7e0-a50675608829", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.604 186548 DEBUG nova.network.os_vif_util [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:96:9f,bridge_name='br-int',has_traffic_filtering=True,id=1b2e31ef-a7fe-455e-b7e0-a50675608829,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b2e31ef-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.605 186548 DEBUG os_vif [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:96:9f,bridge_name='br-int',has_traffic_filtering=True,id=1b2e31ef-a7fe-455e-b7e0-a50675608829,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b2e31ef-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:09:52 compute-0 systemd[1]: libpod-conmon-f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e.scope: Deactivated successfully.
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.608 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b2e31ef-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.610 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.613 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.615 186548 INFO os_vif [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:96:9f,bridge_name='br-int',has_traffic_filtering=True,id=1b2e31ef-a7fe-455e-b7e0-a50675608829,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b2e31ef-a7')
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.616 186548 INFO nova.virt.libvirt.driver [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Deleting instance files /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0_del
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.616 186548 INFO nova.virt.libvirt.driver [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Deletion of /var/lib/nova/instances/63bce965-e211-4f6a-8700-64667594e4d0_del complete
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.691 186548 INFO nova.compute.manager [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.692 186548 DEBUG oslo.service.loopingcall [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.692 186548 DEBUG nova.compute.manager [-] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.692 186548 DEBUG nova.network.neutron [-] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:09:52 compute-0 podman[235776]: 2025-11-22 08:09:52.693778864 +0000 UTC m=+0.056524951 container remove f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.703 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0051c2d8-a1d2-4d70-88ff-ad8e551d1a26]: (4, ('Sat Nov 22 08:09:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 (f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e)\nf1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e\nSat Nov 22 08:09:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 (f1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e)\nf1cd6ec4ccb0c93afa256e6318dff112e46f134f5404e6a9f5b391a4bf35429e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.705 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d3c121-c77a-42a9-bf56-52589e819009]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.706 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc75f33da-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.708 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 kernel: tapc75f33da-80: left promiscuous mode
Nov 22 08:09:52 compute-0 nova_compute[186544]: 2025-11-22 08:09:52.724 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.727 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[83cee2ec-df69-4cf7-b6be-1c90bba3bc09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.753 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5340222f-89c8-4ad6-82fc-96485c1ca068]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.755 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd78c97-09fa-434c-b291-af7025fbacbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.769 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fda4c251-63be-45b3-8620-e096b330a18f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572813, 'reachable_time': 17775, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235792, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:52 compute-0 systemd[1]: run-netns-ovnmeta\x2dc75f33da\x2d8305\x2d4145\x2d97ef\x2deef656e4f067.mount: Deactivated successfully.
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.771 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:09:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:09:52.771 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0dd6f0-0af3-4d0d-81f5-f599f80162b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.170 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.353 186548 DEBUG nova.network.neutron [-] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.381 186548 INFO nova.compute.manager [-] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Took 1.69 seconds to deallocate network for instance.
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.477 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.478 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.517 186548 DEBUG nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received event network-vif-unplugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.517 186548 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "63bce965-e211-4f6a-8700-64667594e4d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.518 186548 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.518 186548 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.518 186548 DEBUG nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] No waiting events found dispatching network-vif-unplugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.519 186548 WARNING nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received unexpected event network-vif-unplugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 for instance with vm_state deleted and task_state None.
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.519 186548 DEBUG nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received event network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.519 186548 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "63bce965-e211-4f6a-8700-64667594e4d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.520 186548 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.520 186548 DEBUG oslo_concurrency.lockutils [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.520 186548 DEBUG nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] No waiting events found dispatching network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.520 186548 WARNING nova.compute.manager [req-6f9ba0e9-03f9-4d5a-aae1-3f6941a11ca4 req-ea906411-0da7-4bd4-b5e2-fed064b0d568 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received unexpected event network-vif-plugged-1b2e31ef-a7fe-455e-b7e0-a50675608829 for instance with vm_state deleted and task_state None.
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.776 186548 DEBUG nova.compute.provider_tree [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.789 186548 DEBUG nova.scheduler.client.report [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.819 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.856 186548 INFO nova.scheduler.client.report [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Deleted allocations for instance 63bce965-e211-4f6a-8700-64667594e4d0
Nov 22 08:09:54 compute-0 nova_compute[186544]: 2025-11-22 08:09:54.993 186548 DEBUG oslo_concurrency.lockutils [None req-e8fe6d40-4509-4117-91fe-1ab3fa7d6c9e c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "63bce965-e211-4f6a-8700-64667594e4d0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.149 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798980.148107, 267a26ae-5092-423d-80ce-b122467635a3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.149 186548 INFO nova.compute.manager [-] [instance: 267a26ae-5092-423d-80ce-b122467635a3] VM Stopped (Lifecycle Event)
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.167 186548 DEBUG nova.compute.manager [None req-022e97e1-03da-434c-b06f-11d19362c627 - - - - - -] [instance: 267a26ae-5092-423d-80ce-b122467635a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.180 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.181 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.181 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.181 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.241 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.304 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.305 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.365 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.531 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.532 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5544MB free_disk=73.20371627807617GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.532 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.533 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.634 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 36bedc95-cfbe-490a-b6d2-b148301708dc actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.635 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.636 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.695 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.708 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.730 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.730 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.859 186548 DEBUG nova.compute.manager [req-ef869094-827b-413f-a6aa-28f9684868f5 req-31c24832-8b14-4338-a864-433b3e2b33ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Received event network-vif-deleted-1b2e31ef-a7fe-455e-b7e0-a50675608829 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:09:55 compute-0 nova_compute[186544]: 2025-11-22 08:09:55.863 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:57 compute-0 nova_compute[186544]: 2025-11-22 08:09:57.610 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:09:57 compute-0 nova_compute[186544]: 2025-11-22 08:09:57.730 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:57 compute-0 nova_compute[186544]: 2025-11-22 08:09:57.731 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:09:57 compute-0 nova_compute[186544]: 2025-11-22 08:09:57.731 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:09:57 compute-0 nova_compute[186544]: 2025-11-22 08:09:57.909 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:09:57 compute-0 nova_compute[186544]: 2025-11-22 08:09:57.909 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:09:57 compute-0 nova_compute[186544]: 2025-11-22 08:09:57.910 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:09:57 compute-0 nova_compute[186544]: 2025-11-22 08:09:57.910 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 36bedc95-cfbe-490a-b6d2-b148301708dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:09:58 compute-0 nova_compute[186544]: 2025-11-22 08:09:58.762 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:58 compute-0 nova_compute[186544]: 2025-11-22 08:09:58.763 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:58 compute-0 nova_compute[186544]: 2025-11-22 08:09:58.789 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:09:58 compute-0 nova_compute[186544]: 2025-11-22 08:09:58.904 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:58 compute-0 nova_compute[186544]: 2025-11-22 08:09:58.905 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:58 compute-0 nova_compute[186544]: 2025-11-22 08:09:58.910 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:09:58 compute-0 nova_compute[186544]: 2025-11-22 08:09:58.910 186548 INFO nova.compute.claims [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.084 186548 DEBUG nova.compute.provider_tree [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.103 186548 DEBUG nova.scheduler.client.report [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.125 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.126 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.181 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.182 186548 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.218 186548 INFO nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.257 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.384 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.386 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.387 186548 INFO nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Creating image(s)
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.387 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "/var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.388 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "/var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.388 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "/var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.401 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.459 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.460 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.461 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.474 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.495 186548 DEBUG nova.policy [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c867ad823e59410b995507d3e85b3465', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c564dfb60114407b72d22a9c49ed513', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.535 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.538 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.566 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Updating instance_info_cache with network_info: [{"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.580 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.581 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.581 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.601 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-36bedc95-cfbe-490a-b6d2-b148301708dc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.602 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.602 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.603 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.603 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.641 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.642 186548 DEBUG nova.virt.disk.api [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Checking if we can resize image /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.642 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.697 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.698 186548 DEBUG nova.virt.disk.api [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Cannot resize image /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.698 186548 DEBUG nova.objects.instance [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.715 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.716 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Ensure instance console log exists: /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.716 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.717 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:09:59 compute-0 nova_compute[186544]: 2025-11-22 08:09:59.717 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:00 compute-0 nova_compute[186544]: 2025-11-22 08:10:00.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:10:00 compute-0 nova_compute[186544]: 2025-11-22 08:10:00.346 186548 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Successfully created port: b3aa9f06-52e1-4538-a563-9217bb6da2e5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:10:00 compute-0 nova_compute[186544]: 2025-11-22 08:10:00.865 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:01 compute-0 nova_compute[186544]: 2025-11-22 08:10:01.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:10:01 compute-0 nova_compute[186544]: 2025-11-22 08:10:01.553 186548 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Successfully updated port: b3aa9f06-52e1-4538-a563-9217bb6da2e5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:10:01 compute-0 nova_compute[186544]: 2025-11-22 08:10:01.595 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "refresh_cache-4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:10:01 compute-0 nova_compute[186544]: 2025-11-22 08:10:01.595 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquired lock "refresh_cache-4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:10:01 compute-0 nova_compute[186544]: 2025-11-22 08:10:01.595 186548 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:10:01 compute-0 nova_compute[186544]: 2025-11-22 08:10:01.709 186548 DEBUG nova.compute.manager [req-223dfe05-c647-4b47-8fc6-c9ea46626081 req-9613d2db-877f-4b4d-8c4d-f9c67216753d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Received event network-changed-b3aa9f06-52e1-4538-a563-9217bb6da2e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:01 compute-0 nova_compute[186544]: 2025-11-22 08:10:01.710 186548 DEBUG nova.compute.manager [req-223dfe05-c647-4b47-8fc6-c9ea46626081 req-9613d2db-877f-4b4d-8c4d-f9c67216753d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Refreshing instance network info cache due to event network-changed-b3aa9f06-52e1-4538-a563-9217bb6da2e5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:10:01 compute-0 nova_compute[186544]: 2025-11-22 08:10:01.710 186548 DEBUG oslo_concurrency.lockutils [req-223dfe05-c647-4b47-8fc6-c9ea46626081 req-9613d2db-877f-4b4d-8c4d-f9c67216753d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:10:02 compute-0 nova_compute[186544]: 2025-11-22 08:10:02.068 186548 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:10:02 compute-0 nova_compute[186544]: 2025-11-22 08:10:02.614 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:03 compute-0 nova_compute[186544]: 2025-11-22 08:10:03.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:10:03 compute-0 ovn_controller[94843]: 2025-11-22T08:10:03Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:96:a3:0e 10.100.0.7
Nov 22 08:10:03 compute-0 ovn_controller[94843]: 2025-11-22T08:10:03Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:96:a3:0e 10.100.0.7
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.307 186548 DEBUG nova.network.neutron [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Updating instance_info_cache with network_info: [{"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.326 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Releasing lock "refresh_cache-4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.327 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Instance network_info: |[{"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.327 186548 DEBUG oslo_concurrency.lockutils [req-223dfe05-c647-4b47-8fc6-c9ea46626081 req-9613d2db-877f-4b4d-8c4d-f9c67216753d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.327 186548 DEBUG nova.network.neutron [req-223dfe05-c647-4b47-8fc6-c9ea46626081 req-9613d2db-877f-4b4d-8c4d-f9c67216753d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Refreshing network info cache for port b3aa9f06-52e1-4538-a563-9217bb6da2e5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.330 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Start _get_guest_xml network_info=[{"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.333 186548 WARNING nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.342 186548 DEBUG nova.virt.libvirt.host [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.342 186548 DEBUG nova.virt.libvirt.host [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.349 186548 DEBUG nova.virt.libvirt.host [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.349 186548 DEBUG nova.virt.libvirt.host [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.351 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.351 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.351 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.352 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.352 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.352 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.352 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.353 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.353 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.353 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.354 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.354 186548 DEBUG nova.virt.hardware [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.357 186548 DEBUG nova.virt.libvirt.vif [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-519320456',display_name='tempest-MultipleCreateTestJSON-server-519320456-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-519320456-1',id=125,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-s5402b4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:59Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.358 186548 DEBUG nova.network.os_vif_util [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.358 186548 DEBUG nova.network.os_vif_util [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:38:5b,bridge_name='br-int',has_traffic_filtering=True,id=b3aa9f06-52e1-4538-a563-9217bb6da2e5,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3aa9f06-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.359 186548 DEBUG nova.objects.instance [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.370 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <uuid>4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1</uuid>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <name>instance-0000007d</name>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <nova:name>tempest-MultipleCreateTestJSON-server-519320456-1</nova:name>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:10:04</nova:creationTime>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:10:04 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:10:04 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:10:04 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:10:04 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:10:04 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:10:04 compute-0 nova_compute[186544]:         <nova:user uuid="c867ad823e59410b995507d3e85b3465">tempest-MultipleCreateTestJSON-1558462004-project-member</nova:user>
Nov 22 08:10:04 compute-0 nova_compute[186544]:         <nova:project uuid="9c564dfb60114407b72d22a9c49ed513">tempest-MultipleCreateTestJSON-1558462004</nova:project>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:10:04 compute-0 nova_compute[186544]:         <nova:port uuid="b3aa9f06-52e1-4538-a563-9217bb6da2e5">
Nov 22 08:10:04 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <system>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <entry name="serial">4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1</entry>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <entry name="uuid">4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1</entry>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </system>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <os>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   </os>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <features>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   </features>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk.config"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:94:38:5b"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <target dev="tapb3aa9f06-52"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/console.log" append="off"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <video>
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </video>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:10:04 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:10:04 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:10:04 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:10:04 compute-0 nova_compute[186544]: </domain>
Nov 22 08:10:04 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.372 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Preparing to wait for external event network-vif-plugged-b3aa9f06-52e1-4538-a563-9217bb6da2e5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.373 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.373 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.373 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.374 186548 DEBUG nova.virt.libvirt.vif [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-519320456',display_name='tempest-MultipleCreateTestJSON-server-519320456-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-519320456-1',id=125,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-s5402b4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:09:59Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.374 186548 DEBUG nova.network.os_vif_util [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.375 186548 DEBUG nova.network.os_vif_util [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:38:5b,bridge_name='br-int',has_traffic_filtering=True,id=b3aa9f06-52e1-4538-a563-9217bb6da2e5,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3aa9f06-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.375 186548 DEBUG os_vif [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:38:5b,bridge_name='br-int',has_traffic_filtering=True,id=b3aa9f06-52e1-4538-a563-9217bb6da2e5,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3aa9f06-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.376 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.376 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.376 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.380 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.381 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3aa9f06-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.381 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb3aa9f06-52, col_values=(('external_ids', {'iface-id': 'b3aa9f06-52e1-4538-a563-9217bb6da2e5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:38:5b', 'vm-uuid': '4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.382 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:04 compute-0 NetworkManager[55036]: <info>  [1763799004.3840] manager: (tapb3aa9f06-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.386 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.390 186548 INFO os_vif [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:38:5b,bridge_name='br-int',has_traffic_filtering=True,id=b3aa9f06-52e1-4538-a563-9217bb6da2e5,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3aa9f06-52')
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.560 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.561 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.562 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] No VIF found with MAC fa:16:3e:94:38:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:10:04 compute-0 nova_compute[186544]: 2025-11-22 08:10:04.563 186548 INFO nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Using config drive
Nov 22 08:10:05 compute-0 nova_compute[186544]: 2025-11-22 08:10:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:10:05 compute-0 nova_compute[186544]: 2025-11-22 08:10:05.867 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.375 186548 INFO nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Creating config drive at /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk.config
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.381 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk0pq_z8b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.505 186548 DEBUG oslo_concurrency.processutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk0pq_z8b" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:06 compute-0 kernel: tapb3aa9f06-52: entered promiscuous mode
Nov 22 08:10:06 compute-0 NetworkManager[55036]: <info>  [1763799006.5737] manager: (tapb3aa9f06-52): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Nov 22 08:10:06 compute-0 ovn_controller[94843]: 2025-11-22T08:10:06Z|00549|binding|INFO|Claiming lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 for this chassis.
Nov 22 08:10:06 compute-0 ovn_controller[94843]: 2025-11-22T08:10:06Z|00550|binding|INFO|b3aa9f06-52e1-4538-a563-9217bb6da2e5: Claiming fa:16:3e:94:38:5b 10.100.0.13
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.573 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:06 compute-0 ovn_controller[94843]: 2025-11-22T08:10:06Z|00551|binding|INFO|Setting lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 ovn-installed in OVS
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.591 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.594 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:06 compute-0 systemd-udevd[235848]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:10:06 compute-0 NetworkManager[55036]: <info>  [1763799006.6132] device (tapb3aa9f06-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:10:06 compute-0 NetworkManager[55036]: <info>  [1763799006.6143] device (tapb3aa9f06-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:10:06 compute-0 systemd-machined[152872]: New machine qemu-71-instance-0000007d.
Nov 22 08:10:06 compute-0 systemd[1]: Started Virtual Machine qemu-71-instance-0000007d.
Nov 22 08:10:06 compute-0 ovn_controller[94843]: 2025-11-22T08:10:06Z|00552|binding|INFO|Setting lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 up in Southbound
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.642 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:38:5b 10.100.0.13'], port_security=['fa:16:3e:94:38:5b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b3aa9f06-52e1-4538-a563-9217bb6da2e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.643 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b3aa9f06-52e1-4538-a563-9217bb6da2e5 in datapath c75f33da-8305-4145-97ef-eef656e4f067 bound to our chassis
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.645 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.655 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c2395616-20d7-48dd-a41d-6e135212ab7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.656 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc75f33da-81 in ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.658 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc75f33da-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.658 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[99c6e79f-a4b8-4088-b9e9-27b7714e89b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.659 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b14e9d1c-ebcc-4ee7-834f-59d380610f75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.671 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[f3aad87e-4a9b-4977-bab4-90fe98816a2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.684 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[edf85ef5-4fed-4964-8a59-a52825f701a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.712 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a9eea5-b6b5-4a9b-9c71-066e08cc784a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.717 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2693853d-3c73-412b-932e-27ab44ca901b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 NetworkManager[55036]: <info>  [1763799006.7184] manager: (tapc75f33da-80): new Veth device (/org/freedesktop/NetworkManager/Devices/258)
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.757 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8dda01-7ced-49bc-9092-4d6d81d59421]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.761 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0e077f21-05a3-4fa5-9308-ec374bec9770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 NetworkManager[55036]: <info>  [1763799006.7822] device (tapc75f33da-80): carrier: link connected
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.787 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e9648410-23b8-4b06-9446-7f3520ff4cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.802 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[75992ef4-8e18-45f7-b5f8-b37e4327f26f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc75f33da-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c8:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574742, 'reachable_time': 33634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235886, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.818 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc30067-aa86-40ae-b359-98614c87c8eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:c898'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574742, 'tstamp': 574742}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235891, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.835 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[aed74ac8-bf2f-490f-8d8c-3b94119ce7c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc75f33da-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:c8:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574742, 'reachable_time': 33634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235892, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.869 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2a50324d-a138-41bc-bfb4-15b68abacbb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.882 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799006.881748, 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.884 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] VM Started (Lifecycle Event)
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.912 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.918 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799006.8819008, 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.918 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] VM Paused (Lifecycle Event)
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.939 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.943 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.945 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a298befd-5ca4-46a3-81b3-fe48f2f63975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.947 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc75f33da-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.948 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.948 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc75f33da-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.950 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:06 compute-0 NetworkManager[55036]: <info>  [1763799006.9512] manager: (tapc75f33da-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Nov 22 08:10:06 compute-0 kernel: tapc75f33da-80: entered promiscuous mode
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.952 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.956 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc75f33da-80, col_values=(('external_ids', {'iface-id': 'd2b1e9d2-8364-40b7-8c31-edbcc237653b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:06 compute-0 ovn_controller[94843]: 2025-11-22T08:10:06Z|00553|binding|INFO|Releasing lport d2b1e9d2-8364-40b7-8c31-edbcc237653b from this chassis (sb_readonly=0)
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.958 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.959 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.960 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.961 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.961 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[81c8b535-44dd-4e6f-92c0-6311cb433f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.962 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c75f33da-8305-4145-97ef-eef656e4f067.pid.haproxy
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c75f33da-8305-4145-97ef-eef656e4f067
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:10:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:06.962 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'env', 'PROCESS_TAG=haproxy-c75f33da-8305-4145-97ef-eef656e4f067', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c75f33da-8305-4145-97ef-eef656e4f067.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:10:06 compute-0 nova_compute[186544]: 2025-11-22 08:10:06.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:07 compute-0 nova_compute[186544]: 2025-11-22 08:10:07.231 186548 DEBUG nova.network.neutron [req-223dfe05-c647-4b47-8fc6-c9ea46626081 req-9613d2db-877f-4b4d-8c4d-f9c67216753d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Updated VIF entry in instance network info cache for port b3aa9f06-52e1-4538-a563-9217bb6da2e5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:10:07 compute-0 nova_compute[186544]: 2025-11-22 08:10:07.231 186548 DEBUG nova.network.neutron [req-223dfe05-c647-4b47-8fc6-c9ea46626081 req-9613d2db-877f-4b4d-8c4d-f9c67216753d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Updating instance_info_cache with network_info: [{"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:10:07 compute-0 nova_compute[186544]: 2025-11-22 08:10:07.246 186548 DEBUG oslo_concurrency.lockutils [req-223dfe05-c647-4b47-8fc6-c9ea46626081 req-9613d2db-877f-4b4d-8c4d-f9c67216753d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:10:07 compute-0 podman[235925]: 2025-11-22 08:10:07.351725029 +0000 UTC m=+0.052349267 container create 28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:10:07 compute-0 systemd[1]: Started libpod-conmon-28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3.scope.
Nov 22 08:10:07 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:10:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/095f99a47ba3a2e11eb056b9cb42558f859c400b6117289167438adb69ce45ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:10:07 compute-0 podman[235925]: 2025-11-22 08:10:07.323346096 +0000 UTC m=+0.023970354 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:10:07 compute-0 podman[235925]: 2025-11-22 08:10:07.430613372 +0000 UTC m=+0.131237640 container init 28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:10:07 compute-0 podman[235925]: 2025-11-22 08:10:07.436485877 +0000 UTC m=+0.137110115 container start 28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 08:10:07 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235941]: [NOTICE]   (235945) : New worker (235947) forked
Nov 22 08:10:07 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235941]: [NOTICE]   (235945) : Loading success.
Nov 22 08:10:07 compute-0 nova_compute[186544]: 2025-11-22 08:10:07.591 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763798992.5896113, 63bce965-e211-4f6a-8700-64667594e4d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:07 compute-0 nova_compute[186544]: 2025-11-22 08:10:07.591 186548 INFO nova.compute.manager [-] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] VM Stopped (Lifecycle Event)
Nov 22 08:10:07 compute-0 nova_compute[186544]: 2025-11-22 08:10:07.615 186548 DEBUG nova.compute.manager [None req-b4733f33-c653-4713-891c-3a469d75876f - - - - - -] [instance: 63bce965-e211-4f6a-8700-64667594e4d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.073 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "36bedc95-cfbe-490a-b6d2-b148301708dc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.074 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.074 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.074 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.074 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.083 186548 INFO nova.compute.manager [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Terminating instance
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.089 186548 DEBUG nova.compute.manager [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:10:08 compute-0 kernel: tapa05e871f-1f (unregistering): left promiscuous mode
Nov 22 08:10:08 compute-0 NetworkManager[55036]: <info>  [1763799008.1196] device (tapa05e871f-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:10:08 compute-0 ovn_controller[94843]: 2025-11-22T08:10:08Z|00554|binding|INFO|Releasing lport a05e871f-1fed-40ac-8e55-6161c2b1f91f from this chassis (sb_readonly=0)
Nov 22 08:10:08 compute-0 ovn_controller[94843]: 2025-11-22T08:10:08Z|00555|binding|INFO|Setting lport a05e871f-1fed-40ac-8e55-6161c2b1f91f down in Southbound
Nov 22 08:10:08 compute-0 ovn_controller[94843]: 2025-11-22T08:10:08Z|00556|binding|INFO|Removing iface tapa05e871f-1f ovn-installed in OVS
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.130 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.132 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.147 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.147 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:a3:0e 10.100.0.7'], port_security=['fa:16:3e:96:a3:0e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '36bedc95-cfbe-490a-b6d2-b148301708dc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b67388009f754931a62cbdd391fb4f53', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e23cfd74-a57b-4610-ab28-51062b779dc9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b005592-2b67-4b5e-87ed-f6d87ca37498, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a05e871f-1fed-40ac-8e55-6161c2b1f91f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.148 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a05e871f-1fed-40ac-8e55-6161c2b1f91f in datapath 390460fe-fb7f-40ce-abb7-9e99dea93a54 unbound from our chassis
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.150 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 390460fe-fb7f-40ce-abb7-9e99dea93a54, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.151 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b979d3a6-5113-491e-9dd9-69afe1b6311f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.152 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 namespace which is not needed anymore
Nov 22 08:10:08 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000079.scope: Deactivated successfully.
Nov 22 08:10:08 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000079.scope: Consumed 14.139s CPU time.
Nov 22 08:10:08 compute-0 systemd-machined[152872]: Machine qemu-70-instance-00000079 terminated.
Nov 22 08:10:08 compute-0 podman[235961]: 2025-11-22 08:10:08.207472283 +0000 UTC m=+0.061740269 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:10:08 compute-0 podman[235962]: 2025-11-22 08:10:08.21018044 +0000 UTC m=+0.056843078 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:10:08 compute-0 podman[235956]: 2025-11-22 08:10:08.244943461 +0000 UTC m=+0.097166247 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:10:08 compute-0 podman[235963]: 2025-11-22 08:10:08.246085729 +0000 UTC m=+0.091877445 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:10:08 compute-0 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[235649]: [NOTICE]   (235653) : haproxy version is 2.8.14-c23fe91
Nov 22 08:10:08 compute-0 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[235649]: [NOTICE]   (235653) : path to executable is /usr/sbin/haproxy
Nov 22 08:10:08 compute-0 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[235649]: [WARNING]  (235653) : Exiting Master process...
Nov 22 08:10:08 compute-0 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[235649]: [ALERT]    (235653) : Current worker (235655) exited with code 143 (Terminated)
Nov 22 08:10:08 compute-0 neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54[235649]: [WARNING]  (235653) : All workers exited. Exiting... (0)
Nov 22 08:10:08 compute-0 systemd[1]: libpod-6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e.scope: Deactivated successfully.
Nov 22 08:10:08 compute-0 podman[236059]: 2025-11-22 08:10:08.291646167 +0000 UTC m=+0.050141943 container died 6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 08:10:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e-userdata-shm.mount: Deactivated successfully.
Nov 22 08:10:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-6ad0c10590aa981080733f8fa48af7d6c03d283c49f861df1599b5727bd47f9a-merged.mount: Deactivated successfully.
Nov 22 08:10:08 compute-0 podman[236059]: 2025-11-22 08:10:08.341913701 +0000 UTC m=+0.100409477 container cleanup 6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.347 186548 INFO nova.virt.libvirt.driver [-] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Instance destroyed successfully.
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.348 186548 DEBUG nova.objects.instance [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lazy-loading 'resources' on Instance uuid 36bedc95-cfbe-490a-b6d2-b148301708dc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:10:08 compute-0 systemd[1]: libpod-conmon-6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e.scope: Deactivated successfully.
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.368 186548 DEBUG nova.virt.libvirt.vif [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1687759960',display_name='tempest-TestServerMultinode-server-1687759960',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1687759960',id=121,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:09:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b67388009f754931a62cbdd391fb4f53',ramdisk_id='',reservation_id='r-xdz10hmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1734646453',owner_user_name='tempest-TestServerMultinode-1734646453-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:09:50Z,user_data=None,user_id='1bc17d213e01420ebb2a0bf75f44e357',uuid=36bedc95-cfbe-490a-b6d2-b148301708dc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.369 186548 DEBUG nova.network.os_vif_util [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converting VIF {"id": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "address": "fa:16:3e:96:a3:0e", "network": {"id": "390460fe-fb7f-40ce-abb7-9e99dea93a54", "bridge": "br-int", "label": "tempest-TestServerMultinode-1226412633-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5ea60c87d8514904b3cac3301e5875af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa05e871f-1f", "ovs_interfaceid": "a05e871f-1fed-40ac-8e55-6161c2b1f91f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.370 186548 DEBUG nova.network.os_vif_util [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:96:a3:0e,bridge_name='br-int',has_traffic_filtering=True,id=a05e871f-1fed-40ac-8e55-6161c2b1f91f,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa05e871f-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.370 186548 DEBUG os_vif [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:a3:0e,bridge_name='br-int',has_traffic_filtering=True,id=a05e871f-1fed-40ac-8e55-6161c2b1f91f,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa05e871f-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.372 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.372 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa05e871f-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.374 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.375 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.377 186548 INFO os_vif [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:a3:0e,bridge_name='br-int',has_traffic_filtering=True,id=a05e871f-1fed-40ac-8e55-6161c2b1f91f,network=Network(390460fe-fb7f-40ce-abb7-9e99dea93a54),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa05e871f-1f')
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.378 186548 INFO nova.virt.libvirt.driver [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Deleting instance files /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc_del
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.378 186548 INFO nova.virt.libvirt.driver [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Deletion of /var/lib/nova/instances/36bedc95-cfbe-490a-b6d2-b148301708dc_del complete
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.396 186548 DEBUG nova.compute.manager [req-c7d8eedf-b445-4d3b-8baf-86cb2fb3669c req-16f78d88-27f2-4d78-ba20-75e394f795f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received event network-vif-unplugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.397 186548 DEBUG oslo_concurrency.lockutils [req-c7d8eedf-b445-4d3b-8baf-86cb2fb3669c req-16f78d88-27f2-4d78-ba20-75e394f795f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.397 186548 DEBUG oslo_concurrency.lockutils [req-c7d8eedf-b445-4d3b-8baf-86cb2fb3669c req-16f78d88-27f2-4d78-ba20-75e394f795f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.397 186548 DEBUG oslo_concurrency.lockutils [req-c7d8eedf-b445-4d3b-8baf-86cb2fb3669c req-16f78d88-27f2-4d78-ba20-75e394f795f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.397 186548 DEBUG nova.compute.manager [req-c7d8eedf-b445-4d3b-8baf-86cb2fb3669c req-16f78d88-27f2-4d78-ba20-75e394f795f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] No waiting events found dispatching network-vif-unplugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.398 186548 DEBUG nova.compute.manager [req-c7d8eedf-b445-4d3b-8baf-86cb2fb3669c req-16f78d88-27f2-4d78-ba20-75e394f795f1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received event network-vif-unplugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:10:08 compute-0 podman[236105]: 2025-11-22 08:10:08.41737755 +0000 UTC m=+0.055314761 container remove 6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.422 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[66a87830-2dbc-4c10-adf0-c10f256c2f8e]: (4, ('Sat Nov 22 08:10:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 (6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e)\n6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e\nSat Nov 22 08:10:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 (6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e)\n6f0f5787a758038c61a8376b380093739ebbebf2c59f6f97a467c7e4f05d3e1e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.424 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a331e9b4-d459-4bfa-bf81-4b59bb7574ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.424 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap390460fe-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.426 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:08 compute-0 kernel: tap390460fe-f0: left promiscuous mode
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.440 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.442 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d8eff5ac-ee5e-43b2-b7c0-dcf3cd842774]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.458 186548 INFO nova.compute.manager [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.458 186548 DEBUG oslo.service.loopingcall [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.459 186548 DEBUG nova.compute.manager [-] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:10:08 compute-0 nova_compute[186544]: 2025-11-22 08:10:08.459 186548 DEBUG nova.network.neutron [-] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.460 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[efb12009-b85b-49eb-a045-2d56ddc624ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.461 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d1959c-f5e5-44c3-8171-acea3c18baf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.476 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[942f2b68-38e0-41e6-ba67-b1fa8255d2a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 573023, 'reachable_time': 39370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236118, 'error': None, 'target': 'ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d390460fe\x2dfb7f\x2d40ce\x2dabb7\x2d9e99dea93a54.mount: Deactivated successfully.
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.481 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-390460fe-fb7f-40ce-abb7-9e99dea93a54 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:10:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:08.481 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ee742131-4f75-49fd-9c5e-48ba24f90efa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.799 186548 DEBUG nova.compute.manager [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Received event network-vif-plugged-b3aa9f06-52e1-4538-a563-9217bb6da2e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.799 186548 DEBUG oslo_concurrency.lockutils [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.799 186548 DEBUG oslo_concurrency.lockutils [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.799 186548 DEBUG oslo_concurrency.lockutils [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.800 186548 DEBUG nova.compute.manager [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Processing event network-vif-plugged-b3aa9f06-52e1-4538-a563-9217bb6da2e5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.800 186548 DEBUG nova.compute.manager [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Received event network-vif-plugged-b3aa9f06-52e1-4538-a563-9217bb6da2e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.800 186548 DEBUG oslo_concurrency.lockutils [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.800 186548 DEBUG oslo_concurrency.lockutils [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.800 186548 DEBUG oslo_concurrency.lockutils [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.800 186548 DEBUG nova.compute.manager [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] No waiting events found dispatching network-vif-plugged-b3aa9f06-52e1-4538-a563-9217bb6da2e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.801 186548 WARNING nova.compute.manager [req-13e7cce3-1a6f-4570-976f-489a95eb4b52 req-249762c8-1f81-4227-9b73-6e816d25851d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Received unexpected event network-vif-plugged-b3aa9f06-52e1-4538-a563-9217bb6da2e5 for instance with vm_state building and task_state spawning.
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.801 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.806 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799009.8065944, 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.807 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] VM Resumed (Lifecycle Event)
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.809 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.812 186548 INFO nova.virt.libvirt.driver [-] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Instance spawned successfully.
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.812 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.845 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.849 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.855 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.855 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.856 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.856 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.856 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.857 186548 DEBUG nova.virt.libvirt.driver [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.903 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.905 186548 DEBUG nova.network.neutron [-] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.945 186548 INFO nova.compute.manager [-] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Took 1.49 seconds to deallocate network for instance.
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.958 186548 INFO nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Took 10.57 seconds to spawn the instance on the hypervisor.
Nov 22 08:10:09 compute-0 nova_compute[186544]: 2025-11-22 08:10:09.959 186548 DEBUG nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.044 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.045 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.053 186548 INFO nova.compute.manager [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Took 11.17 seconds to build instance.
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.083 186548 DEBUG oslo_concurrency.lockutils [None req-781d9b79-792a-422a-9df1-0b2baab653e4 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.133 186548 DEBUG nova.compute.provider_tree [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.153 186548 DEBUG nova.scheduler.client.report [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.174 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.224 186548 INFO nova.scheduler.client.report [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Deleted allocations for instance 36bedc95-cfbe-490a-b6d2-b148301708dc
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.300 186548 DEBUG oslo_concurrency.lockutils [None req-f3c62e41-5d64-44b7-a2c6-1a6259e3fb7b 1bc17d213e01420ebb2a0bf75f44e357 b67388009f754931a62cbdd391fb4f53 - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.504 186548 DEBUG nova.compute.manager [req-686a9c14-bf01-4741-8593-c01554c59c62 req-7edae77f-d114-4deb-b732-998a93576188 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received event network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.505 186548 DEBUG oslo_concurrency.lockutils [req-686a9c14-bf01-4741-8593-c01554c59c62 req-7edae77f-d114-4deb-b732-998a93576188 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.505 186548 DEBUG oslo_concurrency.lockutils [req-686a9c14-bf01-4741-8593-c01554c59c62 req-7edae77f-d114-4deb-b732-998a93576188 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.506 186548 DEBUG oslo_concurrency.lockutils [req-686a9c14-bf01-4741-8593-c01554c59c62 req-7edae77f-d114-4deb-b732-998a93576188 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "36bedc95-cfbe-490a-b6d2-b148301708dc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.506 186548 DEBUG nova.compute.manager [req-686a9c14-bf01-4741-8593-c01554c59c62 req-7edae77f-d114-4deb-b732-998a93576188 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] No waiting events found dispatching network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.506 186548 WARNING nova.compute.manager [req-686a9c14-bf01-4741-8593-c01554c59c62 req-7edae77f-d114-4deb-b732-998a93576188 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received unexpected event network-vif-plugged-a05e871f-1fed-40ac-8e55-6161c2b1f91f for instance with vm_state deleted and task_state None.
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.506 186548 DEBUG nova.compute.manager [req-686a9c14-bf01-4741-8593-c01554c59c62 req-7edae77f-d114-4deb-b732-998a93576188 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Received event network-vif-deleted-a05e871f-1fed-40ac-8e55-6161c2b1f91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:10 compute-0 nova_compute[186544]: 2025-11-22 08:10:10.869 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.136 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.137 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.137 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.137 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.137 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.144 186548 INFO nova.compute.manager [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Terminating instance
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.151 186548 DEBUG nova.compute.manager [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:10:12 compute-0 kernel: tapb3aa9f06-52 (unregistering): left promiscuous mode
Nov 22 08:10:12 compute-0 NetworkManager[55036]: <info>  [1763799012.1748] device (tapb3aa9f06-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.183 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00557|binding|INFO|Releasing lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 from this chassis (sb_readonly=0)
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00558|binding|INFO|Setting lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 down in Southbound
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00559|binding|INFO|Removing iface tapb3aa9f06-52 ovn-installed in OVS
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.185 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.191 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:38:5b 10.100.0.13'], port_security=['fa:16:3e:94:38:5b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b3aa9f06-52e1-4538-a563-9217bb6da2e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.192 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b3aa9f06-52e1-4538-a563-9217bb6da2e5 in datapath c75f33da-8305-4145-97ef-eef656e4f067 unbound from our chassis
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.194 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c75f33da-8305-4145-97ef-eef656e4f067, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.195 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bf556097-603c-4c80-ab92-611ada6af3ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.196 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 namespace which is not needed anymore
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Nov 22 08:10:12 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d0000007d.scope: Consumed 2.619s CPU time.
Nov 22 08:10:12 compute-0 systemd-machined[152872]: Machine qemu-71-instance-0000007d terminated.
Nov 22 08:10:12 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235941]: [NOTICE]   (235945) : haproxy version is 2.8.14-c23fe91
Nov 22 08:10:12 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235941]: [NOTICE]   (235945) : path to executable is /usr/sbin/haproxy
Nov 22 08:10:12 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235941]: [WARNING]  (235945) : Exiting Master process...
Nov 22 08:10:12 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235941]: [ALERT]    (235945) : Current worker (235947) exited with code 143 (Terminated)
Nov 22 08:10:12 compute-0 neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067[235941]: [WARNING]  (235945) : All workers exited. Exiting... (0)
Nov 22 08:10:12 compute-0 systemd[1]: libpod-28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3.scope: Deactivated successfully.
Nov 22 08:10:12 compute-0 podman[236143]: 2025-11-22 08:10:12.330209041 +0000 UTC m=+0.050738387 container died 28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:10:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3-userdata-shm.mount: Deactivated successfully.
Nov 22 08:10:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-095f99a47ba3a2e11eb056b9cb42558f859c400b6117289167438adb69ce45ac-merged.mount: Deactivated successfully.
Nov 22 08:10:12 compute-0 kernel: tapb3aa9f06-52: entered promiscuous mode
Nov 22 08:10:12 compute-0 NetworkManager[55036]: <info>  [1763799012.3694] manager: (tapb3aa9f06-52): new Tun device (/org/freedesktop/NetworkManager/Devices/260)
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00560|binding|INFO|Claiming lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 for this chassis.
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00561|binding|INFO|b3aa9f06-52e1-4538-a563-9217bb6da2e5: Claiming fa:16:3e:94:38:5b 10.100.0.13
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.371 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 kernel: tapb3aa9f06-52 (unregistering): left promiscuous mode
Nov 22 08:10:12 compute-0 podman[236143]: 2025-11-22 08:10:12.373452501 +0000 UTC m=+0.093981837 container cleanup 28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.377 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:38:5b 10.100.0.13'], port_security=['fa:16:3e:94:38:5b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b3aa9f06-52e1-4538-a563-9217bb6da2e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:12 compute-0 systemd[1]: libpod-conmon-28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3.scope: Deactivated successfully.
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00562|binding|INFO|Setting lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 ovn-installed in OVS
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00563|binding|INFO|Setting lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 up in Southbound
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00564|binding|INFO|Releasing lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 from this chassis (sb_readonly=1)
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00565|if_status|INFO|Not setting lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 down as sb is readonly
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.390 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00566|binding|INFO|Removing iface tapb3aa9f06-52 ovn-installed in OVS
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00567|binding|INFO|Releasing lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 from this chassis (sb_readonly=0)
Nov 22 08:10:12 compute-0 ovn_controller[94843]: 2025-11-22T08:10:12Z|00568|binding|INFO|Setting lport b3aa9f06-52e1-4538-a563-9217bb6da2e5 down in Southbound
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.404 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.412 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:38:5b 10.100.0.13'], port_security=['fa:16:3e:94:38:5b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c75f33da-8305-4145-97ef-eef656e4f067', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c564dfb60114407b72d22a9c49ed513', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b4508ee-1620-408d-af22-547cca254fde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1e17c95-3f14-4b31-90be-c563d86a1107, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b3aa9f06-52e1-4538-a563-9217bb6da2e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.433 186548 INFO nova.virt.libvirt.driver [-] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Instance destroyed successfully.
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.433 186548 DEBUG nova.objects.instance [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lazy-loading 'resources' on Instance uuid 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:10:12 compute-0 podman[236175]: 2025-11-22 08:10:12.446472339 +0000 UTC m=+0.047427914 container remove 28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.449 186548 DEBUG nova.virt.libvirt.vif [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-519320456',display_name='tempest-MultipleCreateTestJSON-server-519320456-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-519320456-1',id=125,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:10:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c564dfb60114407b72d22a9c49ed513',ramdisk_id='',reservation_id='r-s5402b4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1558462004',owner_user_name='tempest-MultipleCreateTestJSON-1558462004-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:10:10Z,user_data=None,user_id='c867ad823e59410b995507d3e85b3465',uuid=4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.450 186548 DEBUG nova.network.os_vif_util [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converting VIF {"id": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "address": "fa:16:3e:94:38:5b", "network": {"id": "c75f33da-8305-4145-97ef-eef656e4f067", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-910086432-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9c564dfb60114407b72d22a9c49ed513", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb3aa9f06-52", "ovs_interfaceid": "b3aa9f06-52e1-4538-a563-9217bb6da2e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.450 186548 DEBUG nova.network.os_vif_util [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:38:5b,bridge_name='br-int',has_traffic_filtering=True,id=b3aa9f06-52e1-4538-a563-9217bb6da2e5,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3aa9f06-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.451 186548 DEBUG os_vif [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:38:5b,bridge_name='br-int',has_traffic_filtering=True,id=b3aa9f06-52e1-4538-a563-9217bb6da2e5,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3aa9f06-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.451 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[baedd33b-e38b-4ba1-9cde-346f2dfe6a05]: (4, ('Sat Nov 22 08:10:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 (28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3)\n28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3\nSat Nov 22 08:10:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 (28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3)\n28e6c0df846333da781a58711f49ab124b83f7087b58b896091fc0f70119e3c3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.452 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.452 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3aa9f06-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.453 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[69cab08f-47a0-45b2-a9f1-6c8ad9026f36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.454 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc75f33da-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.453 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 kernel: tapc75f33da-80: left promiscuous mode
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.457 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.458 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d3cb992e-59f5-4999-8f15-87b8e80059ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.459 186548 INFO os_vif [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:38:5b,bridge_name='br-int',has_traffic_filtering=True,id=b3aa9f06-52e1-4538-a563-9217bb6da2e5,network=Network(c75f33da-8305-4145-97ef-eef656e4f067),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb3aa9f06-52')
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.460 186548 INFO nova.virt.libvirt.driver [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Deleting instance files /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1_del
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.461 186548 INFO nova.virt.libvirt.driver [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Deletion of /var/lib/nova/instances/4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1_del complete
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.467 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.475 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[36279049-3db9-48c0-9262-b1319c9d10ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.477 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b50613cf-549c-4137-86f7-2bba9cd04cf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.490 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[213352f3-294d-4375-a4e1-f77dda68a302]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574734, 'reachable_time': 22351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236197, 'error': None, 'target': 'ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.492 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c75f33da-8305-4145-97ef-eef656e4f067 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.493 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4fa41b-90fc-49f6-b97f-7d992207721f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.493 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b3aa9f06-52e1-4538-a563-9217bb6da2e5 in datapath c75f33da-8305-4145-97ef-eef656e4f067 unbound from our chassis
Nov 22 08:10:12 compute-0 systemd[1]: run-netns-ovnmeta\x2dc75f33da\x2d8305\x2d4145\x2d97ef\x2deef656e4f067.mount: Deactivated successfully.
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.495 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c75f33da-8305-4145-97ef-eef656e4f067, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.496 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[69a0023e-6321-40f1-bcba-ea1e97ab24b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.497 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b3aa9f06-52e1-4538-a563-9217bb6da2e5 in datapath c75f33da-8305-4145-97ef-eef656e4f067 unbound from our chassis
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.498 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c75f33da-8305-4145-97ef-eef656e4f067, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:10:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:12.498 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a97175ac-7a1f-4816-b593-1649bf95b386]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.530 186548 INFO nova.compute.manager [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.530 186548 DEBUG oslo.service.loopingcall [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.531 186548 DEBUG nova.compute.manager [-] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:10:12 compute-0 nova_compute[186544]: 2025-11-22 08:10:12.532 186548 DEBUG nova.network.neutron [-] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.367 186548 DEBUG nova.network.neutron [-] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.393 186548 INFO nova.compute.manager [-] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Took 1.86 seconds to deallocate network for instance.
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.515 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.515 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.519 186548 DEBUG nova.compute.manager [req-63a3b603-6ff1-4fb9-9d53-f9a0ff38fcc3 req-398da92c-9a8f-4b65-9949-8138629363b6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Received event network-vif-deleted-b3aa9f06-52e1-4538-a563-9217bb6da2e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.578 186548 DEBUG nova.compute.provider_tree [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.588 186548 DEBUG nova.scheduler.client.report [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.610 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.642 186548 INFO nova.scheduler.client.report [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Deleted allocations for instance 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.719 186548 DEBUG oslo_concurrency.lockutils [None req-f60fa86d-d678-434f-8ffe-f06d6cd0cc95 c867ad823e59410b995507d3e85b3465 9c564dfb60114407b72d22a9c49ed513 - - default default] Lock "4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:14 compute-0 nova_compute[186544]: 2025-11-22 08:10:14.974 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:15 compute-0 nova_compute[186544]: 2025-11-22 08:10:15.870 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:10:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:16.774 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:16.775 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.775 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.789 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.789 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.841 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.919 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.920 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.929 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:10:16 compute-0 nova_compute[186544]: 2025-11-22 08:10:16.930 186548 INFO nova.compute.claims [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.030 186548 DEBUG nova.compute.provider_tree [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.041 186548 DEBUG nova.scheduler.client.report [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.067 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.068 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.199 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.199 186548 DEBUG nova.network.neutron [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.268 186548 INFO nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.390 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.447 186548 DEBUG nova.policy [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '489eb690676440d3a5bd52bc11f043ae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5855d101586040f3b0c542214fe15523', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.455 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.559 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.560 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.561 186548 INFO nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Creating image(s)
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.561 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "/var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.562 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "/var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.562 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "/var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.575 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.635 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.636 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.637 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.649 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.718 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.719 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.764 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.765 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.766 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:17.776 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.832 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.834 186548 DEBUG nova.virt.disk.api [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Checking if we can resize image /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.834 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.899 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.900 186548 DEBUG nova.virt.disk.api [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Cannot resize image /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.900 186548 DEBUG nova.objects.instance [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lazy-loading 'migration_context' on Instance uuid da80db28-1519-41a1-851d-57a233d5042b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.924 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.924 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Ensure instance console log exists: /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.925 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.925 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:17 compute-0 nova_compute[186544]: 2025-11-22 08:10:17.925 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:18 compute-0 podman[236213]: 2025-11-22 08:10:18.44365051 +0000 UTC m=+0.092751287 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:10:18 compute-0 nova_compute[186544]: 2025-11-22 08:10:18.970 186548 DEBUG nova.network.neutron [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Successfully created port: b89d0a45-2675-40f9-bd05-b00e8971f117 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.529 186548 DEBUG nova.network.neutron [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Successfully updated port: b89d0a45-2675-40f9-bd05-b00e8971f117 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.547 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "refresh_cache-da80db28-1519-41a1-851d-57a233d5042b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.547 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquired lock "refresh_cache-da80db28-1519-41a1-851d-57a233d5042b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.547 186548 DEBUG nova.network.neutron [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.669 186548 DEBUG nova.compute.manager [req-6cf45e22-0e1f-4244-828f-7104d1380154 req-d1819104-aed5-4333-b5cc-b12feab1c740 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-changed-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.670 186548 DEBUG nova.compute.manager [req-6cf45e22-0e1f-4244-828f-7104d1380154 req-d1819104-aed5-4333-b5cc-b12feab1c740 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Refreshing instance network info cache due to event network-changed-b89d0a45-2675-40f9-bd05-b00e8971f117. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.670 186548 DEBUG oslo_concurrency.lockutils [req-6cf45e22-0e1f-4244-828f-7104d1380154 req-d1819104-aed5-4333-b5cc-b12feab1c740 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-da80db28-1519-41a1-851d-57a233d5042b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.788 186548 DEBUG nova.network.neutron [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:10:20 compute-0 nova_compute[186544]: 2025-11-22 08:10:20.872 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.110 186548 DEBUG nova.network.neutron [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Updating instance_info_cache with network_info: [{"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.140 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Releasing lock "refresh_cache-da80db28-1519-41a1-851d-57a233d5042b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.140 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Instance network_info: |[{"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.141 186548 DEBUG oslo_concurrency.lockutils [req-6cf45e22-0e1f-4244-828f-7104d1380154 req-d1819104-aed5-4333-b5cc-b12feab1c740 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-da80db28-1519-41a1-851d-57a233d5042b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.141 186548 DEBUG nova.network.neutron [req-6cf45e22-0e1f-4244-828f-7104d1380154 req-d1819104-aed5-4333-b5cc-b12feab1c740 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Refreshing network info cache for port b89d0a45-2675-40f9-bd05-b00e8971f117 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.143 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Start _get_guest_xml network_info=[{"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.147 186548 WARNING nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.152 186548 DEBUG nova.virt.libvirt.host [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.152 186548 DEBUG nova.virt.libvirt.host [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.159 186548 DEBUG nova.virt.libvirt.host [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.160 186548 DEBUG nova.virt.libvirt.host [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.161 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.161 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.161 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.162 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.162 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.162 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.162 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.162 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.163 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.163 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.163 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.163 186548 DEBUG nova.virt.hardware [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.167 186548 DEBUG nova.virt.libvirt.vif [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:10:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-409940739',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-409940739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-409940739',id=127,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5855d101586040f3b0c542214fe15523',ramdisk_id='',reservation_id='r-iumjzw9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-904700549
',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-904700549-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:10:17Z,user_data=None,user_id='489eb690676440d3a5bd52bc11f043ae',uuid=da80db28-1519-41a1-851d-57a233d5042b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.167 186548 DEBUG nova.network.os_vif_util [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Converting VIF {"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.168 186548 DEBUG nova.network.os_vif_util [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:7a:86,bridge_name='br-int',has_traffic_filtering=True,id=b89d0a45-2675-40f9-bd05-b00e8971f117,network=Network(e9cce02b-633c-4954-b26d-7f7a9b076d6f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb89d0a45-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.168 186548 DEBUG nova.objects.instance [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lazy-loading 'pci_devices' on Instance uuid da80db28-1519-41a1-851d-57a233d5042b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.181 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <uuid>da80db28-1519-41a1-851d-57a233d5042b</uuid>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <name>instance-0000007f</name>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-409940739</nova:name>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:10:22</nova:creationTime>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:10:22 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:10:22 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:10:22 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:10:22 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:10:22 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:10:22 compute-0 nova_compute[186544]:         <nova:user uuid="489eb690676440d3a5bd52bc11f043ae">tempest-ServersNegativeTestMultiTenantJSON-904700549-project-member</nova:user>
Nov 22 08:10:22 compute-0 nova_compute[186544]:         <nova:project uuid="5855d101586040f3b0c542214fe15523">tempest-ServersNegativeTestMultiTenantJSON-904700549</nova:project>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:10:22 compute-0 nova_compute[186544]:         <nova:port uuid="b89d0a45-2675-40f9-bd05-b00e8971f117">
Nov 22 08:10:22 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <system>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <entry name="serial">da80db28-1519-41a1-851d-57a233d5042b</entry>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <entry name="uuid">da80db28-1519-41a1-851d-57a233d5042b</entry>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </system>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <os>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   </os>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <features>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   </features>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk.config"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:25:7a:86"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <target dev="tapb89d0a45-26"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/console.log" append="off"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <video>
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </video>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:10:22 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:10:22 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:10:22 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:10:22 compute-0 nova_compute[186544]: </domain>
Nov 22 08:10:22 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.183 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Preparing to wait for external event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.183 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.183 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.183 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.184 186548 DEBUG nova.virt.libvirt.vif [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:10:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-409940739',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-409940739',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-409940739',id=127,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5855d101586040f3b0c542214fe15523',ramdisk_id='',reservation_id='r-iumjzw9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-904700549',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-904700549-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:10:17Z,user_data=None,user_id='489eb690676440d3a5bd52bc11f043ae',uuid=da80db28-1519-41a1-851d-57a233d5042b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.184 186548 DEBUG nova.network.os_vif_util [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Converting VIF {"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.185 186548 DEBUG nova.network.os_vif_util [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:7a:86,bridge_name='br-int',has_traffic_filtering=True,id=b89d0a45-2675-40f9-bd05-b00e8971f117,network=Network(e9cce02b-633c-4954-b26d-7f7a9b076d6f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb89d0a45-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.185 186548 DEBUG os_vif [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:7a:86,bridge_name='br-int',has_traffic_filtering=True,id=b89d0a45-2675-40f9-bd05-b00e8971f117,network=Network(e9cce02b-633c-4954-b26d-7f7a9b076d6f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb89d0a45-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.185 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.186 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.186 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.189 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.189 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb89d0a45-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.190 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb89d0a45-26, col_values=(('external_ids', {'iface-id': 'b89d0a45-2675-40f9-bd05-b00e8971f117', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:7a:86', 'vm-uuid': 'da80db28-1519-41a1-851d-57a233d5042b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:22 compute-0 NetworkManager[55036]: <info>  [1763799022.1919] manager: (tapb89d0a45-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.194 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.197 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.198 186548 INFO os_vif [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:7a:86,bridge_name='br-int',has_traffic_filtering=True,id=b89d0a45-2675-40f9-bd05-b00e8971f117,network=Network(e9cce02b-633c-4954-b26d-7f7a9b076d6f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb89d0a45-26')
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.241 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.242 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.242 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] No VIF found with MAC fa:16:3e:25:7a:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.243 186548 INFO nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Using config drive
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.605 186548 INFO nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Creating config drive at /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk.config
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.609 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplsnfldr6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.731 186548 DEBUG oslo_concurrency.processutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplsnfldr6" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:22 compute-0 kernel: tapb89d0a45-26: entered promiscuous mode
Nov 22 08:10:22 compute-0 NetworkManager[55036]: <info>  [1763799022.8042] manager: (tapb89d0a45-26): new Tun device (/org/freedesktop/NetworkManager/Devices/262)
Nov 22 08:10:22 compute-0 ovn_controller[94843]: 2025-11-22T08:10:22Z|00569|binding|INFO|Claiming lport b89d0a45-2675-40f9-bd05-b00e8971f117 for this chassis.
Nov 22 08:10:22 compute-0 ovn_controller[94843]: 2025-11-22T08:10:22Z|00570|binding|INFO|b89d0a45-2675-40f9-bd05-b00e8971f117: Claiming fa:16:3e:25:7a:86 10.100.0.12
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.806 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.812 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.822 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:7a:86 10.100.0.12'], port_security=['fa:16:3e:25:7a:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'da80db28-1519-41a1-851d-57a233d5042b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5855d101586040f3b0c542214fe15523', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2ac70502-6c4f-490a-89b0-7dd68676bf46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a000fd8a-e179-4cc3-960f-37cc57a44ab2, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b89d0a45-2675-40f9-bd05-b00e8971f117) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.823 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b89d0a45-2675-40f9-bd05-b00e8971f117 in datapath e9cce02b-633c-4954-b26d-7f7a9b076d6f bound to our chassis
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.825 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e9cce02b-633c-4954-b26d-7f7a9b076d6f
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.836 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[61058235-773b-4e41-81ce-918762d05565]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.837 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape9cce02b-61 in ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.838 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape9cce02b-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.838 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd9968a-cf6b-4a23-be13-97ecce3b60c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.839 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9396131a-1973-41d1-af64-19a0a1d2a0ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.850 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[18d157f3-f14f-41d6-adcb-4ef0edd3e8fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 systemd-machined[152872]: New machine qemu-72-instance-0000007f.
Nov 22 08:10:22 compute-0 podman[236248]: 2025-11-22 08:10:22.855175626 +0000 UTC m=+0.059352549 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.865 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:22 compute-0 systemd[1]: Started Virtual Machine qemu-72-instance-0000007f.
Nov 22 08:10:22 compute-0 ovn_controller[94843]: 2025-11-22T08:10:22Z|00571|binding|INFO|Setting lport b89d0a45-2675-40f9-bd05-b00e8971f117 ovn-installed in OVS
Nov 22 08:10:22 compute-0 ovn_controller[94843]: 2025-11-22T08:10:22Z|00572|binding|INFO|Setting lport b89d0a45-2675-40f9-bd05-b00e8971f117 up in Southbound
Nov 22 08:10:22 compute-0 nova_compute[186544]: 2025-11-22 08:10:22.871 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:22 compute-0 systemd-udevd[236297]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.876 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae9db02-9706-4f70-b8ab-9d0cf38657f2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 podman[236249]: 2025-11-22 08:10:22.892073181 +0000 UTC m=+0.094306947 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:10:22 compute-0 NetworkManager[55036]: <info>  [1763799022.8930] device (tapb89d0a45-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:10:22 compute-0 NetworkManager[55036]: <info>  [1763799022.8939] device (tapb89d0a45-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.908 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6657b9ef-5b1b-47df-b709-2fae060992fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 NetworkManager[55036]: <info>  [1763799022.9150] manager: (tape9cce02b-60): new Veth device (/org/freedesktop/NetworkManager/Devices/263)
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.914 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf62be5-0460-4dbe-a5f5-8d1edc54f571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.950 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3cab47-eb83-4807-ab7a-2aa41a36f52f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.954 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9417d5b6-60db-452e-9177-26479df0f4ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:22 compute-0 NetworkManager[55036]: <info>  [1763799022.9801] device (tape9cce02b-60): carrier: link connected
Nov 22 08:10:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:22.988 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4baf369e-317e-4c06-9c08-6ddb4e209cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.007 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fa9ade13-5067-4271-8698-ed835ce96242]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9cce02b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:0e:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576361, 'reachable_time': 37928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236330, 'error': None, 'target': 'ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.030 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[11026329-eaf1-448b-9dac-d0435ae41bbf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:ea9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 576361, 'tstamp': 576361}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236331, 'error': None, 'target': 'ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.052 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[482566f9-b484-45bb-bdb3-4b12f7a221f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9cce02b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:0e:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576361, 'reachable_time': 37928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236332, 'error': None, 'target': 'ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.089 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[15a248ab-38c1-44ab-b75d-6bb4384fc4c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.108 186548 DEBUG nova.compute.manager [req-d2eeb781-0461-4246-86d2-7d3e0a3aa60e req-aaea811b-df49-4dd7-95d4-037881dfaeb1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.108 186548 DEBUG oslo_concurrency.lockutils [req-d2eeb781-0461-4246-86d2-7d3e0a3aa60e req-aaea811b-df49-4dd7-95d4-037881dfaeb1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.108 186548 DEBUG oslo_concurrency.lockutils [req-d2eeb781-0461-4246-86d2-7d3e0a3aa60e req-aaea811b-df49-4dd7-95d4-037881dfaeb1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.108 186548 DEBUG oslo_concurrency.lockutils [req-d2eeb781-0461-4246-86d2-7d3e0a3aa60e req-aaea811b-df49-4dd7-95d4-037881dfaeb1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.109 186548 DEBUG nova.compute.manager [req-d2eeb781-0461-4246-86d2-7d3e0a3aa60e req-aaea811b-df49-4dd7-95d4-037881dfaeb1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Processing event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.161 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e279ee97-c610-45b3-b4fc-f6f4d3d48d27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.163 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9cce02b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.163 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.163 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9cce02b-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:23 compute-0 NetworkManager[55036]: <info>  [1763799023.1661] manager: (tape9cce02b-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Nov 22 08:10:23 compute-0 kernel: tape9cce02b-60: entered promiscuous mode
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.167 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape9cce02b-60, col_values=(('external_ids', {'iface-id': '3200ace6-dd04-43fe-970d-fc3dd3f651bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:23 compute-0 ovn_controller[94843]: 2025-11-22T08:10:23Z|00573|binding|INFO|Releasing lport 3200ace6-dd04-43fe-970d-fc3dd3f651bb from this chassis (sb_readonly=0)
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.170 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.170 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e9cce02b-633c-4954-b26d-7f7a9b076d6f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e9cce02b-633c-4954-b26d-7f7a9b076d6f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.171 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1062e654-13ef-4fb8-a39f-7bff2671720e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.172 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-e9cce02b-633c-4954-b26d-7f7a9b076d6f
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/e9cce02b-633c-4954-b26d-7f7a9b076d6f.pid.haproxy
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID e9cce02b-633c-4954-b26d-7f7a9b076d6f
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:10:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:23.172 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'env', 'PROCESS_TAG=haproxy-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e9cce02b-633c-4954-b26d-7f7a9b076d6f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.180 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.331 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799023.3311207, da80db28-1519-41a1-851d-57a233d5042b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.332 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] VM Started (Lifecycle Event)
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.334 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.337 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.341 186548 INFO nova.virt.libvirt.driver [-] [instance: da80db28-1519-41a1-851d-57a233d5042b] Instance spawned successfully.
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.342 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.346 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799008.3461933, 36bedc95-cfbe-490a-b6d2-b148301708dc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.347 186548 INFO nova.compute.manager [-] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] VM Stopped (Lifecycle Event)
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.492 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.492 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.493 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.493 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.493 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.494 186548 DEBUG nova.virt.libvirt.driver [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.497 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.498 186548 DEBUG nova.compute.manager [None req-f43b26b3-e267-41c4-a6cd-a0e5ba7353c7 - - - - - -] [instance: 36bedc95-cfbe-490a-b6d2-b148301708dc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.501 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.536 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.536 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799023.3337822, da80db28-1519-41a1-851d-57a233d5042b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.536 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] VM Paused (Lifecycle Event)
Nov 22 08:10:23 compute-0 podman[236371]: 2025-11-22 08:10:23.552569831 +0000 UTC m=+0.047529167 container create f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.566 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.570 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799023.3369753, da80db28-1519-41a1-851d-57a233d5042b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.570 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] VM Resumed (Lifecycle Event)
Nov 22 08:10:23 compute-0 systemd[1]: Started libpod-conmon-f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd.scope.
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.605 186548 INFO nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Took 6.05 seconds to spawn the instance on the hypervisor.
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.605 186548 DEBUG nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.606 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.612 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:10:23 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:10:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9dd0bc3c64f8e663e4692353e1dcc026f1719f3b75e9cbefbdd5f842a57fa9a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:10:23 compute-0 podman[236371]: 2025-11-22 08:10:23.525789017 +0000 UTC m=+0.020748383 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.639 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:10:23 compute-0 podman[236371]: 2025-11-22 08:10:23.639873062 +0000 UTC m=+0.134832418 container init f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:10:23 compute-0 podman[236371]: 2025-11-22 08:10:23.64502338 +0000 UTC m=+0.139982716 container start f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:10:23 compute-0 neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f[236386]: [NOTICE]   (236391) : New worker (236393) forked
Nov 22 08:10:23 compute-0 neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f[236386]: [NOTICE]   (236391) : Loading success.
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.698 186548 INFO nova.compute.manager [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Took 6.81 seconds to build instance.
Nov 22 08:10:23 compute-0 nova_compute[186544]: 2025-11-22 08:10:23.725 186548 DEBUG oslo_concurrency.lockutils [None req-96e43aca-f0b5-4ece-8598-f49e7fdd2c4c 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:24 compute-0 nova_compute[186544]: 2025-11-22 08:10:24.197 186548 DEBUG nova.network.neutron [req-6cf45e22-0e1f-4244-828f-7104d1380154 req-d1819104-aed5-4333-b5cc-b12feab1c740 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Updated VIF entry in instance network info cache for port b89d0a45-2675-40f9-bd05-b00e8971f117. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:10:24 compute-0 nova_compute[186544]: 2025-11-22 08:10:24.197 186548 DEBUG nova.network.neutron [req-6cf45e22-0e1f-4244-828f-7104d1380154 req-d1819104-aed5-4333-b5cc-b12feab1c740 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Updating instance_info_cache with network_info: [{"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:10:24 compute-0 nova_compute[186544]: 2025-11-22 08:10:24.211 186548 DEBUG oslo_concurrency.lockutils [req-6cf45e22-0e1f-4244-828f-7104d1380154 req-d1819104-aed5-4333-b5cc-b12feab1c740 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-da80db28-1519-41a1-851d-57a233d5042b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:10:25 compute-0 nova_compute[186544]: 2025-11-22 08:10:25.276 186548 DEBUG nova.compute.manager [req-c549cf5c-678d-43d8-9af9-804eed603459 req-da367aee-d75a-4251-ad29-b09cd73c69f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:25 compute-0 nova_compute[186544]: 2025-11-22 08:10:25.277 186548 DEBUG oslo_concurrency.lockutils [req-c549cf5c-678d-43d8-9af9-804eed603459 req-da367aee-d75a-4251-ad29-b09cd73c69f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:25 compute-0 nova_compute[186544]: 2025-11-22 08:10:25.277 186548 DEBUG oslo_concurrency.lockutils [req-c549cf5c-678d-43d8-9af9-804eed603459 req-da367aee-d75a-4251-ad29-b09cd73c69f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:25 compute-0 nova_compute[186544]: 2025-11-22 08:10:25.277 186548 DEBUG oslo_concurrency.lockutils [req-c549cf5c-678d-43d8-9af9-804eed603459 req-da367aee-d75a-4251-ad29-b09cd73c69f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:25 compute-0 nova_compute[186544]: 2025-11-22 08:10:25.277 186548 DEBUG nova.compute.manager [req-c549cf5c-678d-43d8-9af9-804eed603459 req-da367aee-d75a-4251-ad29-b09cd73c69f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] No waiting events found dispatching network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:25 compute-0 nova_compute[186544]: 2025-11-22 08:10:25.278 186548 WARNING nova.compute.manager [req-c549cf5c-678d-43d8-9af9-804eed603459 req-da367aee-d75a-4251-ad29-b09cd73c69f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received unexpected event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 for instance with vm_state active and task_state None.
Nov 22 08:10:25 compute-0 nova_compute[186544]: 2025-11-22 08:10:25.875 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.333 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.334 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.334 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.334 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.334 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.342 186548 INFO nova.compute.manager [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Terminating instance
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.347 186548 DEBUG nova.compute.manager [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:10:26 compute-0 kernel: tapb89d0a45-26 (unregistering): left promiscuous mode
Nov 22 08:10:26 compute-0 NetworkManager[55036]: <info>  [1763799026.3669] device (tapb89d0a45-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.374 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00574|binding|INFO|Releasing lport b89d0a45-2675-40f9-bd05-b00e8971f117 from this chassis (sb_readonly=0)
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00575|binding|INFO|Setting lport b89d0a45-2675-40f9-bd05-b00e8971f117 down in Southbound
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00576|binding|INFO|Removing iface tapb89d0a45-26 ovn-installed in OVS
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.377 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.390 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.394 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:7a:86 10.100.0.12'], port_security=['fa:16:3e:25:7a:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'da80db28-1519-41a1-851d-57a233d5042b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5855d101586040f3b0c542214fe15523', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2ac70502-6c4f-490a-89b0-7dd68676bf46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a000fd8a-e179-4cc3-960f-37cc57a44ab2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b89d0a45-2675-40f9-bd05-b00e8971f117) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.396 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b89d0a45-2675-40f9-bd05-b00e8971f117 in datapath e9cce02b-633c-4954-b26d-7f7a9b076d6f unbound from our chassis
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.398 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9cce02b-633c-4954-b26d-7f7a9b076d6f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.399 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc7fc5a-08b7-4470-9e2e-4204479f36b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.399 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f namespace which is not needed anymore
Nov 22 08:10:26 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000007f.scope: Deactivated successfully.
Nov 22 08:10:26 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000007f.scope: Consumed 3.456s CPU time.
Nov 22 08:10:26 compute-0 systemd-machined[152872]: Machine qemu-72-instance-0000007f terminated.
Nov 22 08:10:26 compute-0 neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f[236386]: [NOTICE]   (236391) : haproxy version is 2.8.14-c23fe91
Nov 22 08:10:26 compute-0 neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f[236386]: [NOTICE]   (236391) : path to executable is /usr/sbin/haproxy
Nov 22 08:10:26 compute-0 neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f[236386]: [WARNING]  (236391) : Exiting Master process...
Nov 22 08:10:26 compute-0 neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f[236386]: [ALERT]    (236391) : Current worker (236393) exited with code 143 (Terminated)
Nov 22 08:10:26 compute-0 neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f[236386]: [WARNING]  (236391) : All workers exited. Exiting... (0)
Nov 22 08:10:26 compute-0 systemd[1]: libpod-f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd.scope: Deactivated successfully.
Nov 22 08:10:26 compute-0 podman[236427]: 2025-11-22 08:10:26.537634565 +0000 UTC m=+0.045551828 container died f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:10:26 compute-0 kernel: tapb89d0a45-26: entered promiscuous mode
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00577|binding|INFO|Claiming lport b89d0a45-2675-40f9-bd05-b00e8971f117 for this chassis.
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00578|binding|INFO|b89d0a45-2675-40f9-bd05-b00e8971f117: Claiming fa:16:3e:25:7a:86 10.100.0.12
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.571 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 kernel: tapb89d0a45-26 (unregistering): left promiscuous mode
Nov 22 08:10:26 compute-0 NetworkManager[55036]: <info>  [1763799026.5743] manager: (tapb89d0a45-26): new Tun device (/org/freedesktop/NetworkManager/Devices/265)
Nov 22 08:10:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd-userdata-shm.mount: Deactivated successfully.
Nov 22 08:10:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-9dd0bc3c64f8e663e4692353e1dcc026f1719f3b75e9cbefbdd5f842a57fa9a8-merged.mount: Deactivated successfully.
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.586 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:7a:86 10.100.0.12'], port_security=['fa:16:3e:25:7a:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'da80db28-1519-41a1-851d-57a233d5042b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5855d101586040f3b0c542214fe15523', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2ac70502-6c4f-490a-89b0-7dd68676bf46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a000fd8a-e179-4cc3-960f-37cc57a44ab2, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b89d0a45-2675-40f9-bd05-b00e8971f117) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.591 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00579|binding|INFO|Setting lport b89d0a45-2675-40f9-bd05-b00e8971f117 ovn-installed in OVS
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00580|binding|INFO|Setting lport b89d0a45-2675-40f9-bd05-b00e8971f117 up in Southbound
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00581|binding|INFO|Releasing lport b89d0a45-2675-40f9-bd05-b00e8971f117 from this chassis (sb_readonly=1)
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00582|binding|INFO|Removing iface tapb89d0a45-26 ovn-installed in OVS
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00583|if_status|INFO|Dropped 2 log messages in last 14 seconds (most recently, 14 seconds ago) due to excessive rate
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00584|if_status|INFO|Not setting lport b89d0a45-2675-40f9-bd05-b00e8971f117 down as sb is readonly
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00585|binding|INFO|Releasing lport b89d0a45-2675-40f9-bd05-b00e8971f117 from this chassis (sb_readonly=0)
Nov 22 08:10:26 compute-0 ovn_controller[94843]: 2025-11-22T08:10:26Z|00586|binding|INFO|Setting lport b89d0a45-2675-40f9-bd05-b00e8971f117 down in Southbound
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.595 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 podman[236427]: 2025-11-22 08:10:26.599534838 +0000 UTC m=+0.107452101 container cleanup f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 08:10:26 compute-0 systemd[1]: libpod-conmon-f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd.scope: Deactivated successfully.
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.609 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.610 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:7a:86 10.100.0.12'], port_security=['fa:16:3e:25:7a:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'da80db28-1519-41a1-851d-57a233d5042b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5855d101586040f3b0c542214fe15523', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2ac70502-6c4f-490a-89b0-7dd68676bf46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a000fd8a-e179-4cc3-960f-37cc57a44ab2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b89d0a45-2675-40f9-bd05-b00e8971f117) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.622 186548 INFO nova.virt.libvirt.driver [-] [instance: da80db28-1519-41a1-851d-57a233d5042b] Instance destroyed successfully.
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.622 186548 DEBUG nova.objects.instance [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lazy-loading 'resources' on Instance uuid da80db28-1519-41a1-851d-57a233d5042b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.635 186548 DEBUG nova.virt.libvirt.vif [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:10:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-409940739',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-409940739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-409940739',id=127,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:10:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5855d101586040f3b0c542214fe15523',ramdisk_id='',reservation_id='r-iumjzw9t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-904700549',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-904700549-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:10:23Z,user_data=None,user_id='489eb690676440d3a5bd52bc11f043ae',uuid=da80db28-1519-41a1-851d-57a233d5042b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.636 186548 DEBUG nova.network.os_vif_util [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Converting VIF {"id": "b89d0a45-2675-40f9-bd05-b00e8971f117", "address": "fa:16:3e:25:7a:86", "network": {"id": "e9cce02b-633c-4954-b26d-7f7a9b076d6f", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-304061609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5855d101586040f3b0c542214fe15523", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb89d0a45-26", "ovs_interfaceid": "b89d0a45-2675-40f9-bd05-b00e8971f117", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.636 186548 DEBUG nova.network.os_vif_util [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:7a:86,bridge_name='br-int',has_traffic_filtering=True,id=b89d0a45-2675-40f9-bd05-b00e8971f117,network=Network(e9cce02b-633c-4954-b26d-7f7a9b076d6f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb89d0a45-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.637 186548 DEBUG os_vif [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:7a:86,bridge_name='br-int',has_traffic_filtering=True,id=b89d0a45-2675-40f9-bd05-b00e8971f117,network=Network(e9cce02b-633c-4954-b26d-7f7a9b076d6f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb89d0a45-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.639 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.639 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb89d0a45-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.641 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.642 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.645 186548 INFO os_vif [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:7a:86,bridge_name='br-int',has_traffic_filtering=True,id=b89d0a45-2675-40f9-bd05-b00e8971f117,network=Network(e9cce02b-633c-4954-b26d-7f7a9b076d6f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb89d0a45-26')
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.646 186548 INFO nova.virt.libvirt.driver [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Deleting instance files /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b_del
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.647 186548 INFO nova.virt.libvirt.driver [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Deletion of /var/lib/nova/instances/da80db28-1519-41a1-851d-57a233d5042b_del complete
Nov 22 08:10:26 compute-0 podman[236474]: 2025-11-22 08:10:26.680756949 +0000 UTC m=+0.045074127 container remove f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.686 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2cc374a5-a6ca-46b5-8e50-373d7f73dcbb]: (4, ('Sat Nov 22 08:10:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f (f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd)\nf350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd\nSat Nov 22 08:10:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f (f350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd)\nf350e0595d4c15088e8515909b80274527060d3a78c14abfd1abd51c25812efd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.687 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5ea363-c37a-46ab-9771-ed33200e8823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.688 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9cce02b-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.690 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 kernel: tape9cce02b-60: left promiscuous mode
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.700 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.703 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bb6a1ae8-89a6-4673-9c8f-d5973141466e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.704 186548 INFO nova.compute.manager [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.705 186548 DEBUG oslo.service.loopingcall [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.705 186548 DEBUG nova.compute.manager [-] [instance: da80db28-1519-41a1-851d-57a233d5042b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:10:26 compute-0 nova_compute[186544]: 2025-11-22 08:10:26.705 186548 DEBUG nova.network.neutron [-] [instance: da80db28-1519-41a1-851d-57a233d5042b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.722 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[49543457-b739-47e6-bb4b-038cd38de038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.723 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[885cffa0-8801-43fc-908a-ad0945cd2bba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.739 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b0640cbc-38f3-4088-b512-2d23f78114b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576354, 'reachable_time': 41152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236490, 'error': None, 'target': 'ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.743 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e9cce02b-633c-4954-b26d-7f7a9b076d6f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.743 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[f741ad3f-27e3-4fdc-be1f-1ca10b464197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 systemd[1]: run-netns-ovnmeta\x2de9cce02b\x2d633c\x2d4954\x2db26d\x2d7f7a9b076d6f.mount: Deactivated successfully.
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.744 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b89d0a45-2675-40f9-bd05-b00e8971f117 in datapath e9cce02b-633c-4954-b26d-7f7a9b076d6f unbound from our chassis
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.746 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9cce02b-633c-4954-b26d-7f7a9b076d6f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.747 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[71681155-368d-459a-b855-4c2709d0c95d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.748 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b89d0a45-2675-40f9-bd05-b00e8971f117 in datapath e9cce02b-633c-4954-b26d-7f7a9b076d6f unbound from our chassis
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.749 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9cce02b-633c-4954-b26d-7f7a9b076d6f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:10:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:26.750 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[862ce757-7064-4620-b4c9-99b55445383f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.430 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799012.4289086, 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.431 186548 INFO nova.compute.manager [-] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] VM Stopped (Lifecycle Event)
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.450 186548 DEBUG nova.compute.manager [None req-c0f2fe5c-6e18-4f3f-aab6-601f6347fa80 - - - - - -] [instance: 4b85d7c2-5f08-4b5a-9a22-5efeffa3a8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.558 186548 DEBUG nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-unplugged-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.558 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.559 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.559 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.559 186548 DEBUG nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] No waiting events found dispatching network-vif-unplugged-b89d0a45-2675-40f9-bd05-b00e8971f117 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.559 186548 DEBUG nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-unplugged-b89d0a45-2675-40f9-bd05-b00e8971f117 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.560 186548 DEBUG nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.560 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.560 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.561 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.561 186548 DEBUG nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] No waiting events found dispatching network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.561 186548 WARNING nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received unexpected event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 for instance with vm_state active and task_state deleting.
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.561 186548 DEBUG nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.562 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.562 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.562 186548 DEBUG oslo_concurrency.lockutils [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.562 186548 DEBUG nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] No waiting events found dispatching network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.562 186548 WARNING nova.compute.manager [req-1c840f0c-a262-438a-809f-373ec11ea784 req-82c2dbdd-0f00-4a02-b96f-392fc1ecdd13 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received unexpected event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 for instance with vm_state active and task_state deleting.
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.915 186548 DEBUG nova.network.neutron [-] [instance: da80db28-1519-41a1-851d-57a233d5042b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.930 186548 INFO nova.compute.manager [-] [instance: da80db28-1519-41a1-851d-57a233d5042b] Took 1.22 seconds to deallocate network for instance.
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.991 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:27 compute-0 nova_compute[186544]: 2025-11-22 08:10:27.992 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:28 compute-0 nova_compute[186544]: 2025-11-22 08:10:28.064 186548 DEBUG nova.compute.provider_tree [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:10:28 compute-0 nova_compute[186544]: 2025-11-22 08:10:28.083 186548 DEBUG nova.scheduler.client.report [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:10:28 compute-0 nova_compute[186544]: 2025-11-22 08:10:28.108 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:28 compute-0 nova_compute[186544]: 2025-11-22 08:10:28.151 186548 INFO nova.scheduler.client.report [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Deleted allocations for instance da80db28-1519-41a1-851d-57a233d5042b
Nov 22 08:10:28 compute-0 nova_compute[186544]: 2025-11-22 08:10:28.211 186548 DEBUG oslo_concurrency.lockutils [None req-2465611c-204f-40f5-871c-badcc1746400 489eb690676440d3a5bd52bc11f043ae 5855d101586040f3b0c542214fe15523 - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.685 186548 DEBUG nova.compute.manager [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.685 186548 DEBUG oslo_concurrency.lockutils [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.687 186548 DEBUG oslo_concurrency.lockutils [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.687 186548 DEBUG oslo_concurrency.lockutils [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.688 186548 DEBUG nova.compute.manager [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] No waiting events found dispatching network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.689 186548 WARNING nova.compute.manager [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received unexpected event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 for instance with vm_state deleted and task_state None.
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.689 186548 DEBUG nova.compute.manager [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-deleted-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.690 186548 DEBUG nova.compute.manager [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.690 186548 DEBUG oslo_concurrency.lockutils [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "da80db28-1519-41a1-851d-57a233d5042b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.691 186548 DEBUG oslo_concurrency.lockutils [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.691 186548 DEBUG oslo_concurrency.lockutils [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "da80db28-1519-41a1-851d-57a233d5042b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.692 186548 DEBUG nova.compute.manager [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] No waiting events found dispatching network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:10:29 compute-0 nova_compute[186544]: 2025-11-22 08:10:29.693 186548 WARNING nova.compute.manager [req-b923d0c5-525b-4d5d-a297-c87286b943a8 req-0bdf363d-19e7-48aa-8d58-c9f8ce40cfc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: da80db28-1519-41a1-851d-57a233d5042b] Received unexpected event network-vif-plugged-b89d0a45-2675-40f9-bd05-b00e8971f117 for instance with vm_state deleted and task_state None.
Nov 22 08:10:30 compute-0 nova_compute[186544]: 2025-11-22 08:10:30.876 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:31 compute-0 nova_compute[186544]: 2025-11-22 08:10:31.642 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:35 compute-0 nova_compute[186544]: 2025-11-22 08:10:35.878 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.596 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:10:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:10:36 compute-0 nova_compute[186544]: 2025-11-22 08:10:36.645 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:37.338 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:37.339 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:10:37.339 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:37 compute-0 nova_compute[186544]: 2025-11-22 08:10:37.504 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:38 compute-0 podman[236491]: 2025-11-22 08:10:38.421355215 +0000 UTC m=+0.072164018 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:10:38 compute-0 podman[236492]: 2025-11-22 08:10:38.421434876 +0000 UTC m=+0.069112111 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:10:38 compute-0 podman[236493]: 2025-11-22 08:10:38.424479962 +0000 UTC m=+0.065002250 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:10:38 compute-0 podman[236499]: 2025-11-22 08:10:38.449120042 +0000 UTC m=+0.087899947 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:10:40 compute-0 nova_compute[186544]: 2025-11-22 08:10:40.879 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:41 compute-0 nova_compute[186544]: 2025-11-22 08:10:41.619 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799026.6185262, da80db28-1519-41a1-851d-57a233d5042b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:10:41 compute-0 nova_compute[186544]: 2025-11-22 08:10:41.620 186548 INFO nova.compute.manager [-] [instance: da80db28-1519-41a1-851d-57a233d5042b] VM Stopped (Lifecycle Event)
Nov 22 08:10:41 compute-0 nova_compute[186544]: 2025-11-22 08:10:41.641 186548 DEBUG nova.compute.manager [None req-c1d1bd85-b9eb-4344-8848-595fe8ca3048 - - - - - -] [instance: da80db28-1519-41a1-851d-57a233d5042b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:10:41 compute-0 nova_compute[186544]: 2025-11-22 08:10:41.648 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:45 compute-0 nova_compute[186544]: 2025-11-22 08:10:45.880 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:46 compute-0 nova_compute[186544]: 2025-11-22 08:10:46.649 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:49 compute-0 podman[236582]: 2025-11-22 08:10:49.404101692 +0000 UTC m=+0.054894070 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118)
Nov 22 08:10:50 compute-0 nova_compute[186544]: 2025-11-22 08:10:50.882 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:51 compute-0 nova_compute[186544]: 2025-11-22 08:10:51.652 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:53 compute-0 podman[236602]: 2025-11-22 08:10:53.399396806 +0000 UTC m=+0.048096172 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:10:53 compute-0 podman[236603]: 2025-11-22 08:10:53.422346744 +0000 UTC m=+0.066830196 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, release=1755695350)
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.397 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.397 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.419 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.536 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.537 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.543 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.544 186548 INFO nova.compute.claims [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.693 186548 DEBUG nova.scheduler.client.report [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.714 186548 DEBUG nova.scheduler.client.report [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.715 186548 DEBUG nova.compute.provider_tree [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.730 186548 DEBUG nova.scheduler.client.report [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.753 186548 DEBUG nova.scheduler.client.report [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.794 186548 DEBUG nova.compute.provider_tree [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.818 186548 DEBUG nova.scheduler.client.report [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.845 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.845 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.883 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.917 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.917 186548 DEBUG nova.network.neutron [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.934 186548 INFO nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:10:55 compute-0 nova_compute[186544]: 2025-11-22 08:10:55.948 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.044 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.046 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.046 186548 INFO nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Creating image(s)
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.047 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.047 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.048 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.060 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.119 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.120 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.121 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.132 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.183 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.185 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.188 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.189 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.223 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.225 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.225 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.244 186548 DEBUG nova.policy [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf64193131de4d458bf1bd37c21125f6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e77afcde171b45e6bb008c9dff8ffb44', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.281 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.281 186548 DEBUG nova.virt.disk.api [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Checking if we can resize image /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.282 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.339 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.340 186548 DEBUG nova.virt.disk.api [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Cannot resize image /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.340 186548 DEBUG nova.objects.instance [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'migration_context' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.357 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.357 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Ensure instance console log exists: /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.358 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.358 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.358 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.422 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.424 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.20442199707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.424 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.425 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.545 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance b697e0e3-6ab8-4e90-b8e9-e72e362283da actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.545 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.546 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.601 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.612 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.633 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.634 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:10:56 compute-0 nova_compute[186544]: 2025-11-22 08:10:56.655 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:10:58 compute-0 nova_compute[186544]: 2025-11-22 08:10:58.503 186548 DEBUG nova.network.neutron [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Successfully created port: 7b325460-e116-46b4-b42f-3595a0a85fa1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:10:59 compute-0 nova_compute[186544]: 2025-11-22 08:10:59.633 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:10:59 compute-0 nova_compute[186544]: 2025-11-22 08:10:59.633 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:10:59 compute-0 nova_compute[186544]: 2025-11-22 08:10:59.634 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:10:59 compute-0 nova_compute[186544]: 2025-11-22 08:10:59.661 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 08:10:59 compute-0 nova_compute[186544]: 2025-11-22 08:10:59.662 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.716 186548 DEBUG nova.network.neutron [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Successfully updated port: 7b325460-e116-46b4-b42f-3595a0a85fa1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.733 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.734 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.734 186548 DEBUG nova.network.neutron [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.885 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.903 186548 DEBUG nova.compute.manager [req-6638f44d-d24f-4954-9c80-2c136572f002 req-466eeb99-0928-4ff0-aeab-09fac745baf1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.904 186548 DEBUG nova.compute.manager [req-6638f44d-d24f-4954-9c80-2c136572f002 req-466eeb99-0928-4ff0-aeab-09fac745baf1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing instance network info cache due to event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.904 186548 DEBUG oslo_concurrency.lockutils [req-6638f44d-d24f-4954-9c80-2c136572f002 req-466eeb99-0928-4ff0-aeab-09fac745baf1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:11:00 compute-0 nova_compute[186544]: 2025-11-22 08:11:00.978 186548 DEBUG nova.network.neutron [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:11:01 compute-0 nova_compute[186544]: 2025-11-22 08:11:01.658 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:03 compute-0 nova_compute[186544]: 2025-11-22 08:11:03.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:11:03 compute-0 nova_compute[186544]: 2025-11-22 08:11:03.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.007 186548 DEBUG nova.network.neutron [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.029 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.030 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance network_info: |[{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.030 186548 DEBUG oslo_concurrency.lockutils [req-6638f44d-d24f-4954-9c80-2c136572f002 req-466eeb99-0928-4ff0-aeab-09fac745baf1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.030 186548 DEBUG nova.network.neutron [req-6638f44d-d24f-4954-9c80-2c136572f002 req-466eeb99-0928-4ff0-aeab-09fac745baf1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.033 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Start _get_guest_xml network_info=[{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.037 186548 WARNING nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.042 186548 DEBUG nova.virt.libvirt.host [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.043 186548 DEBUG nova.virt.libvirt.host [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.051 186548 DEBUG nova.virt.libvirt.host [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.052 186548 DEBUG nova.virt.libvirt.host [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.053 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.053 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.053 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.054 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.054 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.054 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.054 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.054 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.055 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.055 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.055 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.055 186548 DEBUG nova.virt.hardware [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.059 186548 DEBUG nova.virt.libvirt.vif [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-551674936',display_name='tempest-TestShelveInstance-server-551674936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-551674936',id=129,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAI0aHL6Ou2IJSgJR8KmWp5jvk1H0nenSFPpcgEZ2nM/U0LfRgUiMbaK3XQDYYBX/t4paqNprX8CiArX2uXN/07J1M2q50BshVpGq5Y37se64GB7gIu++lFp7/cMvXmcGQ==',key_name='tempest-TestShelveInstance-1754802104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e77afcde171b45e6bb008c9dff8ffb44',ramdisk_id='',reservation_id='r-kc0vpko6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-2007945223',owner_user_name='tempest-TestShelveInstance-2007945223-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:10:55Z,user_data=None,user_id='cf64193131de4d458bf1bd37c21125f6',uuid=b697e0e3-6ab8-4e90-b8e9-e72e362283da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.059 186548 DEBUG nova.network.os_vif_util [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converting VIF {"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.060 186548 DEBUG nova.network.os_vif_util [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.061 186548 DEBUG nova.objects.instance [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'pci_devices' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.075 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <uuid>b697e0e3-6ab8-4e90-b8e9-e72e362283da</uuid>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <name>instance-00000081</name>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <nova:name>tempest-TestShelveInstance-server-551674936</nova:name>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:11:04</nova:creationTime>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:11:04 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:11:04 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:11:04 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:11:04 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:11:04 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:11:04 compute-0 nova_compute[186544]:         <nova:user uuid="cf64193131de4d458bf1bd37c21125f6">tempest-TestShelveInstance-2007945223-project-member</nova:user>
Nov 22 08:11:04 compute-0 nova_compute[186544]:         <nova:project uuid="e77afcde171b45e6bb008c9dff8ffb44">tempest-TestShelveInstance-2007945223</nova:project>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:11:04 compute-0 nova_compute[186544]:         <nova:port uuid="7b325460-e116-46b4-b42f-3595a0a85fa1">
Nov 22 08:11:04 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <system>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <entry name="serial">b697e0e3-6ab8-4e90-b8e9-e72e362283da</entry>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <entry name="uuid">b697e0e3-6ab8-4e90-b8e9-e72e362283da</entry>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </system>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <os>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   </os>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <features>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   </features>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.config"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:1f:39:14"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <target dev="tap7b325460-e1"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/console.log" append="off"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <video>
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </video>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:11:04 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:11:04 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:11:04 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:11:04 compute-0 nova_compute[186544]: </domain>
Nov 22 08:11:04 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.076 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Preparing to wait for external event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.077 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.077 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.077 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.078 186548 DEBUG nova.virt.libvirt.vif [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-551674936',display_name='tempest-TestShelveInstance-server-551674936',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-551674936',id=129,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAI0aHL6Ou2IJSgJR8KmWp5jvk1H0nenSFPpcgEZ2nM/U0LfRgUiMbaK3XQDYYBX/t4paqNprX8CiArX2uXN/07J1M2q50BshVpGq5Y37se64GB7gIu++lFp7/cMvXmcGQ==',key_name='tempest-TestShelveInstance-1754802104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e77afcde171b45e6bb008c9dff8ffb44',ramdisk_id='',reservation_id='r-kc0vpko6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-2007945223',owner_user_name='tempest-TestShelveInstance-2007945223-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:10:55Z,user_data=None,user_id='cf64193131de4d458bf1bd37c21125f6',uuid=b697e0e3-6ab8-4e90-b8e9-e72e362283da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.078 186548 DEBUG nova.network.os_vif_util [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converting VIF {"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.079 186548 DEBUG nova.network.os_vif_util [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.079 186548 DEBUG os_vif [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.080 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.080 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.080 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.082 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.083 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b325460-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.083 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b325460-e1, col_values=(('external_ids', {'iface-id': '7b325460-e116-46b4-b42f-3595a0a85fa1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:39:14', 'vm-uuid': 'b697e0e3-6ab8-4e90-b8e9-e72e362283da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.084 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:04 compute-0 NetworkManager[55036]: <info>  [1763799064.0856] manager: (tap7b325460-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.087 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.090 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.091 186548 INFO os_vif [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1')
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.141 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.141 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.141 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] No VIF found with MAC fa:16:3e:1f:39:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.142 186548 INFO nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Using config drive
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.748 186548 INFO nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Creating config drive at /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.config
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.756 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5jnudt9n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.884 186548 DEBUG oslo_concurrency.processutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5jnudt9n" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:04 compute-0 kernel: tap7b325460-e1: entered promiscuous mode
Nov 22 08:11:04 compute-0 NetworkManager[55036]: <info>  [1763799064.9458] manager: (tap7b325460-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Nov 22 08:11:04 compute-0 nova_compute[186544]: 2025-11-22 08:11:04.946 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:04 compute-0 ovn_controller[94843]: 2025-11-22T08:11:04Z|00587|binding|INFO|Claiming lport 7b325460-e116-46b4-b42f-3595a0a85fa1 for this chassis.
Nov 22 08:11:04 compute-0 ovn_controller[94843]: 2025-11-22T08:11:04Z|00588|binding|INFO|7b325460-e116-46b4-b42f-3595a0a85fa1: Claiming fa:16:3e:1f:39:14 10.100.0.4
Nov 22 08:11:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:04.965 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:39:14 10.100.0.4'], port_security=['fa:16:3e:1f:39:14 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b697e0e3-6ab8-4e90-b8e9-e72e362283da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e77afcde171b45e6bb008c9dff8ffb44', 'neutron:revision_number': '2', 'neutron:security_group_ids': '74fef083-1145-4a81-b923-db02721f22a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6709d38e-861b-4fc2-860f-7d0aaf6cf724, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=7b325460-e116-46b4-b42f-3595a0a85fa1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:11:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:04.968 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 7b325460-e116-46b4-b42f-3595a0a85fa1 in datapath 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f bound to our chassis
Nov 22 08:11:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:04.969 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f
Nov 22 08:11:04 compute-0 systemd-udevd[236681]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:11:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:04.981 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2835e9-2694-4479-81a8-9ea109b9546d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:04.983 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8b2fd30e-d1 in ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:11:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:04.985 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8b2fd30e-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:11:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:04.986 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b783ddbe-9014-43ed-b0a0-e23fba1087f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:04.989 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d93ef62d-6202-40fb-9297-541221395eef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:04 compute-0 NetworkManager[55036]: <info>  [1763799064.9924] device (tap7b325460-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:11:04 compute-0 systemd-machined[152872]: New machine qemu-73-instance-00000081.
Nov 22 08:11:04 compute-0 NetworkManager[55036]: <info>  [1763799064.9939] device (tap7b325460-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.000 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0d3788-6904-4bd8-81f7-c9d999bf1ccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.004 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:05 compute-0 systemd[1]: Started Virtual Machine qemu-73-instance-00000081.
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.010 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:05 compute-0 ovn_controller[94843]: 2025-11-22T08:11:05Z|00589|binding|INFO|Setting lport 7b325460-e116-46b4-b42f-3595a0a85fa1 ovn-installed in OVS
Nov 22 08:11:05 compute-0 ovn_controller[94843]: 2025-11-22T08:11:05Z|00590|binding|INFO|Setting lport 7b325460-e116-46b4-b42f-3595a0a85fa1 up in Southbound
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.013 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.015 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[464ea34a-9b2c-420e-a0d9-23e85d7bb0e5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.045 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8f2db8-4203-487a-8f8d-f86e073f4679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 NetworkManager[55036]: <info>  [1763799065.0519] manager: (tap8b2fd30e-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.051 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[21379126-0dcd-4a09-991f-687f78c15cf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.084 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa5226b-1211-4202-858a-5a424e4dce34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.088 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8d9e7de5-76ad-4362-81ef-09cfbc56c00d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 NetworkManager[55036]: <info>  [1763799065.1107] device (tap8b2fd30e-d0): carrier: link connected
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.115 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[530a8a33-053f-4f24-b031-fa844be01c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.131 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4361af1f-96d3-435b-8720-d8dd63df0cba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b2fd30e-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:48:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580574, 'reachable_time': 19939, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236715, 'error': None, 'target': 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.147 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8a06a6f1-7323-4292-b639-876e68eb1596]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb6:4873'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580574, 'tstamp': 580574}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236716, 'error': None, 'target': 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.164 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[04ede6b1-1923-40e5-9621-51f5a1e96890]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b2fd30e-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:48:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 181], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580574, 'reachable_time': 19939, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236717, 'error': None, 'target': 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.198 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2a4963-7d3c-4e27-bfbc-38bbecf9b760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.254 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[25bfaa07-2cf1-48a5-b6ca-6d64d0294d2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.256 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b2fd30e-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.256 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.257 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b2fd30e-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.259 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:05 compute-0 NetworkManager[55036]: <info>  [1763799065.2600] manager: (tap8b2fd30e-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Nov 22 08:11:05 compute-0 kernel: tap8b2fd30e-d0: entered promiscuous mode
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.261 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.262 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8b2fd30e-d0, col_values=(('external_ids', {'iface-id': 'ef9893d7-8bc5-4970-a831-1d511ec4e023'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:05 compute-0 ovn_controller[94843]: 2025-11-22T08:11:05Z|00591|binding|INFO|Releasing lport ef9893d7-8bc5-4970-a831-1d511ec4e023 from this chassis (sb_readonly=0)
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.264 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.275 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.276 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.277 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7a0870-b697-4677-9ab3-6059b745bf6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.278 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f.pid.haproxy
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:11:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:05.279 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'env', 'PROCESS_TAG=haproxy-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.322 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799065.322136, b697e0e3-6ab8-4e90-b8e9-e72e362283da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.323 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] VM Started (Lifecycle Event)
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.343 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.349 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799065.3222506, b697e0e3-6ab8-4e90-b8e9-e72e362283da => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.349 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] VM Paused (Lifecycle Event)
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.367 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.371 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.398 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:11:05 compute-0 podman[236756]: 2025-11-22 08:11:05.672055855 +0000 UTC m=+0.056105230 container create b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:11:05 compute-0 systemd[1]: Started libpod-conmon-b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3.scope.
Nov 22 08:11:05 compute-0 podman[236756]: 2025-11-22 08:11:05.641236832 +0000 UTC m=+0.025286237 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:11:05 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:11:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db35481592e2c5ffbccb55d37fe9350be045a76ce341013a9281f5c85fc97ba7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:11:05 compute-0 podman[236756]: 2025-11-22 08:11:05.76193159 +0000 UTC m=+0.145980995 container init b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:11:05 compute-0 podman[236756]: 2025-11-22 08:11:05.767373275 +0000 UTC m=+0.151422660 container start b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 08:11:05 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[236771]: [NOTICE]   (236775) : New worker (236777) forked
Nov 22 08:11:05 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[236771]: [NOTICE]   (236775) : Loading success.
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.863 186548 DEBUG nova.network.neutron [req-6638f44d-d24f-4954-9c80-2c136572f002 req-466eeb99-0928-4ff0-aeab-09fac745baf1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updated VIF entry in instance network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.865 186548 DEBUG nova.network.neutron [req-6638f44d-d24f-4954-9c80-2c136572f002 req-466eeb99-0928-4ff0-aeab-09fac745baf1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.886 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:05 compute-0 nova_compute[186544]: 2025-11-22 08:11:05.890 186548 DEBUG oslo_concurrency.lockutils [req-6638f44d-d24f-4954-9c80-2c136572f002 req-466eeb99-0928-4ff0-aeab-09fac745baf1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:11:09 compute-0 nova_compute[186544]: 2025-11-22 08:11:09.087 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:09 compute-0 podman[236788]: 2025-11-22 08:11:09.41614717 +0000 UTC m=+0.056368606 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:11:09 compute-0 podman[236786]: 2025-11-22 08:11:09.42101089 +0000 UTC m=+0.066912887 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:11:09 compute-0 podman[236787]: 2025-11-22 08:11:09.434567676 +0000 UTC m=+0.078898194 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 22 08:11:09 compute-0 podman[236789]: 2025-11-22 08:11:09.454308035 +0000 UTC m=+0.091605559 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.889 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.927 186548 DEBUG nova.compute.manager [req-048b3999-70f5-43c1-8235-3270c09cf2ec req-348e6c34-23be-4c66-b5e5-c67fd72f6b64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.928 186548 DEBUG oslo_concurrency.lockutils [req-048b3999-70f5-43c1-8235-3270c09cf2ec req-348e6c34-23be-4c66-b5e5-c67fd72f6b64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.928 186548 DEBUG oslo_concurrency.lockutils [req-048b3999-70f5-43c1-8235-3270c09cf2ec req-348e6c34-23be-4c66-b5e5-c67fd72f6b64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.928 186548 DEBUG oslo_concurrency.lockutils [req-048b3999-70f5-43c1-8235-3270c09cf2ec req-348e6c34-23be-4c66-b5e5-c67fd72f6b64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.928 186548 DEBUG nova.compute.manager [req-048b3999-70f5-43c1-8235-3270c09cf2ec req-348e6c34-23be-4c66-b5e5-c67fd72f6b64 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Processing event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.929 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.932 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799070.9326646, b697e0e3-6ab8-4e90-b8e9-e72e362283da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.933 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] VM Resumed (Lifecycle Event)
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.934 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.937 186548 INFO nova.virt.libvirt.driver [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance spawned successfully.
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.938 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.951 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.956 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.959 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.960 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.960 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.961 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.961 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.961 186548 DEBUG nova.virt.libvirt.driver [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:11:10 compute-0 nova_compute[186544]: 2025-11-22 08:11:10.981 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:11:11 compute-0 nova_compute[186544]: 2025-11-22 08:11:11.015 186548 INFO nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Took 14.97 seconds to spawn the instance on the hypervisor.
Nov 22 08:11:11 compute-0 nova_compute[186544]: 2025-11-22 08:11:11.015 186548 DEBUG nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:11 compute-0 nova_compute[186544]: 2025-11-22 08:11:11.092 186548 INFO nova.compute.manager [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Took 15.59 seconds to build instance.
Nov 22 08:11:11 compute-0 nova_compute[186544]: 2025-11-22 08:11:11.125 186548 DEBUG oslo_concurrency.lockutils [None req-95395ab7-049b-4330-8531-a9db761e88b9 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:13 compute-0 nova_compute[186544]: 2025-11-22 08:11:13.158 186548 DEBUG nova.compute.manager [req-b25091ae-e80e-4449-8ebf-e831ca8c1ba0 req-568365b2-849f-41fb-97f2-91a37d09d098 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:13 compute-0 nova_compute[186544]: 2025-11-22 08:11:13.159 186548 DEBUG oslo_concurrency.lockutils [req-b25091ae-e80e-4449-8ebf-e831ca8c1ba0 req-568365b2-849f-41fb-97f2-91a37d09d098 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:13 compute-0 nova_compute[186544]: 2025-11-22 08:11:13.159 186548 DEBUG oslo_concurrency.lockutils [req-b25091ae-e80e-4449-8ebf-e831ca8c1ba0 req-568365b2-849f-41fb-97f2-91a37d09d098 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:13 compute-0 nova_compute[186544]: 2025-11-22 08:11:13.159 186548 DEBUG oslo_concurrency.lockutils [req-b25091ae-e80e-4449-8ebf-e831ca8c1ba0 req-568365b2-849f-41fb-97f2-91a37d09d098 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:13 compute-0 nova_compute[186544]: 2025-11-22 08:11:13.159 186548 DEBUG nova.compute.manager [req-b25091ae-e80e-4449-8ebf-e831ca8c1ba0 req-568365b2-849f-41fb-97f2-91a37d09d098 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] No waiting events found dispatching network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:11:13 compute-0 nova_compute[186544]: 2025-11-22 08:11:13.159 186548 WARNING nova.compute.manager [req-b25091ae-e80e-4449-8ebf-e831ca8c1ba0 req-568365b2-849f-41fb-97f2-91a37d09d098 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received unexpected event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 for instance with vm_state active and task_state None.
Nov 22 08:11:14 compute-0 nova_compute[186544]: 2025-11-22 08:11:14.091 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:15.624 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:2c:69 2001:db8:0:1:f816:3eff:fe30:2c69 2001:db8::f816:3eff:fe30:2c69'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe30:2c69/64 2001:db8::f816:3eff:fe30:2c69/64', 'neutron:device_id': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d81a98b9-7f60-4da8-a82f-30c94c08d498, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f86e6fc7-3969-4922-9612-9c86d85f21ec) old=Port_Binding(mac=['fa:16:3e:30:2c:69 2001:db8::f816:3eff:fe30:2c69'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe30:2c69/64', 'neutron:device_id': 'ovnmeta-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:11:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:15.626 103805 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f86e6fc7-3969-4922-9612-9c86d85f21ec in datapath 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94 updated
Nov 22 08:11:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:15.627 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b7e9f2d-2098-4c6b-8c54-a96b7bb53c94, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:11:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:15.628 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5eec8327-b797-42a0-bb67-07711a228714]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:15 compute-0 nova_compute[186544]: 2025-11-22 08:11:15.891 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:17 compute-0 nova_compute[186544]: 2025-11-22 08:11:17.618 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:17 compute-0 NetworkManager[55036]: <info>  [1763799077.6220] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Nov 22 08:11:17 compute-0 NetworkManager[55036]: <info>  [1763799077.6237] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Nov 22 08:11:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:17.824 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:11:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:17.825 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:11:17 compute-0 nova_compute[186544]: 2025-11-22 08:11:17.835 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:17 compute-0 ovn_controller[94843]: 2025-11-22T08:11:17Z|00592|binding|INFO|Releasing lport ef9893d7-8bc5-4970-a831-1d511ec4e023 from this chassis (sb_readonly=0)
Nov 22 08:11:17 compute-0 nova_compute[186544]: 2025-11-22 08:11:17.859 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:18 compute-0 nova_compute[186544]: 2025-11-22 08:11:18.001 186548 DEBUG nova.compute.manager [req-bd620743-0149-418b-8ca6-cd1c934e3fcd req-a6d735ca-c912-4494-82b4-2bda76e6c1a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:18 compute-0 nova_compute[186544]: 2025-11-22 08:11:18.001 186548 DEBUG nova.compute.manager [req-bd620743-0149-418b-8ca6-cd1c934e3fcd req-a6d735ca-c912-4494-82b4-2bda76e6c1a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing instance network info cache due to event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:11:18 compute-0 nova_compute[186544]: 2025-11-22 08:11:18.002 186548 DEBUG oslo_concurrency.lockutils [req-bd620743-0149-418b-8ca6-cd1c934e3fcd req-a6d735ca-c912-4494-82b4-2bda76e6c1a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:11:18 compute-0 nova_compute[186544]: 2025-11-22 08:11:18.002 186548 DEBUG oslo_concurrency.lockutils [req-bd620743-0149-418b-8ca6-cd1c934e3fcd req-a6d735ca-c912-4494-82b4-2bda76e6c1a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:11:18 compute-0 nova_compute[186544]: 2025-11-22 08:11:18.002 186548 DEBUG nova.network.neutron [req-bd620743-0149-418b-8ca6-cd1c934e3fcd req-a6d735ca-c912-4494-82b4-2bda76e6c1a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:11:19 compute-0 nova_compute[186544]: 2025-11-22 08:11:19.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:19 compute-0 nova_compute[186544]: 2025-11-22 08:11:19.758 186548 DEBUG nova.network.neutron [req-bd620743-0149-418b-8ca6-cd1c934e3fcd req-a6d735ca-c912-4494-82b4-2bda76e6c1a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updated VIF entry in instance network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:11:19 compute-0 nova_compute[186544]: 2025-11-22 08:11:19.759 186548 DEBUG nova.network.neutron [req-bd620743-0149-418b-8ca6-cd1c934e3fcd req-a6d735ca-c912-4494-82b4-2bda76e6c1a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:11:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:19.827 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:19 compute-0 nova_compute[186544]: 2025-11-22 08:11:19.848 186548 DEBUG oslo_concurrency.lockutils [req-bd620743-0149-418b-8ca6-cd1c934e3fcd req-a6d735ca-c912-4494-82b4-2bda76e6c1a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:11:20 compute-0 podman[236870]: 2025-11-22 08:11:20.408068183 +0000 UTC m=+0.057852452 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118)
Nov 22 08:11:20 compute-0 nova_compute[186544]: 2025-11-22 08:11:20.892 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:22 compute-0 nova_compute[186544]: 2025-11-22 08:11:22.076 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:24 compute-0 nova_compute[186544]: 2025-11-22 08:11:24.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:24 compute-0 podman[236903]: 2025-11-22 08:11:24.400843335 +0000 UTC m=+0.049049595 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:11:24 compute-0 podman[236904]: 2025-11-22 08:11:24.412193896 +0000 UTC m=+0.056713015 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 08:11:25 compute-0 ovn_controller[94843]: 2025-11-22T08:11:25Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:39:14 10.100.0.4
Nov 22 08:11:25 compute-0 ovn_controller[94843]: 2025-11-22T08:11:25Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:39:14 10.100.0.4
Nov 22 08:11:25 compute-0 nova_compute[186544]: 2025-11-22 08:11:25.893 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:27 compute-0 nova_compute[186544]: 2025-11-22 08:11:27.138 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:29 compute-0 nova_compute[186544]: 2025-11-22 08:11:29.100 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:29 compute-0 ovn_controller[94843]: 2025-11-22T08:11:29Z|00593|binding|INFO|Releasing lport ef9893d7-8bc5-4970-a831-1d511ec4e023 from this chassis (sb_readonly=0)
Nov 22 08:11:29 compute-0 nova_compute[186544]: 2025-11-22 08:11:29.428 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:30 compute-0 nova_compute[186544]: 2025-11-22 08:11:30.894 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:31 compute-0 ovn_controller[94843]: 2025-11-22T08:11:31Z|00594|binding|INFO|Releasing lport ef9893d7-8bc5-4970-a831-1d511ec4e023 from this chassis (sb_readonly=0)
Nov 22 08:11:31 compute-0 nova_compute[186544]: 2025-11-22 08:11:31.793 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:34 compute-0 nova_compute[186544]: 2025-11-22 08:11:34.103 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:34 compute-0 nova_compute[186544]: 2025-11-22 08:11:34.833 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:34 compute-0 nova_compute[186544]: 2025-11-22 08:11:34.834 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:34 compute-0 nova_compute[186544]: 2025-11-22 08:11:34.834 186548 INFO nova.compute.manager [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Shelving
Nov 22 08:11:34 compute-0 nova_compute[186544]: 2025-11-22 08:11:34.858 186548 DEBUG nova.virt.libvirt.driver [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:11:35 compute-0 nova_compute[186544]: 2025-11-22 08:11:35.896 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:37 compute-0 kernel: tap7b325460-e1 (unregistering): left promiscuous mode
Nov 22 08:11:37 compute-0 NetworkManager[55036]: <info>  [1763799097.0167] device (tap7b325460-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:11:37 compute-0 ovn_controller[94843]: 2025-11-22T08:11:37Z|00595|binding|INFO|Releasing lport 7b325460-e116-46b4-b42f-3595a0a85fa1 from this chassis (sb_readonly=0)
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.023 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:37 compute-0 ovn_controller[94843]: 2025-11-22T08:11:37Z|00596|binding|INFO|Setting lport 7b325460-e116-46b4-b42f-3595a0a85fa1 down in Southbound
Nov 22 08:11:37 compute-0 ovn_controller[94843]: 2025-11-22T08:11:37Z|00597|binding|INFO|Removing iface tap7b325460-e1 ovn-installed in OVS
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.026 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.031 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:39:14 10.100.0.4'], port_security=['fa:16:3e:1f:39:14 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b697e0e3-6ab8-4e90-b8e9-e72e362283da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e77afcde171b45e6bb008c9dff8ffb44', 'neutron:revision_number': '4', 'neutron:security_group_ids': '74fef083-1145-4a81-b923-db02721f22a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6709d38e-861b-4fc2-860f-7d0aaf6cf724, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=7b325460-e116-46b4-b42f-3595a0a85fa1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.032 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 7b325460-e116-46b4-b42f-3595a0a85fa1 in datapath 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f unbound from our chassis
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.033 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.034 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f9656936-af43-402f-a872-b1168cfc6345]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.035 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f namespace which is not needed anymore
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.039 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:37 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000081.scope: Deactivated successfully.
Nov 22 08:11:37 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000081.scope: Consumed 14.468s CPU time.
Nov 22 08:11:37 compute-0 systemd-machined[152872]: Machine qemu-73-instance-00000081 terminated.
Nov 22 08:11:37 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[236771]: [NOTICE]   (236775) : haproxy version is 2.8.14-c23fe91
Nov 22 08:11:37 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[236771]: [NOTICE]   (236775) : path to executable is /usr/sbin/haproxy
Nov 22 08:11:37 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[236771]: [WARNING]  (236775) : Exiting Master process...
Nov 22 08:11:37 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[236771]: [ALERT]    (236775) : Current worker (236777) exited with code 143 (Terminated)
Nov 22 08:11:37 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[236771]: [WARNING]  (236775) : All workers exited. Exiting... (0)
Nov 22 08:11:37 compute-0 systemd[1]: libpod-b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3.scope: Deactivated successfully.
Nov 22 08:11:37 compute-0 podman[236973]: 2025-11-22 08:11:37.163457531 +0000 UTC m=+0.044323977 container died b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:11:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3-userdata-shm.mount: Deactivated successfully.
Nov 22 08:11:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-db35481592e2c5ffbccb55d37fe9350be045a76ce341013a9281f5c85fc97ba7-merged.mount: Deactivated successfully.
Nov 22 08:11:37 compute-0 podman[236973]: 2025-11-22 08:11:37.206699012 +0000 UTC m=+0.087565458 container cleanup b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:11:37 compute-0 systemd[1]: libpod-conmon-b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3.scope: Deactivated successfully.
Nov 22 08:11:37 compute-0 podman[237004]: 2025-11-22 08:11:37.277743631 +0000 UTC m=+0.049079516 container remove b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.284 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4cfaed-0f95-4112-90b3-e714791894b0]: (4, ('Sat Nov 22 08:11:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f (b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3)\nb88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3\nSat Nov 22 08:11:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f (b88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3)\nb88f1f359cd7ac81f451911128a8026f621bd0d5e34b160b532c5b1c739739a3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.287 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[591b61c2-79c1-4535-9081-f54e5016af87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.289 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b2fd30e-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.291 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:37 compute-0 kernel: tap8b2fd30e-d0: left promiscuous mode
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.308 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.312 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9f6b28-0eb4-49ad-b9e4-243291e39c68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.327 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ae255db9-4a04-47cc-aa27-0d2517f9250d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.329 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ca732d11-d9a5-41e1-8642-6483dc772bb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.339 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.339 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.340 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.344 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cc5f39b4-10e0-4034-9fe9-4728c959f0a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580567, 'reachable_time': 41911, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237041, 'error': None, 'target': 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d8b2fd30e\x2dd4e3\x2d4155\x2d885e\x2def1a2c6dae8f.mount: Deactivated successfully.
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.347 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:37.348 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe7850d-edd2-476f-8225-b34573975eb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.875 186548 INFO nova.virt.libvirt.driver [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance shutdown successfully after 3 seconds.
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.880 186548 INFO nova.virt.libvirt.driver [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance destroyed successfully.
Nov 22 08:11:37 compute-0 nova_compute[186544]: 2025-11-22 08:11:37.880 186548 DEBUG nova.objects.instance [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'numa_topology' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:38 compute-0 nova_compute[186544]: 2025-11-22 08:11:38.249 186548 INFO nova.virt.libvirt.driver [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Beginning cold snapshot process
Nov 22 08:11:38 compute-0 nova_compute[186544]: 2025-11-22 08:11:38.457 186548 DEBUG nova.privsep.utils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 08:11:38 compute-0 nova_compute[186544]: 2025-11-22 08:11:38.458 186548 DEBUG oslo_concurrency.processutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk /var/lib/nova/instances/snapshots/tmp8xja6pac/6749d02741b046eb814de303c8bdeb32 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:38 compute-0 nova_compute[186544]: 2025-11-22 08:11:38.965 186548 DEBUG oslo_concurrency.processutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk /var/lib/nova/instances/snapshots/tmp8xja6pac/6749d02741b046eb814de303c8bdeb32" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:38 compute-0 nova_compute[186544]: 2025-11-22 08:11:38.966 186548 INFO nova.virt.libvirt.driver [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Snapshot extracted, beginning image upload
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.106 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.216 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.273 186548 DEBUG nova.compute.manager [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-unplugged-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.274 186548 DEBUG oslo_concurrency.lockutils [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.274 186548 DEBUG oslo_concurrency.lockutils [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.274 186548 DEBUG oslo_concurrency.lockutils [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.274 186548 DEBUG nova.compute.manager [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] No waiting events found dispatching network-vif-unplugged-7b325460-e116-46b4-b42f-3595a0a85fa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.274 186548 WARNING nova.compute.manager [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received unexpected event network-vif-unplugged-7b325460-e116-46b4-b42f-3595a0a85fa1 for instance with vm_state active and task_state shelving_image_uploading.
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.275 186548 DEBUG nova.compute.manager [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.275 186548 DEBUG oslo_concurrency.lockutils [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.275 186548 DEBUG oslo_concurrency.lockutils [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.275 186548 DEBUG oslo_concurrency.lockutils [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.275 186548 DEBUG nova.compute.manager [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] No waiting events found dispatching network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.276 186548 WARNING nova.compute.manager [req-26510510-78b0-48b3-8c0b-cffbe923b878 req-28e65d7d-1f9f-4f45-bc6d-bf5df4a75853 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received unexpected event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 for instance with vm_state active and task_state shelving_image_uploading.
Nov 22 08:11:39 compute-0 nova_compute[186544]: 2025-11-22 08:11:39.424 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:40 compute-0 podman[237054]: 2025-11-22 08:11:40.414345047 +0000 UTC m=+0.058541530 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:11:40 compute-0 podman[237055]: 2025-11-22 08:11:40.439645424 +0000 UTC m=+0.082879903 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:11:40 compute-0 podman[237053]: 2025-11-22 08:11:40.445049518 +0000 UTC m=+0.091397674 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:11:40 compute-0 podman[237056]: 2025-11-22 08:11:40.445109389 +0000 UTC m=+0.084676807 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 08:11:40 compute-0 nova_compute[186544]: 2025-11-22 08:11:40.898 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:42 compute-0 nova_compute[186544]: 2025-11-22 08:11:42.180 186548 INFO nova.virt.libvirt.driver [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Snapshot image upload complete
Nov 22 08:11:42 compute-0 nova_compute[186544]: 2025-11-22 08:11:42.181 186548 DEBUG nova.compute.manager [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:42 compute-0 nova_compute[186544]: 2025-11-22 08:11:42.278 186548 INFO nova.compute.manager [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Shelve offloading
Nov 22 08:11:42 compute-0 nova_compute[186544]: 2025-11-22 08:11:42.288 186548 INFO nova.virt.libvirt.driver [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance destroyed successfully.
Nov 22 08:11:42 compute-0 nova_compute[186544]: 2025-11-22 08:11:42.288 186548 DEBUG nova.compute.manager [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:42 compute-0 nova_compute[186544]: 2025-11-22 08:11:42.290 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:11:42 compute-0 nova_compute[186544]: 2025-11-22 08:11:42.291 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:11:42 compute-0 nova_compute[186544]: 2025-11-22 08:11:42.291 186548 DEBUG nova.network.neutron [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:11:44 compute-0 nova_compute[186544]: 2025-11-22 08:11:44.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:44 compute-0 nova_compute[186544]: 2025-11-22 08:11:44.184 186548 DEBUG nova.network.neutron [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:11:44 compute-0 nova_compute[186544]: 2025-11-22 08:11:44.232 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.542 186548 INFO nova.virt.libvirt.driver [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance destroyed successfully.
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.543 186548 DEBUG nova.objects.instance [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'resources' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.560 186548 DEBUG nova.virt.libvirt.vif [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-551674936',display_name='tempest-TestShelveInstance-server-551674936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-551674936',id=129,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAI0aHL6Ou2IJSgJR8KmWp5jvk1H0nenSFPpcgEZ2nM/U0LfRgUiMbaK3XQDYYBX/t4paqNprX8CiArX2uXN/07J1M2q50BshVpGq5Y37se64GB7gIu++lFp7/cMvXmcGQ==',key_name='tempest-TestShelveInstance-1754802104',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:11:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e77afcde171b45e6bb008c9dff8ffb44',ramdisk_id='',reservation_id='r-kc0vpko6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-2007945223',owner_user_name='tempest-TestShelveInstance-2007945223-project-member',shelved_at='2025-11-22T08:11:42.181531',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='18c54654-0e67-4906-8227-56fee132cb55'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:11:39Z,user_data=None,user_id='cf64193131de4d458bf1bd37c21125f6',uuid=b697e0e3-6ab8-4e90-b8e9-e72e362283da,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.560 186548 DEBUG nova.network.os_vif_util [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converting VIF {"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.561 186548 DEBUG nova.network.os_vif_util [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.561 186548 DEBUG os_vif [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.562 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.563 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b325460-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.565 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.569 186548 INFO os_vif [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1')
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.570 186548 INFO nova.virt.libvirt.driver [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Deleting instance files /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da_del
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.575 186548 INFO nova.virt.libvirt.driver [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Deletion of /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da_del complete
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.694 186548 DEBUG nova.compute.manager [req-4f261dc4-5581-4b19-9dde-bb42e7d126ca req-ec2828ee-4be6-456b-a564-8fedff55d99a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.694 186548 DEBUG nova.compute.manager [req-4f261dc4-5581-4b19-9dde-bb42e7d126ca req-ec2828ee-4be6-456b-a564-8fedff55d99a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing instance network info cache due to event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.695 186548 DEBUG oslo_concurrency.lockutils [req-4f261dc4-5581-4b19-9dde-bb42e7d126ca req-ec2828ee-4be6-456b-a564-8fedff55d99a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.695 186548 DEBUG oslo_concurrency.lockutils [req-4f261dc4-5581-4b19-9dde-bb42e7d126ca req-ec2828ee-4be6-456b-a564-8fedff55d99a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.695 186548 DEBUG nova.network.neutron [req-4f261dc4-5581-4b19-9dde-bb42e7d126ca req-ec2828ee-4be6-456b-a564-8fedff55d99a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.712 186548 INFO nova.scheduler.client.report [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Deleted allocations for instance b697e0e3-6ab8-4e90-b8e9-e72e362283da
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.774 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.775 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.820 186548 DEBUG nova.compute.provider_tree [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.836 186548 DEBUG nova.scheduler.client.report [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.858 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.899 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:45 compute-0 nova_compute[186544]: 2025-11-22 08:11:45.922 186548 DEBUG oslo_concurrency.lockutils [None req-0f0aebca-340c-4c3b-b617-4cbc4d2c2053 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 11.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:46 compute-0 nova_compute[186544]: 2025-11-22 08:11:46.700 186548 DEBUG nova.network.neutron [req-4f261dc4-5581-4b19-9dde-bb42e7d126ca req-ec2828ee-4be6-456b-a564-8fedff55d99a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updated VIF entry in instance network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:11:46 compute-0 nova_compute[186544]: 2025-11-22 08:11:46.700 186548 DEBUG nova.network.neutron [req-4f261dc4-5581-4b19-9dde-bb42e7d126ca req-ec2828ee-4be6-456b-a564-8fedff55d99a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": null, "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap7b325460-e1", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:11:46 compute-0 nova_compute[186544]: 2025-11-22 08:11:46.739 186548 DEBUG oslo_concurrency.lockutils [req-4f261dc4-5581-4b19-9dde-bb42e7d126ca req-ec2828ee-4be6-456b-a564-8fedff55d99a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.208 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.208 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.209 186548 INFO nova.compute.manager [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Unshelving
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.328 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.329 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.333 186548 DEBUG nova.objects.instance [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'pci_requests' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.345 186548 DEBUG nova.objects.instance [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'numa_topology' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.357 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.357 186548 INFO nova.compute.claims [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.455 186548 DEBUG nova.compute.provider_tree [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.468 186548 DEBUG nova.scheduler.client.report [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.490 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:48 compute-0 nova_compute[186544]: 2025-11-22 08:11:48.620 186548 INFO nova.network.neutron [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating port 7b325460-e116-46b4-b42f-3595a0a85fa1 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 22 08:11:49 compute-0 nova_compute[186544]: 2025-11-22 08:11:49.137 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:11:49 compute-0 nova_compute[186544]: 2025-11-22 08:11:49.138 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:11:49 compute-0 nova_compute[186544]: 2025-11-22 08:11:49.139 186548 DEBUG nova.network.neutron [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:11:49 compute-0 nova_compute[186544]: 2025-11-22 08:11:49.255 186548 DEBUG nova.compute.manager [req-5e638397-5678-4541-a6e9-88e34c454909 req-d4fd1c79-c133-4e99-8ed3-d1f0175f5744 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:49 compute-0 nova_compute[186544]: 2025-11-22 08:11:49.256 186548 DEBUG nova.compute.manager [req-5e638397-5678-4541-a6e9-88e34c454909 req-d4fd1c79-c133-4e99-8ed3-d1f0175f5744 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing instance network info cache due to event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:11:49 compute-0 nova_compute[186544]: 2025-11-22 08:11:49.256 186548 DEBUG oslo_concurrency.lockutils [req-5e638397-5678-4541-a6e9-88e34c454909 req-d4fd1c79-c133-4e99-8ed3-d1f0175f5744 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:11:50 compute-0 nova_compute[186544]: 2025-11-22 08:11:50.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:50 compute-0 nova_compute[186544]: 2025-11-22 08:11:50.900 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.238 186548 DEBUG nova.network.neutron [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.257 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.258 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.258 186548 INFO nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Creating image(s)
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.259 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.259 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.260 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.260 186548 DEBUG nova.objects.instance [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.261 186548 DEBUG oslo_concurrency.lockutils [req-5e638397-5678-4541-a6e9-88e34c454909 req-d4fd1c79-c133-4e99-8ed3-d1f0175f5744 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.262 186548 DEBUG nova.network.neutron [req-5e638397-5678-4541-a6e9-88e34c454909 req-d4fd1c79-c133-4e99-8ed3-d1f0175f5744 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.276 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b311e5b8c8aaec45fed9ab32a27ca19947834438" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:51 compute-0 nova_compute[186544]: 2025-11-22 08:11:51.276 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b311e5b8c8aaec45fed9ab32a27ca19947834438" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:51 compute-0 podman[237134]: 2025-11-22 08:11:51.406053555 +0000 UTC m=+0.054361547 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 22 08:11:52 compute-0 nova_compute[186544]: 2025-11-22 08:11:52.312 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799097.3116417, b697e0e3-6ab8-4e90-b8e9-e72e362283da => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:11:52 compute-0 nova_compute[186544]: 2025-11-22 08:11:52.313 186548 INFO nova.compute.manager [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] VM Stopped (Lifecycle Event)
Nov 22 08:11:52 compute-0 nova_compute[186544]: 2025-11-22 08:11:52.364 186548 DEBUG nova.compute.manager [None req-b8456cb6-3f20-4f60-a914-af8ab3e28347 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:52 compute-0 nova_compute[186544]: 2025-11-22 08:11:52.749 186548 DEBUG nova.network.neutron [req-5e638397-5678-4541-a6e9-88e34c454909 req-d4fd1c79-c133-4e99-8ed3-d1f0175f5744 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updated VIF entry in instance network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:11:52 compute-0 nova_compute[186544]: 2025-11-22 08:11:52.750 186548 DEBUG nova.network.neutron [req-5e638397-5678-4541-a6e9-88e34c454909 req-d4fd1c79-c133-4e99-8ed3-d1f0175f5744 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:11:52 compute-0 nova_compute[186544]: 2025-11-22 08:11:52.764 186548 DEBUG oslo_concurrency.lockutils [req-5e638397-5678-4541-a6e9-88e34c454909 req-d4fd1c79-c133-4e99-8ed3-d1f0175f5744 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.456 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.525 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438.part --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.526 186548 DEBUG nova.virt.images [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] 18c54654-0e67-4906-8227-56fee132cb55 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.527 186548 DEBUG nova.privsep.utils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.528 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438.part /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.873 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438.part /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438.converted" returned: 0 in 0.345s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.885 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.952 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438.converted --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.954 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b311e5b8c8aaec45fed9ab32a27ca19947834438" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:53 compute-0 nova_compute[186544]: 2025-11-22 08:11:53.969 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.025 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.027 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b311e5b8c8aaec45fed9ab32a27ca19947834438" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.027 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b311e5b8c8aaec45fed9ab32a27ca19947834438" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.039 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.099 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.101 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438,backing_fmt=raw /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.144 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438,backing_fmt=raw /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.146 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b311e5b8c8aaec45fed9ab32a27ca19947834438" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.146 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.204 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.205 186548 DEBUG nova.objects.instance [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'migration_context' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.219 186548 INFO nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Rebasing disk image.
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.219 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.278 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:54 compute-0 nova_compute[186544]: 2025-11-22 08:11:54.279 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:55 compute-0 podman[237186]: 2025-11-22 08:11:55.403151422 +0000 UTC m=+0.053688750 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:11:55 compute-0 podman[237187]: 2025-11-22 08:11:55.413115099 +0000 UTC m=+0.058167841 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.487 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 -F raw /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk" returned: 0 in 1.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.488 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.488 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Ensure instance console log exists: /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.488 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.489 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.489 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.491 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Start _get_guest_xml network_info=[{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='96d3d0a08bbbffea26a5a72c22301221',container_format='bare',created_at=2025-11-22T08:11:34Z,direct_url=<?>,disk_format='qcow2',id=18c54654-0e67-4906-8227-56fee132cb55,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-551674936-shelved',owner='e77afcde171b45e6bb008c9dff8ffb44',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-11-22T08:11:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.495 186548 WARNING nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.501 186548 DEBUG nova.virt.libvirt.host [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.502 186548 DEBUG nova.virt.libvirt.host [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.506 186548 DEBUG nova.virt.libvirt.host [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.507 186548 DEBUG nova.virt.libvirt.host [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.509 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.509 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='96d3d0a08bbbffea26a5a72c22301221',container_format='bare',created_at=2025-11-22T08:11:34Z,direct_url=<?>,disk_format='qcow2',id=18c54654-0e67-4906-8227-56fee132cb55,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-551674936-shelved',owner='e77afcde171b45e6bb008c9dff8ffb44',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-11-22T08:11:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.509 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.510 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.510 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.510 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.511 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.511 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.511 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.512 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.512 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.512 186548 DEBUG nova.virt.hardware [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.512 186548 DEBUG nova.objects.instance [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.526 186548 DEBUG nova.virt.libvirt.vif [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-551674936',display_name='tempest-TestShelveInstance-server-551674936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-551674936',id=129,image_ref='18c54654-0e67-4906-8227-56fee132cb55',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1754802104',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:11:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='e77afcde171b45e6bb008c9dff8ffb44',ramdisk_id='',reservation_id='r-kc0vpko6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-2007945223',owner_user_name='tempest-TestShelveInstance-2007945223-project-member',shelved_at='2025-11-22T08:11:42.181531',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='18c54654-0e67-4906-8227-56fee132cb55'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:48Z,user_data=None,user_id='cf64193131de4d458bf1bd37c21125f6',uuid=b697e0e3-6ab8-4e90-b8e9-e72e362283da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.527 186548 DEBUG nova.network.os_vif_util [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converting VIF {"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.527 186548 DEBUG nova.network.os_vif_util [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.528 186548 DEBUG nova.objects.instance [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'pci_devices' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.538 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <uuid>b697e0e3-6ab8-4e90-b8e9-e72e362283da</uuid>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <name>instance-00000081</name>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <nova:name>tempest-TestShelveInstance-server-551674936</nova:name>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:11:55</nova:creationTime>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:11:55 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:11:55 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:11:55 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:11:55 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:11:55 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:11:55 compute-0 nova_compute[186544]:         <nova:user uuid="cf64193131de4d458bf1bd37c21125f6">tempest-TestShelveInstance-2007945223-project-member</nova:user>
Nov 22 08:11:55 compute-0 nova_compute[186544]:         <nova:project uuid="e77afcde171b45e6bb008c9dff8ffb44">tempest-TestShelveInstance-2007945223</nova:project>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="18c54654-0e67-4906-8227-56fee132cb55"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:11:55 compute-0 nova_compute[186544]:         <nova:port uuid="7b325460-e116-46b4-b42f-3595a0a85fa1">
Nov 22 08:11:55 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <system>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <entry name="serial">b697e0e3-6ab8-4e90-b8e9-e72e362283da</entry>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <entry name="uuid">b697e0e3-6ab8-4e90-b8e9-e72e362283da</entry>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </system>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <os>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   </os>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <features>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   </features>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.config"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:1f:39:14"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <target dev="tap7b325460-e1"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/console.log" append="off"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <video>
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </video>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:11:55 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:11:55 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:11:55 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:11:55 compute-0 nova_compute[186544]: </domain>
Nov 22 08:11:55 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.539 186548 DEBUG nova.compute.manager [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Preparing to wait for external event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.540 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.540 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.540 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.541 186548 DEBUG nova.virt.libvirt.vif [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-551674936',display_name='tempest-TestShelveInstance-server-551674936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-551674936',id=129,image_ref='18c54654-0e67-4906-8227-56fee132cb55',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1754802104',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:11:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='e77afcde171b45e6bb008c9dff8ffb44',ramdisk_id='',reservation_id='r-kc0vpko6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-2007945223',owner_user_name='tempest-TestShelveInstance-2007945223-project-member',shelved_at='2025-11-22T08:11:42.181531',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='18c54654-0e67-4906-8227-56fee132cb55'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:11:48Z,user_data=None,user_id='cf64193131de4d458bf1bd37c21125f6',uuid=b697e0e3-6ab8-4e90-b8e9-e72e362283da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.541 186548 DEBUG nova.network.os_vif_util [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converting VIF {"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.542 186548 DEBUG nova.network.os_vif_util [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.542 186548 DEBUG os_vif [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.543 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.543 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.544 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.546 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.546 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b325460-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.547 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b325460-e1, col_values=(('external_ids', {'iface-id': '7b325460-e116-46b4-b42f-3595a0a85fa1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:39:14', 'vm-uuid': 'b697e0e3-6ab8-4e90-b8e9-e72e362283da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.548 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:55 compute-0 NetworkManager[55036]: <info>  [1763799115.5500] manager: (tap7b325460-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.550 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.555 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.556 186548 INFO os_vif [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1')
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.608 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.608 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.608 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] No VIF found with MAC fa:16:3e:1f:39:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.609 186548 INFO nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Using config drive
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.623 186548 DEBUG nova.objects.instance [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'ec2_ids' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.662 186548 DEBUG nova.objects.instance [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'keypairs' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:11:55 compute-0 nova_compute[186544]: 2025-11-22 08:11:55.903 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.540 186548 INFO nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Creating config drive at /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.config
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.545 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvrfqdt9d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.672 186548 DEBUG oslo_concurrency.processutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvrfqdt9d" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:56 compute-0 kernel: tap7b325460-e1: entered promiscuous mode
Nov 22 08:11:56 compute-0 NetworkManager[55036]: <info>  [1763799116.7320] manager: (tap7b325460-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/273)
Nov 22 08:11:56 compute-0 ovn_controller[94843]: 2025-11-22T08:11:56Z|00598|binding|INFO|Claiming lport 7b325460-e116-46b4-b42f-3595a0a85fa1 for this chassis.
Nov 22 08:11:56 compute-0 ovn_controller[94843]: 2025-11-22T08:11:56Z|00599|binding|INFO|7b325460-e116-46b4-b42f-3595a0a85fa1: Claiming fa:16:3e:1f:39:14 10.100.0.4
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.736 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.750 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.756 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:56 compute-0 NetworkManager[55036]: <info>  [1763799116.7574] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Nov 22 08:11:56 compute-0 NetworkManager[55036]: <info>  [1763799116.7582] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Nov 22 08:11:56 compute-0 systemd-udevd[237251]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.762 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:39:14 10.100.0.4'], port_security=['fa:16:3e:1f:39:14 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b697e0e3-6ab8-4e90-b8e9-e72e362283da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e77afcde171b45e6bb008c9dff8ffb44', 'neutron:revision_number': '7', 'neutron:security_group_ids': '74fef083-1145-4a81-b923-db02721f22a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.182'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6709d38e-861b-4fc2-860f-7d0aaf6cf724, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=7b325460-e116-46b4-b42f-3595a0a85fa1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:11:56 compute-0 systemd-machined[152872]: New machine qemu-74-instance-00000081.
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.764 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 7b325460-e116-46b4-b42f-3595a0a85fa1 in datapath 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f bound to our chassis
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.766 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f
Nov 22 08:11:56 compute-0 NetworkManager[55036]: <info>  [1763799116.7763] device (tap7b325460-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:11:56 compute-0 NetworkManager[55036]: <info>  [1763799116.7771] device (tap7b325460-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.778 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[40630404-c2b5-4510-9f65-e251fe3ea0d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.779 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8b2fd30e-d1 in ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.780 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8b2fd30e-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.781 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d19aacc3-e419-48e2-9b3d-d740a1589cf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.782 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad2477e-3046-4e23-aba0-21544095bd74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.794 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[e128e19e-46e2-4836-86a4-ef305037dcbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 systemd[1]: Started Virtual Machine qemu-74-instance-00000081.
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.824 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ac141c89-2a68-46c6-8719-8be6e463c995]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.856 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[962d7663-f44d-4d13-aec1-acd64b779871]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 NetworkManager[55036]: <info>  [1763799116.8713] manager: (tap8b2fd30e-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/276)
Nov 22 08:11:56 compute-0 systemd-udevd[237254]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.872 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4819b882-0a93-43d8-8772-8109dfa950b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.909 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5f89d7-4bcf-493e-a0e4-6a80247a747d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.912 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6e69370f-94fb-442f-95c7-4286d75fd64d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 NetworkManager[55036]: <info>  [1763799116.9348] device (tap8b2fd30e-d0): carrier: link connected
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.940 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[93b0be64-969e-4cc6-9177-25a7e82c63a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.957 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6e76de60-c1cc-4572-a969-73d5aa4e4f5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b2fd30e-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:48:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585757, 'reachable_time': 30744, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237284, 'error': None, 'target': 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.966 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.977 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[42131ae3-35fc-44b7-bc6a-651f8427e67f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb6:4873'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 585757, 'tstamp': 585757}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237285, 'error': None, 'target': 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:56 compute-0 nova_compute[186544]: 2025-11-22 08:11:56.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:56.994 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee50bfd-577e-482d-94af-7d9a7bca4ad0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b2fd30e-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:48:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585757, 'reachable_time': 30744, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237286, 'error': None, 'target': 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:57 compute-0 ovn_controller[94843]: 2025-11-22T08:11:57Z|00600|binding|INFO|Setting lport 7b325460-e116-46b4-b42f-3595a0a85fa1 ovn-installed in OVS
Nov 22 08:11:57 compute-0 ovn_controller[94843]: 2025-11-22T08:11:57Z|00601|binding|INFO|Setting lport 7b325460-e116-46b4-b42f-3595a0a85fa1 up in Southbound
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.006 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.026 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7639cb19-b4fe-48d5-ac69-8759bb08ba23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.089 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[aa96cc26-ab86-4730-b729-71b6264130b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.090 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b2fd30e-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.091 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.091 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b2fd30e-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.093 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:57 compute-0 NetworkManager[55036]: <info>  [1763799117.0943] manager: (tap8b2fd30e-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Nov 22 08:11:57 compute-0 kernel: tap8b2fd30e-d0: entered promiscuous mode
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.097 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8b2fd30e-d0, col_values=(('external_ids', {'iface-id': 'ef9893d7-8bc5-4970-a831-1d511ec4e023'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.098 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:57 compute-0 ovn_controller[94843]: 2025-11-22T08:11:57Z|00602|binding|INFO|Releasing lport ef9893d7-8bc5-4970-a831-1d511ec4e023 from this chassis (sb_readonly=0)
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.111 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.113 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.114 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[deff97ac-27a8-4e5f-92d7-ea0386816038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.115 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f.pid.haproxy
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:11:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:11:57.115 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'env', 'PROCESS_TAG=haproxy-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.136 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799117.1358464, b697e0e3-6ab8-4e90-b8e9-e72e362283da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.136 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] VM Started (Lifecycle Event)
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.161 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.166 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799117.1360452, b697e0e3-6ab8-4e90-b8e9-e72e362283da => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.166 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] VM Paused (Lifecycle Event)
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.190 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.191 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.197 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.213 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.236 186548 DEBUG nova.compute.manager [req-2771d24b-176f-473a-824e-596a8381c8ed req-4ed4db75-62ad-4440-b5af-763a5e651f84 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.237 186548 DEBUG oslo_concurrency.lockutils [req-2771d24b-176f-473a-824e-596a8381c8ed req-4ed4db75-62ad-4440-b5af-763a5e651f84 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.238 186548 DEBUG oslo_concurrency.lockutils [req-2771d24b-176f-473a-824e-596a8381c8ed req-4ed4db75-62ad-4440-b5af-763a5e651f84 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.238 186548 DEBUG oslo_concurrency.lockutils [req-2771d24b-176f-473a-824e-596a8381c8ed req-4ed4db75-62ad-4440-b5af-763a5e651f84 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.238 186548 DEBUG nova.compute.manager [req-2771d24b-176f-473a-824e-596a8381c8ed req-4ed4db75-62ad-4440-b5af-763a5e651f84 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Processing event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.239 186548 DEBUG nova.compute.manager [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.244 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799117.2440007, b697e0e3-6ab8-4e90-b8e9-e72e362283da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.244 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] VM Resumed (Lifecycle Event)
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.246 186548 DEBUG nova.virt.libvirt.driver [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.254 186548 INFO nova.virt.libvirt.driver [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance spawned successfully.
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.262 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.265 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.284 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.307 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.327 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.328 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.385 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:11:57 compute-0 podman[237330]: 2025-11-22 08:11:57.564862645 +0000 UTC m=+0.099503154 container create 00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.603 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.605 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5685MB free_disk=73.10772705078125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.605 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.605 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:57 compute-0 podman[237330]: 2025-11-22 08:11:57.516451196 +0000 UTC m=+0.051091715 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:11:57 compute-0 systemd[1]: Started libpod-conmon-00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e.scope.
Nov 22 08:11:57 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:11:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29fd54899ff062910ae353ad48b4d0c14ef3e7260045a4e01dfbeaf0177ffbbc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.715 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance b697e0e3-6ab8-4e90-b8e9-e72e362283da actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.715 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.716 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.762 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.774 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:11:57 compute-0 podman[237330]: 2025-11-22 08:11:57.794615142 +0000 UTC m=+0.329255681 container init 00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.794 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:11:57 compute-0 nova_compute[186544]: 2025-11-22 08:11:57.794 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:57 compute-0 podman[237330]: 2025-11-22 08:11:57.801078563 +0000 UTC m=+0.335719072 container start 00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 08:11:57 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[237346]: [NOTICE]   (237350) : New worker (237352) forked
Nov 22 08:11:57 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[237346]: [NOTICE]   (237350) : Loading success.
Nov 22 08:11:58 compute-0 nova_compute[186544]: 2025-11-22 08:11:58.454 186548 DEBUG nova.compute.manager [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:11:58 compute-0 nova_compute[186544]: 2025-11-22 08:11:58.572 186548 DEBUG oslo_concurrency.lockutils [None req-b956c954-e189-4a9f-9ce3-54a1fe74ba49 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 10.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:59 compute-0 nova_compute[186544]: 2025-11-22 08:11:59.353 186548 DEBUG nova.compute.manager [req-5c743c21-9799-42af-aa93-92e90c28134d req-b74f36d8-7cd3-40d2-894d-6ab872127d12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:11:59 compute-0 nova_compute[186544]: 2025-11-22 08:11:59.353 186548 DEBUG oslo_concurrency.lockutils [req-5c743c21-9799-42af-aa93-92e90c28134d req-b74f36d8-7cd3-40d2-894d-6ab872127d12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:11:59 compute-0 nova_compute[186544]: 2025-11-22 08:11:59.354 186548 DEBUG oslo_concurrency.lockutils [req-5c743c21-9799-42af-aa93-92e90c28134d req-b74f36d8-7cd3-40d2-894d-6ab872127d12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:11:59 compute-0 nova_compute[186544]: 2025-11-22 08:11:59.354 186548 DEBUG oslo_concurrency.lockutils [req-5c743c21-9799-42af-aa93-92e90c28134d req-b74f36d8-7cd3-40d2-894d-6ab872127d12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:11:59 compute-0 nova_compute[186544]: 2025-11-22 08:11:59.354 186548 DEBUG nova.compute.manager [req-5c743c21-9799-42af-aa93-92e90c28134d req-b74f36d8-7cd3-40d2-894d-6ab872127d12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] No waiting events found dispatching network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:11:59 compute-0 nova_compute[186544]: 2025-11-22 08:11:59.355 186548 WARNING nova.compute.manager [req-5c743c21-9799-42af-aa93-92e90c28134d req-b74f36d8-7cd3-40d2-894d-6ab872127d12 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received unexpected event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 for instance with vm_state active and task_state None.
Nov 22 08:12:00 compute-0 nova_compute[186544]: 2025-11-22 08:12:00.549 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:00 compute-0 nova_compute[186544]: 2025-11-22 08:12:00.795 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:00 compute-0 nova_compute[186544]: 2025-11-22 08:12:00.796 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:12:00 compute-0 nova_compute[186544]: 2025-11-22 08:12:00.796 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:12:00 compute-0 nova_compute[186544]: 2025-11-22 08:12:00.904 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:01 compute-0 nova_compute[186544]: 2025-11-22 08:12:01.534 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:12:01 compute-0 nova_compute[186544]: 2025-11-22 08:12:01.535 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:12:01 compute-0 nova_compute[186544]: 2025-11-22 08:12:01.535 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:12:01 compute-0 nova_compute[186544]: 2025-11-22 08:12:01.536 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:12:02 compute-0 nova_compute[186544]: 2025-11-22 08:12:02.618 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [{"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:12:02 compute-0 nova_compute[186544]: 2025-11-22 08:12:02.642 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:12:02 compute-0 nova_compute[186544]: 2025-11-22 08:12:02.643 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:12:02 compute-0 nova_compute[186544]: 2025-11-22 08:12:02.644 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:02 compute-0 nova_compute[186544]: 2025-11-22 08:12:02.644 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:02 compute-0 nova_compute[186544]: 2025-11-22 08:12:02.645 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:02 compute-0 nova_compute[186544]: 2025-11-22 08:12:02.645 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:12:04 compute-0 nova_compute[186544]: 2025-11-22 08:12:04.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:05 compute-0 nova_compute[186544]: 2025-11-22 08:12:05.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:05 compute-0 nova_compute[186544]: 2025-11-22 08:12:05.553 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:05 compute-0 nova_compute[186544]: 2025-11-22 08:12:05.906 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:07 compute-0 nova_compute[186544]: 2025-11-22 08:12:07.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:07 compute-0 ovn_controller[94843]: 2025-11-22T08:12:07Z|00603|binding|INFO|Releasing lport ef9893d7-8bc5-4970-a831-1d511ec4e023 from this chassis (sb_readonly=0)
Nov 22 08:12:07 compute-0 nova_compute[186544]: 2025-11-22 08:12:07.527 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:10 compute-0 nova_compute[186544]: 2025-11-22 08:12:10.556 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:10 compute-0 nova_compute[186544]: 2025-11-22 08:12:10.909 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:11 compute-0 podman[237375]: 2025-11-22 08:12:11.428089268 +0000 UTC m=+0.072440374 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 22 08:12:11 compute-0 podman[237376]: 2025-11-22 08:12:11.437259255 +0000 UTC m=+0.075871719 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:12:11 compute-0 podman[237377]: 2025-11-22 08:12:11.440054084 +0000 UTC m=+0.076842833 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:12:11 compute-0 podman[237378]: 2025-11-22 08:12:11.49602867 +0000 UTC m=+0.125219111 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:12:12 compute-0 ovn_controller[94843]: 2025-11-22T08:12:12Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:39:14 10.100.0.4
Nov 22 08:12:15 compute-0 nova_compute[186544]: 2025-11-22 08:12:15.558 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:15 compute-0 nova_compute[186544]: 2025-11-22 08:12:15.911 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:18.797 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:12:18 compute-0 nova_compute[186544]: 2025-11-22 08:12:18.798 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:18.799 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:12:19 compute-0 nova_compute[186544]: 2025-11-22 08:12:19.519 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:20 compute-0 nova_compute[186544]: 2025-11-22 08:12:20.561 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:20 compute-0 nova_compute[186544]: 2025-11-22 08:12:20.913 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:21 compute-0 nova_compute[186544]: 2025-11-22 08:12:21.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:22 compute-0 podman[237461]: 2025-11-22 08:12:22.408126938 +0000 UTC m=+0.053122796 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 08:12:23 compute-0 nova_compute[186544]: 2025-11-22 08:12:23.733 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:25 compute-0 nova_compute[186544]: 2025-11-22 08:12:25.563 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:25 compute-0 nova_compute[186544]: 2025-11-22 08:12:25.915 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:26 compute-0 podman[237484]: 2025-11-22 08:12:26.411628574 +0000 UTC m=+0.061635797 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:12:26 compute-0 podman[237485]: 2025-11-22 08:12:26.422564395 +0000 UTC m=+0.055308241 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350)
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.667 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.668 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.692 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.774 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.775 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.784 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.784 186548 INFO nova.compute.claims [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:12:27 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:27.800 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.921 186548 DEBUG nova.compute.provider_tree [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.933 186548 DEBUG nova.scheduler.client.report [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.954 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:27 compute-0 nova_compute[186544]: 2025-11-22 08:12:27.956 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.004 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.004 186548 DEBUG nova.network.neutron [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.026 186548 INFO nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.044 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.152 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.154 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.154 186548 INFO nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Creating image(s)
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.155 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "/var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.155 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "/var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.156 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "/var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.169 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.237 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.238 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.239 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.250 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.304 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.305 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.346 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.347 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.348 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.405 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.406 186548 DEBUG nova.virt.disk.api [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Checking if we can resize image /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.406 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.465 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.466 186548 DEBUG nova.virt.disk.api [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Cannot resize image /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.466 186548 DEBUG nova.objects.instance [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ad88893-e3b8-416a-8ae6-fc7dfd032f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.479 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.479 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Ensure instance console log exists: /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.480 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.480 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.481 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:28 compute-0 nova_compute[186544]: 2025-11-22 08:12:28.590 186548 DEBUG nova.policy [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c0ef2c0fb47644918f7915437e98fc9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6f0c135c6434a09b8f88d035ba8aa76', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:12:29 compute-0 nova_compute[186544]: 2025-11-22 08:12:29.700 186548 DEBUG nova.network.neutron [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Successfully created port: b467ba7d-c233-4daf-93a8-01b1a9d6e012 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.500 186548 DEBUG nova.network.neutron [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Successfully updated port: b467ba7d-c233-4daf-93a8-01b1a9d6e012 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.512 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "refresh_cache-5ad88893-e3b8-416a-8ae6-fc7dfd032f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.512 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquired lock "refresh_cache-5ad88893-e3b8-416a-8ae6-fc7dfd032f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.512 186548 DEBUG nova.network.neutron [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.565 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.629 186548 DEBUG nova.compute.manager [req-2b47ad45-8046-4a10-a2d4-918e7a2d3369 req-1d197dbf-3d37-44e2-a52d-d2abd7ff4a1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received event network-changed-b467ba7d-c233-4daf-93a8-01b1a9d6e012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.630 186548 DEBUG nova.compute.manager [req-2b47ad45-8046-4a10-a2d4-918e7a2d3369 req-1d197dbf-3d37-44e2-a52d-d2abd7ff4a1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Refreshing instance network info cache due to event network-changed-b467ba7d-c233-4daf-93a8-01b1a9d6e012. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.630 186548 DEBUG oslo_concurrency.lockutils [req-2b47ad45-8046-4a10-a2d4-918e7a2d3369 req-1d197dbf-3d37-44e2-a52d-d2abd7ff4a1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-5ad88893-e3b8-416a-8ae6-fc7dfd032f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.674 186548 DEBUG nova.network.neutron [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:12:30 compute-0 nova_compute[186544]: 2025-11-22 08:12:30.918 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.782 186548 DEBUG nova.network.neutron [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Updating instance_info_cache with network_info: [{"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.807 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Releasing lock "refresh_cache-5ad88893-e3b8-416a-8ae6-fc7dfd032f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.808 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Instance network_info: |[{"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.808 186548 DEBUG oslo_concurrency.lockutils [req-2b47ad45-8046-4a10-a2d4-918e7a2d3369 req-1d197dbf-3d37-44e2-a52d-d2abd7ff4a1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-5ad88893-e3b8-416a-8ae6-fc7dfd032f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.809 186548 DEBUG nova.network.neutron [req-2b47ad45-8046-4a10-a2d4-918e7a2d3369 req-1d197dbf-3d37-44e2-a52d-d2abd7ff4a1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Refreshing network info cache for port b467ba7d-c233-4daf-93a8-01b1a9d6e012 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.811 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Start _get_guest_xml network_info=[{"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.816 186548 WARNING nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.823 186548 DEBUG nova.virt.libvirt.host [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.823 186548 DEBUG nova.virt.libvirt.host [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.827 186548 DEBUG nova.virt.libvirt.host [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.828 186548 DEBUG nova.virt.libvirt.host [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.829 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.829 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.830 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.830 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.830 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.830 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.831 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.831 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.831 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.831 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.831 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.832 186548 DEBUG nova.virt.hardware [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.835 186548 DEBUG nova.virt.libvirt.vif [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:12:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1051020919',display_name='tempest-ServerPasswordTestJSON-server-1051020919',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1051020919',id=134,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6f0c135c6434a09b8f88d035ba8aa76',ramdisk_id='',reservation_id='r-r53rpd0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-761097360',owner_user_name='tempest-ServerPasswordTest
JSON-761097360-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:12:28Z,user_data=None,user_id='c0ef2c0fb47644918f7915437e98fc9a',uuid=5ad88893-e3b8-416a-8ae6-fc7dfd032f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.835 186548 DEBUG nova.network.os_vif_util [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Converting VIF {"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.836 186548 DEBUG nova.network.os_vif_util [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:c0:58,bridge_name='br-int',has_traffic_filtering=True,id=b467ba7d-c233-4daf-93a8-01b1a9d6e012,network=Network(edad52ae-eb56-4482-9255-08c505724d77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb467ba7d-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.837 186548 DEBUG nova.objects.instance [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ad88893-e3b8-416a-8ae6-fc7dfd032f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.847 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <uuid>5ad88893-e3b8-416a-8ae6-fc7dfd032f51</uuid>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <name>instance-00000086</name>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerPasswordTestJSON-server-1051020919</nova:name>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:12:31</nova:creationTime>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:12:31 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:12:31 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:12:31 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:12:31 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:12:31 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:12:31 compute-0 nova_compute[186544]:         <nova:user uuid="c0ef2c0fb47644918f7915437e98fc9a">tempest-ServerPasswordTestJSON-761097360-project-member</nova:user>
Nov 22 08:12:31 compute-0 nova_compute[186544]:         <nova:project uuid="e6f0c135c6434a09b8f88d035ba8aa76">tempest-ServerPasswordTestJSON-761097360</nova:project>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:12:31 compute-0 nova_compute[186544]:         <nova:port uuid="b467ba7d-c233-4daf-93a8-01b1a9d6e012">
Nov 22 08:12:31 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <system>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <entry name="serial">5ad88893-e3b8-416a-8ae6-fc7dfd032f51</entry>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <entry name="uuid">5ad88893-e3b8-416a-8ae6-fc7dfd032f51</entry>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </system>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <os>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   </os>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <features>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   </features>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk.config"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:bb:c0:58"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <target dev="tapb467ba7d-c2"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/console.log" append="off"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <video>
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </video>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:12:31 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:12:31 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:12:31 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:12:31 compute-0 nova_compute[186544]: </domain>
Nov 22 08:12:31 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.848 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Preparing to wait for external event network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.849 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.849 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.849 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.850 186548 DEBUG nova.virt.libvirt.vif [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:12:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1051020919',display_name='tempest-ServerPasswordTestJSON-server-1051020919',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1051020919',id=134,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6f0c135c6434a09b8f88d035ba8aa76',ramdisk_id='',reservation_id='r-r53rpd0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-761097360',owner_user_name='tempest-ServerPasswordTestJSON-761097360-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:12:28Z,user_data=None,user_id='c0ef2c0fb47644918f7915437e98fc9a',uuid=5ad88893-e3b8-416a-8ae6-fc7dfd032f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.850 186548 DEBUG nova.network.os_vif_util [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Converting VIF {"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.851 186548 DEBUG nova.network.os_vif_util [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:c0:58,bridge_name='br-int',has_traffic_filtering=True,id=b467ba7d-c233-4daf-93a8-01b1a9d6e012,network=Network(edad52ae-eb56-4482-9255-08c505724d77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb467ba7d-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.854 186548 DEBUG os_vif [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:c0:58,bridge_name='br-int',has_traffic_filtering=True,id=b467ba7d-c233-4daf-93a8-01b1a9d6e012,network=Network(edad52ae-eb56-4482-9255-08c505724d77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb467ba7d-c2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.854 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.855 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.855 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.859 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.859 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb467ba7d-c2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.860 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb467ba7d-c2, col_values=(('external_ids', {'iface-id': 'b467ba7d-c233-4daf-93a8-01b1a9d6e012', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:c0:58', 'vm-uuid': '5ad88893-e3b8-416a-8ae6-fc7dfd032f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.861 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:31 compute-0 NetworkManager[55036]: <info>  [1763799151.8626] manager: (tapb467ba7d-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.864 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.869 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.870 186548 INFO os_vif [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:c0:58,bridge_name='br-int',has_traffic_filtering=True,id=b467ba7d-c233-4daf-93a8-01b1a9d6e012,network=Network(edad52ae-eb56-4482-9255-08c505724d77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb467ba7d-c2')
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.919 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.920 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.920 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] No VIF found with MAC fa:16:3e:bb:c0:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:12:31 compute-0 nova_compute[186544]: 2025-11-22 08:12:31.921 186548 INFO nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Using config drive
Nov 22 08:12:32 compute-0 nova_compute[186544]: 2025-11-22 08:12:32.596 186548 INFO nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Creating config drive at /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk.config
Nov 22 08:12:32 compute-0 nova_compute[186544]: 2025-11-22 08:12:32.600 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyrdg5sq6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:32 compute-0 nova_compute[186544]: 2025-11-22 08:12:32.725 186548 DEBUG oslo_concurrency.processutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyrdg5sq6" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:32 compute-0 NetworkManager[55036]: <info>  [1763799152.7859] manager: (tapb467ba7d-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/279)
Nov 22 08:12:32 compute-0 kernel: tapb467ba7d-c2: entered promiscuous mode
Nov 22 08:12:32 compute-0 ovn_controller[94843]: 2025-11-22T08:12:32Z|00604|binding|INFO|Claiming lport b467ba7d-c233-4daf-93a8-01b1a9d6e012 for this chassis.
Nov 22 08:12:32 compute-0 ovn_controller[94843]: 2025-11-22T08:12:32Z|00605|binding|INFO|b467ba7d-c233-4daf-93a8-01b1a9d6e012: Claiming fa:16:3e:bb:c0:58 10.100.0.12
Nov 22 08:12:32 compute-0 nova_compute[186544]: 2025-11-22 08:12:32.788 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:32 compute-0 ovn_controller[94843]: 2025-11-22T08:12:32Z|00606|binding|INFO|Setting lport b467ba7d-c233-4daf-93a8-01b1a9d6e012 ovn-installed in OVS
Nov 22 08:12:32 compute-0 nova_compute[186544]: 2025-11-22 08:12:32.806 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:32 compute-0 nova_compute[186544]: 2025-11-22 08:12:32.808 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.812 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:c0:58 10.100.0.12'], port_security=['fa:16:3e:bb:c0:58 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5ad88893-e3b8-416a-8ae6-fc7dfd032f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-edad52ae-eb56-4482-9255-08c505724d77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6f0c135c6434a09b8f88d035ba8aa76', 'neutron:revision_number': '2', 'neutron:security_group_ids': '013fc171-e5ee-4003-9142-d78adb5e1ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb6c98c9-4b66-4ccd-8cfe-8ef59410683c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b467ba7d-c233-4daf-93a8-01b1a9d6e012) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:12:32 compute-0 ovn_controller[94843]: 2025-11-22T08:12:32Z|00607|binding|INFO|Setting lport b467ba7d-c233-4daf-93a8-01b1a9d6e012 up in Southbound
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.813 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b467ba7d-c233-4daf-93a8-01b1a9d6e012 in datapath edad52ae-eb56-4482-9255-08c505724d77 bound to our chassis
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.814 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network edad52ae-eb56-4482-9255-08c505724d77
Nov 22 08:12:32 compute-0 systemd-udevd[237560]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.826 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[841e259f-5f65-42d0-be85-364f23d9ab14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.827 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapedad52ae-e1 in ovnmeta-edad52ae-eb56-4482-9255-08c505724d77 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.829 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapedad52ae-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.829 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b565049c-38f7-458e-942e-92277c4be6e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.830 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4e220f-aef4-4589-9115-7433fd58a123]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 systemd-machined[152872]: New machine qemu-75-instance-00000086.
Nov 22 08:12:32 compute-0 NetworkManager[55036]: <info>  [1763799152.8418] device (tapb467ba7d-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.841 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[be051c46-9bfa-4f5d-a563-8ce51c20626a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 NetworkManager[55036]: <info>  [1763799152.8441] device (tapb467ba7d-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:12:32 compute-0 systemd[1]: Started Virtual Machine qemu-75-instance-00000086.
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.870 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d2503c-fb5a-4663-901a-224da5d3f6fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.908 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[96086e80-da48-4239-9fcb-5487bcad2265]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 NetworkManager[55036]: <info>  [1763799152.9156] manager: (tapedad52ae-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/280)
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.914 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[64af1c95-6157-42c9-814e-532a84e08fbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 systemd-udevd[237566]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.948 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ca49b7-25d2-479f-b7c5-b1e2af8d15ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.951 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff47a7b-ecaa-4fdb-ac79-ebe6037f1574]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:32 compute-0 NetworkManager[55036]: <info>  [1763799152.9758] device (tapedad52ae-e0): carrier: link connected
Nov 22 08:12:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.981 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8d48a00f-1289-4810-820d-3f8dd39258c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:32.999 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a824141a-f23a-4799-9a1e-c812176a8090]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapedad52ae-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:30:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589361, 'reachable_time': 33313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237595, 'error': None, 'target': 'ovnmeta-edad52ae-eb56-4482-9255-08c505724d77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.019 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[79dc49ab-51a1-4a5c-8efb-33e4ae903acf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe23:30a2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589361, 'tstamp': 589361}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237596, 'error': None, 'target': 'ovnmeta-edad52ae-eb56-4482-9255-08c505724d77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.035 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[46ba093c-a0b5-4c6d-8d73-cfdd03bbf457]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapedad52ae-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:30:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 186], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589361, 'reachable_time': 33313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237597, 'error': None, 'target': 'ovnmeta-edad52ae-eb56-4482-9255-08c505724d77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.066 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f98b0891-9cb1-403c-9d16-239d699d7132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.126 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e00c5dc2-2d1c-43b4-acaa-cb9ebba9f442]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.127 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapedad52ae-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.127 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.128 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapedad52ae-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:33 compute-0 kernel: tapedad52ae-e0: entered promiscuous mode
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.129 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 NetworkManager[55036]: <info>  [1763799153.1303] manager: (tapedad52ae-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.136 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapedad52ae-e0, col_values=(('external_ids', {'iface-id': '0b1c5837-bb39-4a10-83b2-95551dc41c90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.137 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 ovn_controller[94843]: 2025-11-22T08:12:33Z|00608|binding|INFO|Releasing lport 0b1c5837-bb39-4a10-83b2-95551dc41c90 from this chassis (sb_readonly=0)
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.140 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/edad52ae-eb56-4482-9255-08c505724d77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/edad52ae-eb56-4482-9255-08c505724d77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.141 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef7aa22-e302-46fe-88a3-272a69e6b3e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.142 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-edad52ae-eb56-4482-9255-08c505724d77
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/edad52ae-eb56-4482-9255-08c505724d77.pid.haproxy
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID edad52ae-eb56-4482-9255-08c505724d77
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.143 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-edad52ae-eb56-4482-9255-08c505724d77', 'env', 'PROCESS_TAG=haproxy-edad52ae-eb56-4482-9255-08c505724d77', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/edad52ae-eb56-4482-9255-08c505724d77.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.150 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.227 186548 DEBUG nova.compute.manager [req-e889b2b8-da2e-42d9-9b30-144f633e16f4 req-46761de7-2c5d-4100-8ce9-d5f926c63f28 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received event network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.227 186548 DEBUG oslo_concurrency.lockutils [req-e889b2b8-da2e-42d9-9b30-144f633e16f4 req-46761de7-2c5d-4100-8ce9-d5f926c63f28 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.228 186548 DEBUG oslo_concurrency.lockutils [req-e889b2b8-da2e-42d9-9b30-144f633e16f4 req-46761de7-2c5d-4100-8ce9-d5f926c63f28 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.228 186548 DEBUG oslo_concurrency.lockutils [req-e889b2b8-da2e-42d9-9b30-144f633e16f4 req-46761de7-2c5d-4100-8ce9-d5f926c63f28 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.228 186548 DEBUG nova.compute.manager [req-e889b2b8-da2e-42d9-9b30-144f633e16f4 req-46761de7-2c5d-4100-8ce9-d5f926c63f28 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Processing event network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.498 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.498 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.499 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.499 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.499 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.506 186548 INFO nova.compute.manager [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Terminating instance
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.512 186548 DEBUG nova.compute.manager [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.537 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.537 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799153.536603, 5ad88893-e3b8-416a-8ae6-fc7dfd032f51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.538 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] VM Started (Lifecycle Event)
Nov 22 08:12:33 compute-0 kernel: tap7b325460-e1 (unregistering): left promiscuous mode
Nov 22 08:12:33 compute-0 podman[237633]: 2025-11-22 08:12:33.549140422 +0000 UTC m=+0.090438420 container create c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.549 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:12:33 compute-0 NetworkManager[55036]: <info>  [1763799153.5513] device (tap7b325460-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:12:33 compute-0 ovn_controller[94843]: 2025-11-22T08:12:33Z|00609|binding|INFO|Releasing lport 7b325460-e116-46b4-b42f-3595a0a85fa1 from this chassis (sb_readonly=0)
Nov 22 08:12:33 compute-0 ovn_controller[94843]: 2025-11-22T08:12:33Z|00610|binding|INFO|Setting lport 7b325460-e116-46b4-b42f-3595a0a85fa1 down in Southbound
Nov 22 08:12:33 compute-0 ovn_controller[94843]: 2025-11-22T08:12:33Z|00611|binding|INFO|Removing iface tap7b325460-e1 ovn-installed in OVS
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.562 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.564 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.568 186548 INFO nova.virt.libvirt.driver [-] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Instance spawned successfully.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.568 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.573 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:12:33 compute-0 podman[237633]: 2025-11-22 08:12:33.480413201 +0000 UTC m=+0.021711199 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.575 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:39:14 10.100.0.4'], port_security=['fa:16:3e:1f:39:14 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b697e0e3-6ab8-4e90-b8e9-e72e362283da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e77afcde171b45e6bb008c9dff8ffb44', 'neutron:revision_number': '9', 'neutron:security_group_ids': '74fef083-1145-4a81-b923-db02721f22a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6709d38e-861b-4fc2-860f-7d0aaf6cf724, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=7b325460-e116-46b4-b42f-3595a0a85fa1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.583 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 systemd[1]: Started libpod-conmon-c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec.scope.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.594 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.595 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.595 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.596 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.597 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.597 186548 DEBUG nova.virt.libvirt.driver [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.600 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.601 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799153.5375543, 5ad88893-e3b8-416a-8ae6-fc7dfd032f51 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.601 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] VM Paused (Lifecycle Event)
Nov 22 08:12:33 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000081.scope: Deactivated successfully.
Nov 22 08:12:33 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000081.scope: Consumed 16.168s CPU time.
Nov 22 08:12:33 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:12:33 compute-0 systemd-machined[152872]: Machine qemu-74-instance-00000081 terminated.
Nov 22 08:12:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefb78a1960de99bb0daf83273de80f30377224b1a11fd76f44f3a79282d3009/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.633 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.637 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799153.5420985, 5ad88893-e3b8-416a-8ae6-fc7dfd032f51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.638 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] VM Resumed (Lifecycle Event)
Nov 22 08:12:33 compute-0 podman[237633]: 2025-11-22 08:12:33.639901109 +0000 UTC m=+0.181199107 container init c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:12:33 compute-0 podman[237633]: 2025-11-22 08:12:33.646057571 +0000 UTC m=+0.187355569 container start c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:12:33 compute-0 neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77[237655]: [NOTICE]   (237659) : New worker (237661) forked
Nov 22 08:12:33 compute-0 neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77[237655]: [NOTICE]   (237659) : Loading success.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.672 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.678 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.692 186548 INFO nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Took 5.54 seconds to spawn the instance on the hypervisor.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.692 186548 DEBUG nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.698 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.705 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 7b325460-e116-46b4-b42f-3595a0a85fa1 in datapath 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f unbound from our chassis
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.706 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.707 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9ef1e2-4804-4781-8466-c915241680a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.708 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f namespace which is not needed anymore
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.767 186548 INFO nova.compute.manager [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Took 6.02 seconds to build instance.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.783 186548 DEBUG oslo_concurrency.lockutils [None req-210b6670-c76e-400c-90cb-8732dc99ded7 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.788 186548 INFO nova.virt.libvirt.driver [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance destroyed successfully.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.789 186548 DEBUG nova.objects.instance [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lazy-loading 'resources' on Instance uuid b697e0e3-6ab8-4e90-b8e9-e72e362283da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.801 186548 DEBUG nova.virt.libvirt.vif [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-551674936',display_name='tempest-TestShelveInstance-server-551674936',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-551674936',id=129,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAI0aHL6Ou2IJSgJR8KmWp5jvk1H0nenSFPpcgEZ2nM/U0LfRgUiMbaK3XQDYYBX/t4paqNprX8CiArX2uXN/07J1M2q50BshVpGq5Y37se64GB7gIu++lFp7/cMvXmcGQ==',key_name='tempest-TestShelveInstance-1754802104',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:11:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e77afcde171b45e6bb008c9dff8ffb44',ramdisk_id='',reservation_id='r-kc0vpko6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-2007945223',owner_user_name='tempest-TestShelveInstance-2007945223-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:11:58Z,user_data=None,user_id='cf64193131de4d458bf1bd37c21125f6',uuid=b697e0e3-6ab8-4e90-b8e9-e72e362283da,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.802 186548 DEBUG nova.network.os_vif_util [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converting VIF {"id": "7b325460-e116-46b4-b42f-3595a0a85fa1", "address": "fa:16:3e:1f:39:14", "network": {"id": "8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f", "bridge": "br-int", "label": "tempest-TestShelveInstance-1145639745-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.182", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e77afcde171b45e6bb008c9dff8ffb44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b325460-e1", "ovs_interfaceid": "7b325460-e116-46b4-b42f-3595a0a85fa1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.803 186548 DEBUG nova.network.os_vif_util [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.803 186548 DEBUG os_vif [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.805 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.805 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b325460-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.808 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.811 186548 INFO os_vif [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:39:14,bridge_name='br-int',has_traffic_filtering=True,id=7b325460-e116-46b4-b42f-3595a0a85fa1,network=Network(8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b325460-e1')
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.811 186548 INFO nova.virt.libvirt.driver [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Deleting instance files /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da_del
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.817 186548 INFO nova.virt.libvirt.driver [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Deletion of /var/lib/nova/instances/b697e0e3-6ab8-4e90-b8e9-e72e362283da_del complete
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.836 186548 DEBUG nova.network.neutron [req-2b47ad45-8046-4a10-a2d4-918e7a2d3369 req-1d197dbf-3d37-44e2-a52d-d2abd7ff4a1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Updated VIF entry in instance network info cache for port b467ba7d-c233-4daf-93a8-01b1a9d6e012. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.837 186548 DEBUG nova.network.neutron [req-2b47ad45-8046-4a10-a2d4-918e7a2d3369 req-1d197dbf-3d37-44e2-a52d-d2abd7ff4a1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Updating instance_info_cache with network_info: [{"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:12:33 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[237346]: [NOTICE]   (237350) : haproxy version is 2.8.14-c23fe91
Nov 22 08:12:33 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[237346]: [NOTICE]   (237350) : path to executable is /usr/sbin/haproxy
Nov 22 08:12:33 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[237346]: [WARNING]  (237350) : Exiting Master process...
Nov 22 08:12:33 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[237346]: [WARNING]  (237350) : Exiting Master process...
Nov 22 08:12:33 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[237346]: [ALERT]    (237350) : Current worker (237352) exited with code 143 (Terminated)
Nov 22 08:12:33 compute-0 neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f[237346]: [WARNING]  (237350) : All workers exited. Exiting... (0)
Nov 22 08:12:33 compute-0 systemd[1]: libpod-00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e.scope: Deactivated successfully.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.854 186548 DEBUG oslo_concurrency.lockutils [req-2b47ad45-8046-4a10-a2d4-918e7a2d3369 req-1d197dbf-3d37-44e2-a52d-d2abd7ff4a1e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-5ad88893-e3b8-416a-8ae6-fc7dfd032f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:12:33 compute-0 podman[237704]: 2025-11-22 08:12:33.856291155 +0000 UTC m=+0.050249775 container died 00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.875 186548 INFO nova.compute.manager [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.875 186548 DEBUG oslo.service.loopingcall [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.875 186548 DEBUG nova.compute.manager [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.875 186548 DEBUG nova.network.neutron [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:12:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e-userdata-shm.mount: Deactivated successfully.
Nov 22 08:12:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-29fd54899ff062910ae353ad48b4d0c14ef3e7260045a4e01dfbeaf0177ffbbc-merged.mount: Deactivated successfully.
Nov 22 08:12:33 compute-0 podman[237704]: 2025-11-22 08:12:33.902163101 +0000 UTC m=+0.096121701 container cleanup 00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:12:33 compute-0 systemd[1]: libpod-conmon-00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e.scope: Deactivated successfully.
Nov 22 08:12:33 compute-0 podman[237733]: 2025-11-22 08:12:33.98006119 +0000 UTC m=+0.056876989 container remove 00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.986 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[679e9540-fcf2-48cd-89ac-f93f8f18e516]: (4, ('Sat Nov 22 08:12:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f (00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e)\n00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e\nSat Nov 22 08:12:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f (00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e)\n00d4346506c184713a21a4565408915642b993b9262f9a6428fd9513f062224e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.989 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[606e7761-e8c4-47ad-baed-a7a76dc7cb3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.990 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b2fd30e-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:33 compute-0 kernel: tap8b2fd30e-d0: left promiscuous mode
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.993 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 nova_compute[186544]: 2025-11-22 08:12:33.996 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:33.998 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0db27d07-afbd-441b-ac74-8f811ffbb613]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.009 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:34.024 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8958d97e-5a74-4be5-ab1a-2c54867c7ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:34.031 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac0cf70-3b58-4c8f-b04c-7ad2492b510b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:34.050 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6370f49d-6c56-42f8-aaaa-2cb8adeb411a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 585748, 'reachable_time': 28867, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237748, 'error': None, 'target': 'ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d8b2fd30e\x2dd4e3\x2d4155\x2d885e\x2def1a2c6dae8f.mount: Deactivated successfully.
Nov 22 08:12:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:34.055 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8b2fd30e-d4e3-4155-885e-ef1a2c6dae8f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:12:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:34.056 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[3db9de08-5d73-41a7-9e13-479f5c85021b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.909 186548 DEBUG nova.network.neutron [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.929 186548 INFO nova.compute.manager [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Took 1.05 seconds to deallocate network for instance.
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.958 186548 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.959 186548 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing instance network info cache due to event network-changed-7b325460-e116-46b4-b42f-3595a0a85fa1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.959 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.959 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.959 186548 DEBUG nova.network.neutron [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Refreshing network info cache for port 7b325460-e116-46b4-b42f-3595a0a85fa1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.990 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:34 compute-0 nova_compute[186544]: 2025-11-22 08:12:34.990 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.079 186548 DEBUG nova.compute.provider_tree [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.091 186548 DEBUG nova.scheduler.client.report [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.105 186548 DEBUG nova.compute.manager [req-9e7f4ce0-c7cd-4cca-a3f2-a7a15f028856 req-59d42aef-e3f0-4272-be1d-8169b88771e2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-deleted-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.111 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.119 186548 DEBUG nova.network.neutron [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.146 186548 INFO nova.scheduler.client.report [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Deleted allocations for instance b697e0e3-6ab8-4e90-b8e9-e72e362283da
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.220 186548 DEBUG oslo_concurrency.lockutils [None req-7f2213de-8905-404c-bbb3-b38fb30e2469 cf64193131de4d458bf1bd37c21125f6 e77afcde171b45e6bb008c9dff8ffb44 - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.324 186548 DEBUG nova.compute.manager [req-6c46d6f4-d8c2-4d8c-bc4a-c2a88568c59c req-1dd488f5-d5b5-43b5-8021-4b9462066603 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received event network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.324 186548 DEBUG oslo_concurrency.lockutils [req-6c46d6f4-d8c2-4d8c-bc4a-c2a88568c59c req-1dd488f5-d5b5-43b5-8021-4b9462066603 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.325 186548 DEBUG oslo_concurrency.lockutils [req-6c46d6f4-d8c2-4d8c-bc4a-c2a88568c59c req-1dd488f5-d5b5-43b5-8021-4b9462066603 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.325 186548 DEBUG oslo_concurrency.lockutils [req-6c46d6f4-d8c2-4d8c-bc4a-c2a88568c59c req-1dd488f5-d5b5-43b5-8021-4b9462066603 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.325 186548 DEBUG nova.compute.manager [req-6c46d6f4-d8c2-4d8c-bc4a-c2a88568c59c req-1dd488f5-d5b5-43b5-8021-4b9462066603 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] No waiting events found dispatching network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.325 186548 WARNING nova.compute.manager [req-6c46d6f4-d8c2-4d8c-bc4a-c2a88568c59c req-1dd488f5-d5b5-43b5-8021-4b9462066603 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received unexpected event network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 for instance with vm_state active and task_state None.
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.523 186548 DEBUG nova.network.neutron [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.542 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b697e0e3-6ab8-4e90-b8e9-e72e362283da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.543 186548 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-unplugged-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.543 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.543 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.543 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.544 186548 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] No waiting events found dispatching network-vif-unplugged-7b325460-e116-46b4-b42f-3595a0a85fa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.544 186548 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-unplugged-7b325460-e116-46b4-b42f-3595a0a85fa1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.544 186548 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.544 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.544 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.545 186548 DEBUG oslo_concurrency.lockutils [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b697e0e3-6ab8-4e90-b8e9-e72e362283da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.545 186548 DEBUG nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] No waiting events found dispatching network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.545 186548 WARNING nova.compute.manager [req-53e5b2fd-c4de-4037-8a65-5594ce9443bf req-d54b6378-37e9-49e0-9295-f7b41f191a53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Received unexpected event network-vif-plugged-7b325460-e116-46b4-b42f-3595a0a85fa1 for instance with vm_state active and task_state deleting.
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.743 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.744 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.744 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.744 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.744 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.752 186548 INFO nova.compute.manager [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Terminating instance
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.759 186548 DEBUG nova.compute.manager [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:12:35 compute-0 kernel: tapb467ba7d-c2 (unregistering): left promiscuous mode
Nov 22 08:12:35 compute-0 NetworkManager[55036]: <info>  [1763799155.7762] device (tapb467ba7d-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.785 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:35 compute-0 ovn_controller[94843]: 2025-11-22T08:12:35Z|00612|binding|INFO|Releasing lport b467ba7d-c233-4daf-93a8-01b1a9d6e012 from this chassis (sb_readonly=0)
Nov 22 08:12:35 compute-0 ovn_controller[94843]: 2025-11-22T08:12:35Z|00613|binding|INFO|Setting lport b467ba7d-c233-4daf-93a8-01b1a9d6e012 down in Southbound
Nov 22 08:12:35 compute-0 ovn_controller[94843]: 2025-11-22T08:12:35Z|00614|binding|INFO|Removing iface tapb467ba7d-c2 ovn-installed in OVS
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.787 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:35.795 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:c0:58 10.100.0.12'], port_security=['fa:16:3e:bb:c0:58 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5ad88893-e3b8-416a-8ae6-fc7dfd032f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-edad52ae-eb56-4482-9255-08c505724d77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6f0c135c6434a09b8f88d035ba8aa76', 'neutron:revision_number': '4', 'neutron:security_group_ids': '013fc171-e5ee-4003-9142-d78adb5e1ee5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb6c98c9-4b66-4ccd-8cfe-8ef59410683c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=b467ba7d-c233-4daf-93a8-01b1a9d6e012) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:12:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:35.796 103805 INFO neutron.agent.ovn.metadata.agent [-] Port b467ba7d-c233-4daf-93a8-01b1a9d6e012 in datapath edad52ae-eb56-4482-9255-08c505724d77 unbound from our chassis
Nov 22 08:12:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:35.797 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network edad52ae-eb56-4482-9255-08c505724d77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:12:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:35.798 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f160096a-059f-4cad-9553-1f69e5234770]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:35.799 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-edad52ae-eb56-4482-9255-08c505724d77 namespace which is not needed anymore
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.807 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:35 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000086.scope: Deactivated successfully.
Nov 22 08:12:35 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000086.scope: Consumed 2.924s CPU time.
Nov 22 08:12:35 compute-0 systemd-machined[152872]: Machine qemu-75-instance-00000086 terminated.
Nov 22 08:12:35 compute-0 neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77[237655]: [NOTICE]   (237659) : haproxy version is 2.8.14-c23fe91
Nov 22 08:12:35 compute-0 neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77[237655]: [NOTICE]   (237659) : path to executable is /usr/sbin/haproxy
Nov 22 08:12:35 compute-0 neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77[237655]: [WARNING]  (237659) : Exiting Master process...
Nov 22 08:12:35 compute-0 neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77[237655]: [ALERT]    (237659) : Current worker (237661) exited with code 143 (Terminated)
Nov 22 08:12:35 compute-0 neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77[237655]: [WARNING]  (237659) : All workers exited. Exiting... (0)
Nov 22 08:12:35 compute-0 nova_compute[186544]: 2025-11-22 08:12:35.919 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:35 compute-0 systemd[1]: libpod-c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec.scope: Deactivated successfully.
Nov 22 08:12:35 compute-0 podman[237773]: 2025-11-22 08:12:35.928430061 +0000 UTC m=+0.045528008 container died c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:12:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec-userdata-shm.mount: Deactivated successfully.
Nov 22 08:12:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-eefb78a1960de99bb0daf83273de80f30377224b1a11fd76f44f3a79282d3009-merged.mount: Deactivated successfully.
Nov 22 08:12:35 compute-0 podman[237773]: 2025-11-22 08:12:35.970225796 +0000 UTC m=+0.087323733 container cleanup c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:12:35 compute-0 systemd[1]: libpod-conmon-c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec.scope: Deactivated successfully.
Nov 22 08:12:35 compute-0 NetworkManager[55036]: <info>  [1763799155.9818] manager: (tapb467ba7d-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/282)
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.034 186548 INFO nova.virt.libvirt.driver [-] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Instance destroyed successfully.
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.035 186548 DEBUG nova.objects.instance [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lazy-loading 'resources' on Instance uuid 5ad88893-e3b8-416a-8ae6-fc7dfd032f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.054 186548 DEBUG nova.virt.libvirt.vif [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:12:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1051020919',display_name='tempest-ServerPasswordTestJSON-server-1051020919',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1051020919',id=134,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:12:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e6f0c135c6434a09b8f88d035ba8aa76',ramdisk_id='',reservation_id='r-r53rpd0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-761097360',owner_user_name='tempest-ServerPasswordTestJSON-761097360-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:12:35Z,user_data=None,user_id='c0ef2c0fb47644918f7915437e98fc9a',uuid=5ad88893-e3b8-416a-8ae6-fc7dfd032f51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.054 186548 DEBUG nova.network.os_vif_util [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Converting VIF {"id": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "address": "fa:16:3e:bb:c0:58", "network": {"id": "edad52ae-eb56-4482-9255-08c505724d77", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1843979129-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6f0c135c6434a09b8f88d035ba8aa76", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb467ba7d-c2", "ovs_interfaceid": "b467ba7d-c233-4daf-93a8-01b1a9d6e012", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.055 186548 DEBUG nova.network.os_vif_util [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:c0:58,bridge_name='br-int',has_traffic_filtering=True,id=b467ba7d-c233-4daf-93a8-01b1a9d6e012,network=Network(edad52ae-eb56-4482-9255-08c505724d77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb467ba7d-c2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.055 186548 DEBUG os_vif [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:c0:58,bridge_name='br-int',has_traffic_filtering=True,id=b467ba7d-c233-4daf-93a8-01b1a9d6e012,network=Network(edad52ae-eb56-4482-9255-08c505724d77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb467ba7d-c2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:12:36 compute-0 podman[237804]: 2025-11-22 08:12:36.05607759 +0000 UTC m=+0.055471063 container remove c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.056 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.056 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb467ba7d-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.058 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.059 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.061 186548 INFO os_vif [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:c0:58,bridge_name='br-int',has_traffic_filtering=True,id=b467ba7d-c233-4daf-93a8-01b1a9d6e012,network=Network(edad52ae-eb56-4482-9255-08c505724d77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb467ba7d-c2')
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.062 186548 INFO nova.virt.libvirt.driver [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Deleting instance files /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51_del
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.061 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[174b2454-6a8c-4825-8b2e-11011c05e541]: (4, ('Sat Nov 22 08:12:35 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77 (c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec)\nc07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec\nSat Nov 22 08:12:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-edad52ae-eb56-4482-9255-08c505724d77 (c07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec)\nc07b80430f982eafff5db43a2c7aab60adb169e5c2e5dfe2047ec44982fbd5ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.062 186548 INFO nova.virt.libvirt.driver [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Deletion of /var/lib/nova/instances/5ad88893-e3b8-416a-8ae6-fc7dfd032f51_del complete
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.063 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[172197e1-df87-45b6-9c6f-9c2f92feeb98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.063 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapedad52ae-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:12:36 compute-0 kernel: tapedad52ae-e0: left promiscuous mode
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.067 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.080 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.083 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8bca7352-d6fe-4199-96b2-d15a30533482]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.099 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4d20a1b1-daa2-4c6b-9d17-f1f3a53cade2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.101 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d8c70164-7a4d-430d-bf8e-d1df030056e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.119 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0bbd96-0ea4-40d7-b0d3-8f57ff97ed22]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589354, 'reachable_time': 31010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237831, 'error': None, 'target': 'ovnmeta-edad52ae-eb56-4482-9255-08c505724d77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.121 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-edad52ae-eb56-4482-9255-08c505724d77 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:12:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:36.121 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[14f6d25c-1c54-4abb-85ec-acdd615e6e5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:12:36 compute-0 systemd[1]: run-netns-ovnmeta\x2dedad52ae\x2deb56\x2d4482\x2d9255\x2d08c505724d77.mount: Deactivated successfully.
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.161 186548 INFO nova.compute.manager [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.162 186548 DEBUG oslo.service.loopingcall [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.163 186548 DEBUG nova.compute.manager [-] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.163 186548 DEBUG nova.network.neutron [-] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:12:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.952 186548 DEBUG nova.network.neutron [-] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:12:36 compute-0 nova_compute[186544]: 2025-11-22 08:12:36.971 186548 INFO nova.compute.manager [-] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Took 0.81 seconds to deallocate network for instance.
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.027 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.027 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.096 186548 DEBUG nova.compute.provider_tree [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.109 186548 DEBUG nova.scheduler.client.report [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.236 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.332 186548 INFO nova.scheduler.client.report [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Deleted allocations for instance 5ad88893-e3b8-416a-8ae6-fc7dfd032f51
Nov 22 08:12:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:37.340 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:37.340 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:12:37.341 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.431 186548 DEBUG nova.compute.manager [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received event network-vif-unplugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.431 186548 DEBUG oslo_concurrency.lockutils [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.431 186548 DEBUG oslo_concurrency.lockutils [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.432 186548 DEBUG oslo_concurrency.lockutils [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.432 186548 DEBUG nova.compute.manager [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] No waiting events found dispatching network-vif-unplugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.432 186548 WARNING nova.compute.manager [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received unexpected event network-vif-unplugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 for instance with vm_state deleted and task_state None.
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.432 186548 DEBUG nova.compute.manager [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received event network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.432 186548 DEBUG oslo_concurrency.lockutils [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.433 186548 DEBUG oslo_concurrency.lockutils [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.433 186548 DEBUG oslo_concurrency.lockutils [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.433 186548 DEBUG nova.compute.manager [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] No waiting events found dispatching network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.433 186548 WARNING nova.compute.manager [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received unexpected event network-vif-plugged-b467ba7d-c233-4daf-93a8-01b1a9d6e012 for instance with vm_state deleted and task_state None.
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.434 186548 DEBUG nova.compute.manager [req-602c88fd-62d6-4e13-a6d7-2e3ac602263a req-ad1312d7-72fd-439c-9e3e-7b460438333c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Received event network-vif-deleted-b467ba7d-c233-4daf-93a8-01b1a9d6e012 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:37 compute-0 nova_compute[186544]: 2025-11-22 08:12:37.693 186548 DEBUG oslo_concurrency.lockutils [None req-6f70b412-100c-4f50-a886-fc6f3cdc28b2 c0ef2c0fb47644918f7915437e98fc9a e6f0c135c6434a09b8f88d035ba8aa76 - - default default] Lock "5ad88893-e3b8-416a-8ae6-fc7dfd032f51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.949s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:40 compute-0 nova_compute[186544]: 2025-11-22 08:12:40.040 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:40 compute-0 nova_compute[186544]: 2025-11-22 08:12:40.250 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:40 compute-0 nova_compute[186544]: 2025-11-22 08:12:40.921 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:41 compute-0 nova_compute[186544]: 2025-11-22 08:12:41.058 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:42 compute-0 podman[237835]: 2025-11-22 08:12:42.417540308 +0000 UTC m=+0.059430912 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:12:42 compute-0 podman[237834]: 2025-11-22 08:12:42.423397673 +0000 UTC m=+0.067800439 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:12:42 compute-0 podman[237836]: 2025-11-22 08:12:42.451114469 +0000 UTC m=+0.090400789 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:12:42 compute-0 podman[237833]: 2025-11-22 08:12:42.455147949 +0000 UTC m=+0.100980431 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 08:12:45 compute-0 nova_compute[186544]: 2025-11-22 08:12:45.922 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:46 compute-0 nova_compute[186544]: 2025-11-22 08:12:46.061 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:48 compute-0 nova_compute[186544]: 2025-11-22 08:12:48.785 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799153.7842257, b697e0e3-6ab8-4e90-b8e9-e72e362283da => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:12:48 compute-0 nova_compute[186544]: 2025-11-22 08:12:48.785 186548 INFO nova.compute.manager [-] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] VM Stopped (Lifecycle Event)
Nov 22 08:12:48 compute-0 nova_compute[186544]: 2025-11-22 08:12:48.809 186548 DEBUG nova.compute.manager [None req-4dbd408a-0965-4e7c-8491-810db7f60cc5 - - - - - -] [instance: b697e0e3-6ab8-4e90-b8e9-e72e362283da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:12:50 compute-0 nova_compute[186544]: 2025-11-22 08:12:50.924 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:51 compute-0 nova_compute[186544]: 2025-11-22 08:12:51.032 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799156.0318806, 5ad88893-e3b8-416a-8ae6-fc7dfd032f51 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:12:51 compute-0 nova_compute[186544]: 2025-11-22 08:12:51.033 186548 INFO nova.compute.manager [-] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] VM Stopped (Lifecycle Event)
Nov 22 08:12:51 compute-0 nova_compute[186544]: 2025-11-22 08:12:51.061 186548 DEBUG nova.compute.manager [None req-d86b55aa-6cfe-4e06-94bd-405bceb48a03 - - - - - -] [instance: 5ad88893-e3b8-416a-8ae6-fc7dfd032f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:12:51 compute-0 nova_compute[186544]: 2025-11-22 08:12:51.063 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:53 compute-0 podman[237912]: 2025-11-22 08:12:53.40007599 +0000 UTC m=+0.052703136 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:12:55 compute-0 nova_compute[186544]: 2025-11-22 08:12:55.927 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.066 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.174 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.697 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "b8c444a0-4593-4343-9e2c-99f3110cceca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.697 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.713 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.818 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.819 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.840 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.840 186548 INFO nova.compute.claims [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.984 186548 DEBUG nova.compute.provider_tree [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:12:56 compute-0 nova_compute[186544]: 2025-11-22 08:12:56.995 186548 DEBUG nova.scheduler.client.report [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.012 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.013 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.063 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.064 186548 DEBUG nova.network.neutron [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.082 186548 INFO nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.104 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.199 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.200 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.201 186548 INFO nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Creating image(s)
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.202 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "/var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.203 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "/var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.203 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "/var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.216 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.304 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.306 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.307 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.322 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.383 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.384 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:57 compute-0 podman[237936]: 2025-11-22 08:12:57.394733946 +0000 UTC m=+0.047862586 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:12:57 compute-0 podman[237938]: 2025-11-22 08:12:57.402491358 +0000 UTC m=+0.052946072 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Nov 22 08:12:57 compute-0 nova_compute[186544]: 2025-11-22 08:12:57.599 186548 DEBUG nova.policy [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34e68ca06c2b4e039ead3d9c5b44c88c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e2821b0488241e9bc0d864bb3f2ca40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.155 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk 1073741824" returned: 0 in 0.771s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.156 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.849s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.157 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.214 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.216 186548 DEBUG nova.virt.disk.api [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Checking if we can resize image /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.217 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.276 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.278 186548 DEBUG nova.virt.disk.api [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Cannot resize image /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.278 186548 DEBUG nova.objects.instance [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lazy-loading 'migration_context' on Instance uuid b8c444a0-4593-4343-9e2c-99f3110cceca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.292 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.293 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Ensure instance console log exists: /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.293 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.294 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.294 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:58 compute-0 nova_compute[186544]: 2025-11-22 08:12:58.627 186548 DEBUG nova.network.neutron [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Successfully created port: 1e98276d-f701-4c00-b1bb-177abba437fa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.194 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.194 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.194 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.377 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.378 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5737MB free_disk=73.1366958618164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.378 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.378 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.488 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance b8c444a0-4593-4343-9e2c-99f3110cceca actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.489 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.489 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.529 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.543 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.567 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.568 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.667 186548 DEBUG nova.network.neutron [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Successfully updated port: 1e98276d-f701-4c00-b1bb-177abba437fa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.679 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "refresh_cache-b8c444a0-4593-4343-9e2c-99f3110cceca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.679 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquired lock "refresh_cache-b8c444a0-4593-4343-9e2c-99f3110cceca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.679 186548 DEBUG nova.network.neutron [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.806 186548 DEBUG nova.compute.manager [req-41cbb022-11e5-446d-b916-40d5c1ba8906 req-37d57745-5337-4b89-b1f0-9203aa150ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received event network-changed-1e98276d-f701-4c00-b1bb-177abba437fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.806 186548 DEBUG nova.compute.manager [req-41cbb022-11e5-446d-b916-40d5c1ba8906 req-37d57745-5337-4b89-b1f0-9203aa150ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Refreshing instance network info cache due to event network-changed-1e98276d-f701-4c00-b1bb-177abba437fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.807 186548 DEBUG oslo_concurrency.lockutils [req-41cbb022-11e5-446d-b916-40d5c1ba8906 req-37d57745-5337-4b89-b1f0-9203aa150ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-b8c444a0-4593-4343-9e2c-99f3110cceca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:12:59 compute-0 nova_compute[186544]: 2025-11-22 08:12:59.898 186548 DEBUG nova.network.neutron [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.568 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.568 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.569 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.591 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.592 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.592 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.919 186548 DEBUG nova.network.neutron [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Updating instance_info_cache with network_info: [{"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.939 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Releasing lock "refresh_cache-b8c444a0-4593-4343-9e2c-99f3110cceca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.940 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Instance network_info: |[{"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.940 186548 DEBUG oslo_concurrency.lockutils [req-41cbb022-11e5-446d-b916-40d5c1ba8906 req-37d57745-5337-4b89-b1f0-9203aa150ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-b8c444a0-4593-4343-9e2c-99f3110cceca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.940 186548 DEBUG nova.network.neutron [req-41cbb022-11e5-446d-b916-40d5c1ba8906 req-37d57745-5337-4b89-b1f0-9203aa150ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Refreshing network info cache for port 1e98276d-f701-4c00-b1bb-177abba437fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.943 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Start _get_guest_xml network_info=[{"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.946 186548 WARNING nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.951 186548 DEBUG nova.virt.libvirt.host [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.951 186548 DEBUG nova.virt.libvirt.host [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.956 186548 DEBUG nova.virt.libvirt.host [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.957 186548 DEBUG nova.virt.libvirt.host [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.958 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.958 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.959 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.959 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.959 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.959 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.960 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.960 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.960 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.960 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.960 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.961 186548 DEBUG nova.virt.hardware [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.964 186548 DEBUG nova.virt.libvirt.vif [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:12:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-835938536',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-835938536',id=135,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0e2821b0488241e9bc0d864bb3f2ca40',ramdisk_id='',reservation_id='r-yg01o3lk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-1623180383',owner_user_name='tempest-ServerTagsTestJSON-1623180383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:12:57Z,user_data=None,user_id='34e68ca06c2b4e039ead3d9c5b44c88c',uuid=b8c444a0-4593-4343-9e2c-99f3110cceca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.964 186548 DEBUG nova.network.os_vif_util [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Converting VIF {"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.965 186548 DEBUG nova.network.os_vif_util [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:b1,bridge_name='br-int',has_traffic_filtering=True,id=1e98276d-f701-4c00-b1bb-177abba437fa,network=Network(c8226b55-5efa-4d39-8f1e-3f41c9c3a242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e98276d-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.966 186548 DEBUG nova.objects.instance [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lazy-loading 'pci_devices' on Instance uuid b8c444a0-4593-4343-9e2c-99f3110cceca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.978 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <uuid>b8c444a0-4593-4343-9e2c-99f3110cceca</uuid>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <name>instance-00000087</name>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <nova:name>tempest-ServerTagsTestJSON-server-835938536</nova:name>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:13:00</nova:creationTime>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:13:00 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:13:00 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:13:00 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:13:00 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:13:00 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:13:00 compute-0 nova_compute[186544]:         <nova:user uuid="34e68ca06c2b4e039ead3d9c5b44c88c">tempest-ServerTagsTestJSON-1623180383-project-member</nova:user>
Nov 22 08:13:00 compute-0 nova_compute[186544]:         <nova:project uuid="0e2821b0488241e9bc0d864bb3f2ca40">tempest-ServerTagsTestJSON-1623180383</nova:project>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:13:00 compute-0 nova_compute[186544]:         <nova:port uuid="1e98276d-f701-4c00-b1bb-177abba437fa">
Nov 22 08:13:00 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <system>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <entry name="serial">b8c444a0-4593-4343-9e2c-99f3110cceca</entry>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <entry name="uuid">b8c444a0-4593-4343-9e2c-99f3110cceca</entry>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </system>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <os>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   </os>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <features>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   </features>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk.config"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:82:d1:b1"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <target dev="tap1e98276d-f7"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/console.log" append="off"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <video>
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </video>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:13:00 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:13:00 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:13:00 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:13:00 compute-0 nova_compute[186544]: </domain>
Nov 22 08:13:00 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.981 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Preparing to wait for external event network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.981 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.981 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.981 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.982 186548 DEBUG nova.virt.libvirt.vif [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:12:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-835938536',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-835938536',id=135,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0e2821b0488241e9bc0d864bb3f2ca40',ramdisk_id='',reservation_id='r-yg01o3lk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-1623180383',owner_user_name='tempest-ServerTagsTestJSON-1623180383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:12:57Z,user_data=None,user_id='34e68ca06c2b4e039ead3d9c5b44c88c',uuid=b8c444a0-4593-4343-9e2c-99f3110cceca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.982 186548 DEBUG nova.network.os_vif_util [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Converting VIF {"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.983 186548 DEBUG nova.network.os_vif_util [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:b1,bridge_name='br-int',has_traffic_filtering=True,id=1e98276d-f701-4c00-b1bb-177abba437fa,network=Network(c8226b55-5efa-4d39-8f1e-3f41c9c3a242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e98276d-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.983 186548 DEBUG os_vif [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:b1,bridge_name='br-int',has_traffic_filtering=True,id=1e98276d-f701-4c00-b1bb-177abba437fa,network=Network(c8226b55-5efa-4d39-8f1e-3f41c9c3a242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e98276d-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.984 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.984 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.985 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.987 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.987 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e98276d-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.988 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1e98276d-f7, col_values=(('external_ids', {'iface-id': '1e98276d-f701-4c00-b1bb-177abba437fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:d1:b1', 'vm-uuid': 'b8c444a0-4593-4343-9e2c-99f3110cceca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:00 compute-0 NetworkManager[55036]: <info>  [1763799180.9904] manager: (tap1e98276d-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:00 compute-0 nova_compute[186544]: 2025-11-22 08:13:00.996 186548 INFO os_vif [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:b1,bridge_name='br-int',has_traffic_filtering=True,id=1e98276d-f701-4c00-b1bb-177abba437fa,network=Network(c8226b55-5efa-4d39-8f1e-3f41c9c3a242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e98276d-f7')
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.103 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.104 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.104 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] No VIF found with MAC fa:16:3e:82:d1:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.104 186548 INFO nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Using config drive
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.824 186548 INFO nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Creating config drive at /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk.config
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.832 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp885wou7p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:01 compute-0 nova_compute[186544]: 2025-11-22 08:13:01.968 186548 DEBUG oslo_concurrency.processutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp885wou7p" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:02 compute-0 kernel: tap1e98276d-f7: entered promiscuous mode
Nov 22 08:13:02 compute-0 NetworkManager[55036]: <info>  [1763799182.0401] manager: (tap1e98276d-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/284)
Nov 22 08:13:02 compute-0 ovn_controller[94843]: 2025-11-22T08:13:02Z|00615|binding|INFO|Claiming lport 1e98276d-f701-4c00-b1bb-177abba437fa for this chassis.
Nov 22 08:13:02 compute-0 ovn_controller[94843]: 2025-11-22T08:13:02Z|00616|binding|INFO|1e98276d-f701-4c00-b1bb-177abba437fa: Claiming fa:16:3e:82:d1:b1 10.100.0.9
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.041 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.045 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.057 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:d1:b1 10.100.0.9'], port_security=['fa:16:3e:82:d1:b1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b8c444a0-4593-4343-9e2c-99f3110cceca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8226b55-5efa-4d39-8f1e-3f41c9c3a242', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e2821b0488241e9bc0d864bb3f2ca40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9432bae9-cd92-430a-85bc-83427236a34e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=575ca84b-c9be-4d99-b762-044616ad097c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1e98276d-f701-4c00-b1bb-177abba437fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.058 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1e98276d-f701-4c00-b1bb-177abba437fa in datapath c8226b55-5efa-4d39-8f1e-3f41c9c3a242 bound to our chassis
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.060 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8226b55-5efa-4d39-8f1e-3f41c9c3a242
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.071 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0da743b7-a206-47fa-bae0-b0efcb3a1b7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.072 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8226b55-51 in ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.074 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8226b55-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.074 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0dc70b-3b68-4bee-a91b-5475d5259c09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 systemd-udevd[238014]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.075 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf8784e-ffd0-42f9-abcf-7ba3b0ca0bf1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.086 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[7b40f853-ad18-4253-8bcd-2a7ddc8c58ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 systemd-machined[152872]: New machine qemu-76-instance-00000087.
Nov 22 08:13:02 compute-0 NetworkManager[55036]: <info>  [1763799182.0955] device (tap1e98276d-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:13:02 compute-0 NetworkManager[55036]: <info>  [1763799182.0971] device (tap1e98276d-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.100 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bebabaa9-2b34-4104-abb5-c18524d14533]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.101 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:02 compute-0 ovn_controller[94843]: 2025-11-22T08:13:02Z|00617|binding|INFO|Setting lport 1e98276d-f701-4c00-b1bb-177abba437fa ovn-installed in OVS
Nov 22 08:13:02 compute-0 ovn_controller[94843]: 2025-11-22T08:13:02Z|00618|binding|INFO|Setting lport 1e98276d-f701-4c00-b1bb-177abba437fa up in Southbound
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.107 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:02 compute-0 systemd[1]: Started Virtual Machine qemu-76-instance-00000087.
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.131 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[da259ff6-872c-40fa-9bb5-61fe8ff7c283]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.137 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e48571ce-6028-4d05-b437-abcd0c9ecdda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 NetworkManager[55036]: <info>  [1763799182.1380] manager: (tapc8226b55-50): new Veth device (/org/freedesktop/NetworkManager/Devices/285)
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.163 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ec8a00-3a62-4871-a4b4-e4e7bbfc825a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.167 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[77503f8e-b16b-464c-9a51-804121d621a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 NetworkManager[55036]: <info>  [1763799182.1883] device (tapc8226b55-50): carrier: link connected
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.193 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[53055522-3eff-4a1b-8c6c-41b0e4578347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.207 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b35db906-fca8-4427-b3e0-28fb2334a3d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8226b55-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:48:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592282, 'reachable_time': 34924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238047, 'error': None, 'target': 'ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.225 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[819f60eb-3b7f-4443-8625-084bb989fb24]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:481f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 592282, 'tstamp': 592282}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238048, 'error': None, 'target': 'ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.244 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ecdd72-bb70-4ada-b7ab-4d4d2c127f0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8226b55-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:48:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592282, 'reachable_time': 34924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238049, 'error': None, 'target': 'ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.276 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff4742c-deeb-45e4-b205-7f55412120bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.331 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6f320c6b-b054-444c-811a-e5e789a5a4d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.334 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8226b55-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.334 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.335 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8226b55-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:02 compute-0 NetworkManager[55036]: <info>  [1763799182.3380] manager: (tapc8226b55-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.337 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:02 compute-0 kernel: tapc8226b55-50: entered promiscuous mode
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.340 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8226b55-50, col_values=(('external_ids', {'iface-id': '64218c39-1594-47c8-91a7-687dc9697f7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:02 compute-0 ovn_controller[94843]: 2025-11-22T08:13:02Z|00619|binding|INFO|Releasing lport 64218c39-1594-47c8-91a7-687dc9697f7d from this chassis (sb_readonly=0)
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.344 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8226b55-5efa-4d39-8f1e-3f41c9c3a242.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8226b55-5efa-4d39-8f1e-3f41c9c3a242.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.345 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[12f9aebc-71bd-4406-8308-72b6e6f415ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.347 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-c8226b55-5efa-4d39-8f1e-3f41c9c3a242
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/c8226b55-5efa-4d39-8f1e-3f41c9c3a242.pid.haproxy
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID c8226b55-5efa-4d39-8f1e-3f41c9c3a242
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:13:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:02.348 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242', 'env', 'PROCESS_TAG=haproxy-c8226b55-5efa-4d39-8f1e-3f41c9c3a242', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8226b55-5efa-4d39-8f1e-3f41c9c3a242.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.358 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.521 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799182.5210788, b8c444a0-4593-4343-9e2c-99f3110cceca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.522 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] VM Started (Lifecycle Event)
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.546 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.551 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799182.5219126, b8c444a0-4593-4343-9e2c-99f3110cceca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.551 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] VM Paused (Lifecycle Event)
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.570 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.573 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.602 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:13:02 compute-0 podman[238088]: 2025-11-22 08:13:02.690355809 +0000 UTC m=+0.019176166 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.853 186548 DEBUG nova.compute.manager [req-12174dea-7060-4060-a219-ebba766e534f req-b2f58aa2-d434-4f8c-92bd-ee5dea0e8899 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received event network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.854 186548 DEBUG oslo_concurrency.lockutils [req-12174dea-7060-4060-a219-ebba766e534f req-b2f58aa2-d434-4f8c-92bd-ee5dea0e8899 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.855 186548 DEBUG oslo_concurrency.lockutils [req-12174dea-7060-4060-a219-ebba766e534f req-b2f58aa2-d434-4f8c-92bd-ee5dea0e8899 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.855 186548 DEBUG oslo_concurrency.lockutils [req-12174dea-7060-4060-a219-ebba766e534f req-b2f58aa2-d434-4f8c-92bd-ee5dea0e8899 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.855 186548 DEBUG nova.compute.manager [req-12174dea-7060-4060-a219-ebba766e534f req-b2f58aa2-d434-4f8c-92bd-ee5dea0e8899 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Processing event network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.856 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.860 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.860 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799182.8600209, b8c444a0-4593-4343-9e2c-99f3110cceca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.860 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] VM Resumed (Lifecycle Event)
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.864 186548 INFO nova.virt.libvirt.driver [-] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Instance spawned successfully.
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.864 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.886 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.893 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.893 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.894 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.894 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.895 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.895 186548 DEBUG nova.virt.libvirt.driver [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.899 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:13:02 compute-0 nova_compute[186544]: 2025-11-22 08:13:02.930 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:13:03 compute-0 nova_compute[186544]: 2025-11-22 08:13:03.050 186548 INFO nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Took 5.85 seconds to spawn the instance on the hypervisor.
Nov 22 08:13:03 compute-0 nova_compute[186544]: 2025-11-22 08:13:03.050 186548 DEBUG nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:03 compute-0 podman[238088]: 2025-11-22 08:13:03.130451143 +0000 UTC m=+0.459271470 container create 67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:13:03 compute-0 nova_compute[186544]: 2025-11-22 08:13:03.138 186548 DEBUG nova.network.neutron [req-41cbb022-11e5-446d-b916-40d5c1ba8906 req-37d57745-5337-4b89-b1f0-9203aa150ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Updated VIF entry in instance network info cache for port 1e98276d-f701-4c00-b1bb-177abba437fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:13:03 compute-0 nova_compute[186544]: 2025-11-22 08:13:03.139 186548 DEBUG nova.network.neutron [req-41cbb022-11e5-446d-b916-40d5c1ba8906 req-37d57745-5337-4b89-b1f0-9203aa150ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Updating instance_info_cache with network_info: [{"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:13:03 compute-0 nova_compute[186544]: 2025-11-22 08:13:03.150 186548 DEBUG oslo_concurrency.lockutils [req-41cbb022-11e5-446d-b916-40d5c1ba8906 req-37d57745-5337-4b89-b1f0-9203aa150ff3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-b8c444a0-4593-4343-9e2c-99f3110cceca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:13:03 compute-0 systemd[1]: Started libpod-conmon-67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0.scope.
Nov 22 08:13:03 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:13:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf18ee0d958a0fba594c14ad959f4c41ef644144c6b2b16807f2ccc2ddcd05a9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:13:03 compute-0 podman[238088]: 2025-11-22 08:13:03.47004591 +0000 UTC m=+0.798866247 container init 67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:13:03 compute-0 podman[238088]: 2025-11-22 08:13:03.476773437 +0000 UTC m=+0.805593764 container start 67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:13:03 compute-0 neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242[238103]: [NOTICE]   (238107) : New worker (238109) forked
Nov 22 08:13:03 compute-0 neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242[238103]: [NOTICE]   (238107) : Loading success.
Nov 22 08:13:03 compute-0 nova_compute[186544]: 2025-11-22 08:13:03.530 186548 INFO nova.compute.manager [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Took 6.77 seconds to build instance.
Nov 22 08:13:03 compute-0 nova_compute[186544]: 2025-11-22 08:13:03.557 186548 DEBUG oslo_concurrency.lockutils [None req-c35496af-6470-43b9-ac9b-49d7d4a37a23 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:04 compute-0 nova_compute[186544]: 2025-11-22 08:13:04.936 186548 DEBUG nova.compute.manager [req-082a2155-2e0e-41b2-863e-a7930a9ef75c req-bdd2b1f3-c8d7-4d44-b8ae-2ff85198c660 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received event network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:04 compute-0 nova_compute[186544]: 2025-11-22 08:13:04.937 186548 DEBUG oslo_concurrency.lockutils [req-082a2155-2e0e-41b2-863e-a7930a9ef75c req-bdd2b1f3-c8d7-4d44-b8ae-2ff85198c660 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:04 compute-0 nova_compute[186544]: 2025-11-22 08:13:04.937 186548 DEBUG oslo_concurrency.lockutils [req-082a2155-2e0e-41b2-863e-a7930a9ef75c req-bdd2b1f3-c8d7-4d44-b8ae-2ff85198c660 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:04 compute-0 nova_compute[186544]: 2025-11-22 08:13:04.937 186548 DEBUG oslo_concurrency.lockutils [req-082a2155-2e0e-41b2-863e-a7930a9ef75c req-bdd2b1f3-c8d7-4d44-b8ae-2ff85198c660 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:04 compute-0 nova_compute[186544]: 2025-11-22 08:13:04.938 186548 DEBUG nova.compute.manager [req-082a2155-2e0e-41b2-863e-a7930a9ef75c req-bdd2b1f3-c8d7-4d44-b8ae-2ff85198c660 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] No waiting events found dispatching network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:13:04 compute-0 nova_compute[186544]: 2025-11-22 08:13:04.938 186548 WARNING nova.compute.manager [req-082a2155-2e0e-41b2-863e-a7930a9ef75c req-bdd2b1f3-c8d7-4d44-b8ae-2ff85198c660 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received unexpected event network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa for instance with vm_state active and task_state None.
Nov 22 08:13:05 compute-0 nova_compute[186544]: 2025-11-22 08:13:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:05 compute-0 nova_compute[186544]: 2025-11-22 08:13:05.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:05 compute-0 nova_compute[186544]: 2025-11-22 08:13:05.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.952 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "b8c444a0-4593-4343-9e2c-99f3110cceca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.952 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.953 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.953 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.953 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.959 186548 INFO nova.compute.manager [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Terminating instance
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.964 186548 DEBUG nova.compute.manager [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:13:06 compute-0 kernel: tap1e98276d-f7 (unregistering): left promiscuous mode
Nov 22 08:13:06 compute-0 NetworkManager[55036]: <info>  [1763799186.9869] device (tap1e98276d-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.990 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:06 compute-0 ovn_controller[94843]: 2025-11-22T08:13:06Z|00620|binding|INFO|Releasing lport 1e98276d-f701-4c00-b1bb-177abba437fa from this chassis (sb_readonly=0)
Nov 22 08:13:06 compute-0 ovn_controller[94843]: 2025-11-22T08:13:06Z|00621|binding|INFO|Setting lport 1e98276d-f701-4c00-b1bb-177abba437fa down in Southbound
Nov 22 08:13:06 compute-0 ovn_controller[94843]: 2025-11-22T08:13:06Z|00622|binding|INFO|Removing iface tap1e98276d-f7 ovn-installed in OVS
Nov 22 08:13:06 compute-0 nova_compute[186544]: 2025-11-22 08:13:06.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.008 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:d1:b1 10.100.0.9'], port_security=['fa:16:3e:82:d1:b1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b8c444a0-4593-4343-9e2c-99f3110cceca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8226b55-5efa-4d39-8f1e-3f41c9c3a242', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e2821b0488241e9bc0d864bb3f2ca40', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9432bae9-cd92-430a-85bc-83427236a34e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=575ca84b-c9be-4d99-b762-044616ad097c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=1e98276d-f701-4c00-b1bb-177abba437fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.010 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 1e98276d-f701-4c00-b1bb-177abba437fa in datapath c8226b55-5efa-4d39-8f1e-3f41c9c3a242 unbound from our chassis
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.010 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.011 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8226b55-5efa-4d39-8f1e-3f41c9c3a242, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.013 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5e248c-69fe-432a-99ac-7b460225d1e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.014 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242 namespace which is not needed anymore
Nov 22 08:13:07 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000087.scope: Deactivated successfully.
Nov 22 08:13:07 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000087.scope: Consumed 4.498s CPU time.
Nov 22 08:13:07 compute-0 systemd-machined[152872]: Machine qemu-76-instance-00000087 terminated.
Nov 22 08:13:07 compute-0 neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242[238103]: [NOTICE]   (238107) : haproxy version is 2.8.14-c23fe91
Nov 22 08:13:07 compute-0 neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242[238103]: [NOTICE]   (238107) : path to executable is /usr/sbin/haproxy
Nov 22 08:13:07 compute-0 neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242[238103]: [WARNING]  (238107) : Exiting Master process...
Nov 22 08:13:07 compute-0 neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242[238103]: [WARNING]  (238107) : Exiting Master process...
Nov 22 08:13:07 compute-0 neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242[238103]: [ALERT]    (238107) : Current worker (238109) exited with code 143 (Terminated)
Nov 22 08:13:07 compute-0 neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242[238103]: [WARNING]  (238107) : All workers exited. Exiting... (0)
Nov 22 08:13:07 compute-0 systemd[1]: libpod-67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0.scope: Deactivated successfully.
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:07 compute-0 podman[238143]: 2025-11-22 08:13:07.165412338 +0000 UTC m=+0.051936867 container died 67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.172 186548 DEBUG nova.compute.manager [req-696a6159-8f67-4ca0-9ad7-b57892edd7b9 req-42c482eb-0491-42df-8e25-e834967bc57a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received event network-vif-unplugged-1e98276d-f701-4c00-b1bb-177abba437fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.173 186548 DEBUG oslo_concurrency.lockutils [req-696a6159-8f67-4ca0-9ad7-b57892edd7b9 req-42c482eb-0491-42df-8e25-e834967bc57a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.174 186548 DEBUG oslo_concurrency.lockutils [req-696a6159-8f67-4ca0-9ad7-b57892edd7b9 req-42c482eb-0491-42df-8e25-e834967bc57a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.174 186548 DEBUG oslo_concurrency.lockutils [req-696a6159-8f67-4ca0-9ad7-b57892edd7b9 req-42c482eb-0491-42df-8e25-e834967bc57a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.174 186548 DEBUG nova.compute.manager [req-696a6159-8f67-4ca0-9ad7-b57892edd7b9 req-42c482eb-0491-42df-8e25-e834967bc57a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] No waiting events found dispatching network-vif-unplugged-1e98276d-f701-4c00-b1bb-177abba437fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.175 186548 DEBUG nova.compute.manager [req-696a6159-8f67-4ca0-9ad7-b57892edd7b9 req-42c482eb-0491-42df-8e25-e834967bc57a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received event network-vif-unplugged-1e98276d-f701-4c00-b1bb-177abba437fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.251 186548 INFO nova.virt.libvirt.driver [-] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Instance destroyed successfully.
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.252 186548 DEBUG nova.objects.instance [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lazy-loading 'resources' on Instance uuid b8c444a0-4593-4343-9e2c-99f3110cceca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:13:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0-userdata-shm.mount: Deactivated successfully.
Nov 22 08:13:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-cf18ee0d958a0fba594c14ad959f4c41ef644144c6b2b16807f2ccc2ddcd05a9-merged.mount: Deactivated successfully.
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.288 186548 DEBUG nova.virt.libvirt.vif [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:12:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-835938536',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-835938536',id=135,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:13:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0e2821b0488241e9bc0d864bb3f2ca40',ramdisk_id='',reservation_id='r-yg01o3lk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerTagsTestJSON-1623180383',owner_user_name='tempest-ServerTagsTestJSON-1623180383-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:13:03Z,user_data=None,user_id='34e68ca06c2b4e039ead3d9c5b44c88c',uuid=b8c444a0-4593-4343-9e2c-99f3110cceca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.289 186548 DEBUG nova.network.os_vif_util [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Converting VIF {"id": "1e98276d-f701-4c00-b1bb-177abba437fa", "address": "fa:16:3e:82:d1:b1", "network": {"id": "c8226b55-5efa-4d39-8f1e-3f41c9c3a242", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1102188402-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0e2821b0488241e9bc0d864bb3f2ca40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e98276d-f7", "ovs_interfaceid": "1e98276d-f701-4c00-b1bb-177abba437fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.290 186548 DEBUG nova.network.os_vif_util [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:b1,bridge_name='br-int',has_traffic_filtering=True,id=1e98276d-f701-4c00-b1bb-177abba437fa,network=Network(c8226b55-5efa-4d39-8f1e-3f41c9c3a242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e98276d-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.291 186548 DEBUG os_vif [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:b1,bridge_name='br-int',has_traffic_filtering=True,id=1e98276d-f701-4c00-b1bb-177abba437fa,network=Network(c8226b55-5efa-4d39-8f1e-3f41c9c3a242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e98276d-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.293 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.293 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e98276d-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.295 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.297 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.299 186548 INFO os_vif [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:b1,bridge_name='br-int',has_traffic_filtering=True,id=1e98276d-f701-4c00-b1bb-177abba437fa,network=Network(c8226b55-5efa-4d39-8f1e-3f41c9c3a242),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e98276d-f7')
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.300 186548 INFO nova.virt.libvirt.driver [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Deleting instance files /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca_del
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.300 186548 INFO nova.virt.libvirt.driver [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Deletion of /var/lib/nova/instances/b8c444a0-4593-4343-9e2c-99f3110cceca_del complete
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.381 186548 INFO nova.compute.manager [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.382 186548 DEBUG oslo.service.loopingcall [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.382 186548 DEBUG nova.compute.manager [-] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.383 186548 DEBUG nova.network.neutron [-] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:13:07 compute-0 podman[238143]: 2025-11-22 08:13:07.425187998 +0000 UTC m=+0.311712517 container cleanup 67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:13:07 compute-0 systemd[1]: libpod-conmon-67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0.scope: Deactivated successfully.
Nov 22 08:13:07 compute-0 podman[238191]: 2025-11-22 08:13:07.607589573 +0000 UTC m=+0.152998288 container remove 67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.613 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8cefbe12-29dd-476d-a347-2578583deb8e]: (4, ('Sat Nov 22 08:13:07 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242 (67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0)\n67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0\nSat Nov 22 08:13:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242 (67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0)\n67e23b1b5289ebc9ea3bf670e3fd809a54237b44b419f7093585a2c17090e3f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.615 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8d22fb-2e32-410b-8092-9ec3bafef389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.615 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8226b55-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.617 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:07 compute-0 kernel: tapc8226b55-50: left promiscuous mode
Nov 22 08:13:07 compute-0 nova_compute[186544]: 2025-11-22 08:13:07.629 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.632 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[11937220-6982-4511-85c3-7b350e324867]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.687 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[18c0b34d-bffd-4253-a345-c5fcc004231b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.689 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5e15c9-22b8-4e3a-bca7-6631eb52c287]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.702 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[150d35ea-e818-4350-b4dd-b1c4403d1bf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592276, 'reachable_time': 20906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238206, 'error': None, 'target': 'ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.705 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8226b55-5efa-4d39-8f1e-3f41c9c3a242 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:13:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:07.705 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2a9039-82ce-4334-ac21-20e65b8a8160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:07 compute-0 systemd[1]: run-netns-ovnmeta\x2dc8226b55\x2d5efa\x2d4d39\x2d8f1e\x2d3f41c9c3a242.mount: Deactivated successfully.
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.005 186548 DEBUG nova.network.neutron [-] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.022 186548 INFO nova.compute.manager [-] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Took 0.64 seconds to deallocate network for instance.
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.085 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.085 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.146 186548 DEBUG nova.compute.provider_tree [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.159 186548 DEBUG nova.scheduler.client.report [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.180 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.209 186548 INFO nova.scheduler.client.report [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Deleted allocations for instance b8c444a0-4593-4343-9e2c-99f3110cceca
Nov 22 08:13:08 compute-0 nova_compute[186544]: 2025-11-22 08:13:08.275 186548 DEBUG oslo_concurrency.lockutils [None req-339bcc08-d047-4b35-8cb6-49abdf711aa8 34e68ca06c2b4e039ead3d9c5b44c88c 0e2821b0488241e9bc0d864bb3f2ca40 - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:09 compute-0 nova_compute[186544]: 2025-11-22 08:13:09.257 186548 DEBUG nova.compute.manager [req-0ca3b3d8-551e-427d-b9a4-ba8103f70f6e req-a08b316d-682e-4986-b2fd-4ac2efbb9c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received event network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:09 compute-0 nova_compute[186544]: 2025-11-22 08:13:09.257 186548 DEBUG oslo_concurrency.lockutils [req-0ca3b3d8-551e-427d-b9a4-ba8103f70f6e req-a08b316d-682e-4986-b2fd-4ac2efbb9c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:09 compute-0 nova_compute[186544]: 2025-11-22 08:13:09.257 186548 DEBUG oslo_concurrency.lockutils [req-0ca3b3d8-551e-427d-b9a4-ba8103f70f6e req-a08b316d-682e-4986-b2fd-4ac2efbb9c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:09 compute-0 nova_compute[186544]: 2025-11-22 08:13:09.257 186548 DEBUG oslo_concurrency.lockutils [req-0ca3b3d8-551e-427d-b9a4-ba8103f70f6e req-a08b316d-682e-4986-b2fd-4ac2efbb9c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "b8c444a0-4593-4343-9e2c-99f3110cceca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:09 compute-0 nova_compute[186544]: 2025-11-22 08:13:09.258 186548 DEBUG nova.compute.manager [req-0ca3b3d8-551e-427d-b9a4-ba8103f70f6e req-a08b316d-682e-4986-b2fd-4ac2efbb9c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] No waiting events found dispatching network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:13:09 compute-0 nova_compute[186544]: 2025-11-22 08:13:09.258 186548 WARNING nova.compute.manager [req-0ca3b3d8-551e-427d-b9a4-ba8103f70f6e req-a08b316d-682e-4986-b2fd-4ac2efbb9c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received unexpected event network-vif-plugged-1e98276d-f701-4c00-b1bb-177abba437fa for instance with vm_state deleted and task_state None.
Nov 22 08:13:09 compute-0 nova_compute[186544]: 2025-11-22 08:13:09.258 186548 DEBUG nova.compute.manager [req-0ca3b3d8-551e-427d-b9a4-ba8103f70f6e req-a08b316d-682e-4986-b2fd-4ac2efbb9c4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Received event network-vif-deleted-1e98276d-f701-4c00-b1bb-177abba437fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:10 compute-0 nova_compute[186544]: 2025-11-22 08:13:10.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:12 compute-0 nova_compute[186544]: 2025-11-22 08:13:12.295 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:12 compute-0 nova_compute[186544]: 2025-11-22 08:13:12.730 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:12 compute-0 nova_compute[186544]: 2025-11-22 08:13:12.731 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:12 compute-0 nova_compute[186544]: 2025-11-22 08:13:12.755 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:13:12 compute-0 nova_compute[186544]: 2025-11-22 08:13:12.882 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:12 compute-0 nova_compute[186544]: 2025-11-22 08:13:12.883 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:12 compute-0 nova_compute[186544]: 2025-11-22 08:13:12.889 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:13:12 compute-0 nova_compute[186544]: 2025-11-22 08:13:12.890 186548 INFO nova.compute.claims [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.048 186548 DEBUG nova.compute.provider_tree [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.070 186548 DEBUG nova.scheduler.client.report [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.073 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.097 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.097 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.158 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.158 186548 DEBUG nova.network.neutron [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.180 186548 INFO nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.209 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.343 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.344 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.344 186548 INFO nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Creating image(s)
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.345 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "/var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.345 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "/var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.346 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "/var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.357 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:13 compute-0 podman[238208]: 2025-11-22 08:13:13.413151308 +0000 UTC m=+0.054390037 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.422 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:13 compute-0 podman[238209]: 2025-11-22 08:13:13.422877839 +0000 UTC m=+0.057081594 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.423 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.423 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.434 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:13 compute-0 podman[238207]: 2025-11-22 08:13:13.443243324 +0000 UTC m=+0.087119808 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:13:13 compute-0 podman[238210]: 2025-11-22 08:13:13.470609221 +0000 UTC m=+0.104085898 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.478 186548 DEBUG nova.policy [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.491 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:13 compute-0 nova_compute[186544]: 2025-11-22 08:13:13.491 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.261 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk 1073741824" returned: 0 in 0.770s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.262 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.262 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.338 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.339 186548 DEBUG nova.virt.disk.api [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Checking if we can resize image /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.339 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.409 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.410 186548 DEBUG nova.virt.disk.api [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Cannot resize image /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.411 186548 DEBUG nova.objects.instance [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lazy-loading 'migration_context' on Instance uuid 64848f5c-64c9-41ed-9c0d-c2ef3839d5de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.413 186548 DEBUG nova.network.neutron [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Successfully created port: 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.426 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.426 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Ensure instance console log exists: /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.427 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.427 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:14 compute-0 nova_compute[186544]: 2025-11-22 08:13:14.427 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:15 compute-0 nova_compute[186544]: 2025-11-22 08:13:15.936 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.223 186548 DEBUG nova.network.neutron [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Successfully updated port: 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.249 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.250 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquired lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.250 186548 DEBUG nova.network.neutron [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.297 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.533 186548 DEBUG nova.network.neutron [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.870 186548 DEBUG nova.compute.manager [req-bd68067a-3edc-42f4-b87b-acaf82c7f091 req-5aeb6798-ac63-4bf1-ac3f-dfc1848f7586 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received event network-changed-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.870 186548 DEBUG nova.compute.manager [req-bd68067a-3edc-42f4-b87b-acaf82c7f091 req-5aeb6798-ac63-4bf1-ac3f-dfc1848f7586 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Refreshing instance network info cache due to event network-changed-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:13:17 compute-0 nova_compute[186544]: 2025-11-22 08:13:17.870 186548 DEBUG oslo_concurrency.lockutils [req-bd68067a-3edc-42f4-b87b-acaf82c7f091 req-5aeb6798-ac63-4bf1-ac3f-dfc1848f7586 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.910 186548 DEBUG nova.network.neutron [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updating instance_info_cache with network_info: [{"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.929 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Releasing lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.929 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Instance network_info: |[{"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.929 186548 DEBUG oslo_concurrency.lockutils [req-bd68067a-3edc-42f4-b87b-acaf82c7f091 req-5aeb6798-ac63-4bf1-ac3f-dfc1848f7586 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.930 186548 DEBUG nova.network.neutron [req-bd68067a-3edc-42f4-b87b-acaf82c7f091 req-5aeb6798-ac63-4bf1-ac3f-dfc1848f7586 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Refreshing network info cache for port 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.933 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Start _get_guest_xml network_info=[{"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.943 186548 WARNING nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.950 186548 DEBUG nova.virt.libvirt.host [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.951 186548 DEBUG nova.virt.libvirt.host [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.954 186548 DEBUG nova.virt.libvirt.host [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.955 186548 DEBUG nova.virt.libvirt.host [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.956 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.956 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.956 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.957 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.957 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.957 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.957 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.958 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.958 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.958 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.958 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.959 186548 DEBUG nova.virt.hardware [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.962 186548 DEBUG nova.virt.libvirt.vif [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2002217618',display_name='tempest-TestSnapshotPattern-server-2002217618',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2002217618',id=136,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO3zqwL5oCAVcYUK4UfRxRlwiLCpXhyrVibiQXfDMPSmEzdCg2weZeJjjoUlK1vs2o/ZsP7kK+r7TBW2xEMw9M43RfSbbpgfpmDe3/3E/PZ1RgVY0zy+sKDgo7g8yf0esA==',key_name='tempest-TestSnapshotPattern-653067273',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c9016c6b616412fa2db0983e23a8150',ramdisk_id='',reservation_id='r-kg583xko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1254822391',owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:13:13Z,user_data=None,user_id='72df4512d7f245118018df81223ce5ff',uuid=64848f5c-64c9-41ed-9c0d-c2ef3839d5de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.963 186548 DEBUG nova.network.os_vif_util [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converting VIF {"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.964 186548 DEBUG nova.network.os_vif_util [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d2:ad,bridge_name='br-int',has_traffic_filtering=True,id=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56b64f59-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.964 186548 DEBUG nova.objects.instance [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lazy-loading 'pci_devices' on Instance uuid 64848f5c-64c9-41ed-9c0d-c2ef3839d5de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.978 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <uuid>64848f5c-64c9-41ed-9c0d-c2ef3839d5de</uuid>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <name>instance-00000088</name>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <nova:name>tempest-TestSnapshotPattern-server-2002217618</nova:name>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:13:19</nova:creationTime>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:13:19 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:13:19 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:13:19 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:13:19 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:13:19 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:13:19 compute-0 nova_compute[186544]:         <nova:user uuid="72df4512d7f245118018df81223ce5ff">tempest-TestSnapshotPattern-1254822391-project-member</nova:user>
Nov 22 08:13:19 compute-0 nova_compute[186544]:         <nova:project uuid="5c9016c6b616412fa2db0983e23a8150">tempest-TestSnapshotPattern-1254822391</nova:project>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:13:19 compute-0 nova_compute[186544]:         <nova:port uuid="56b64f59-c3ee-40e4-8a5a-42d53b2d04b7">
Nov 22 08:13:19 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <system>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <entry name="serial">64848f5c-64c9-41ed-9c0d-c2ef3839d5de</entry>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <entry name="uuid">64848f5c-64c9-41ed-9c0d-c2ef3839d5de</entry>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </system>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <os>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   </os>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <features>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   </features>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.config"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d3:d2:ad"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <target dev="tap56b64f59-c3"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/console.log" append="off"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <video>
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </video>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:13:19 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:13:19 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:13:19 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:13:19 compute-0 nova_compute[186544]: </domain>
Nov 22 08:13:19 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.979 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Preparing to wait for external event network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.980 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.980 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.980 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.981 186548 DEBUG nova.virt.libvirt.vif [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2002217618',display_name='tempest-TestSnapshotPattern-server-2002217618',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2002217618',id=136,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO3zqwL5oCAVcYUK4UfRxRlwiLCpXhyrVibiQXfDMPSmEzdCg2weZeJjjoUlK1vs2o/ZsP7kK+r7TBW2xEMw9M43RfSbbpgfpmDe3/3E/PZ1RgVY0zy+sKDgo7g8yf0esA==',key_name='tempest-TestSnapshotPattern-653067273',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c9016c6b616412fa2db0983e23a8150',ramdisk_id='',reservation_id='r-kg583xko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1254822391',owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:13:13Z,user_data=None,user_id='72df4512d7f245118018df81223ce5ff',uuid=64848f5c-64c9-41ed-9c0d-c2ef3839d5de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.982 186548 DEBUG nova.network.os_vif_util [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converting VIF {"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.982 186548 DEBUG nova.network.os_vif_util [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d2:ad,bridge_name='br-int',has_traffic_filtering=True,id=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56b64f59-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.983 186548 DEBUG os_vif [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d2:ad,bridge_name='br-int',has_traffic_filtering=True,id=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56b64f59-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.983 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.984 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.984 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.988 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.988 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56b64f59-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.988 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap56b64f59-c3, col_values=(('external_ids', {'iface-id': '56b64f59-c3ee-40e4-8a5a-42d53b2d04b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:d2:ad', 'vm-uuid': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.990 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:19 compute-0 NetworkManager[55036]: <info>  [1763799199.9908] manager: (tap56b64f59-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.996 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:19 compute-0 nova_compute[186544]: 2025-11-22 08:13:19.997 186548 INFO os_vif [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d2:ad,bridge_name='br-int',has_traffic_filtering=True,id=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56b64f59-c3')
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.337 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.338 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.338 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] No VIF found with MAC fa:16:3e:d3:d2:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.338 186548 INFO nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Using config drive
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.709 186548 INFO nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Creating config drive at /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.config
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.715 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdnt1_d_e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.844 186548 DEBUG oslo_concurrency.processutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdnt1_d_e" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:20 compute-0 kernel: tap56b64f59-c3: entered promiscuous mode
Nov 22 08:13:20 compute-0 NetworkManager[55036]: <info>  [1763799200.9080] manager: (tap56b64f59-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/288)
Nov 22 08:13:20 compute-0 ovn_controller[94843]: 2025-11-22T08:13:20Z|00623|binding|INFO|Claiming lport 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 for this chassis.
Nov 22 08:13:20 compute-0 ovn_controller[94843]: 2025-11-22T08:13:20Z|00624|binding|INFO|56b64f59-c3ee-40e4-8a5a-42d53b2d04b7: Claiming fa:16:3e:d3:d2:ad 10.100.0.4
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.908 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.911 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.920 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:d2:ad 10.100.0.4'], port_security=['fa:16:3e:d3:d2:ad 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c9016c6b616412fa2db0983e23a8150', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7420c781-e9c7-4653-97a5-92e76e44aa71', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e7964e9-a04c-4b66-8053-f482dcbb2cee, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.922 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 in datapath 5cbf5083-8d50-44bd-b6ba-93e507a8654e bound to our chassis
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.924 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5cbf5083-8d50-44bd-b6ba-93e507a8654e
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.934 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1cae0cbd-2abb-4a39-b1dc-9f944820d5b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.935 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5cbf5083-81 in ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:13:20 compute-0 systemd-udevd[238324]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.937 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5cbf5083-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.937 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[897ca161-8459-48da-8e41-d8da87572848]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.938 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[46894fd0-8285-42ce-bba8-4ab61327bc72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:20 compute-0 systemd-machined[152872]: New machine qemu-77-instance-00000088.
Nov 22 08:13:20 compute-0 NetworkManager[55036]: <info>  [1763799200.9488] device (tap56b64f59-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:13:20 compute-0 NetworkManager[55036]: <info>  [1763799200.9496] device (tap56b64f59-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.950 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[0570800f-7b67-4034-9211-457325ba9485]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.961 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:20 compute-0 systemd[1]: Started Virtual Machine qemu-77-instance-00000088.
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.967 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.966 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[da7ec557-580a-4a2d-add9-767018569bbb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:20 compute-0 ovn_controller[94843]: 2025-11-22T08:13:20Z|00625|binding|INFO|Setting lport 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 ovn-installed in OVS
Nov 22 08:13:20 compute-0 ovn_controller[94843]: 2025-11-22T08:13:20Z|00626|binding|INFO|Setting lport 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 up in Southbound
Nov 22 08:13:20 compute-0 nova_compute[186544]: 2025-11-22 08:13:20.970 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.994 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[87214919-0209-401b-a775-216310a84da7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:20.999 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[abd01e60-6d1d-4c87-84ff-2ad9eb588b9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 systemd-udevd[238328]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:13:21 compute-0 NetworkManager[55036]: <info>  [1763799201.0011] manager: (tap5cbf5083-80): new Veth device (/org/freedesktop/NetworkManager/Devices/289)
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.032 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d177055e-0803-4b12-8e23-29d289d1fcd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.036 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[cd300dfa-8719-4266-bc6e-8334d3f2afea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 NetworkManager[55036]: <info>  [1763799201.0588] device (tap5cbf5083-80): carrier: link connected
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.064 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1c4a13b9-a0da-434c-a2fe-1f0a6e2ea685]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.083 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[af5d69dc-48c5-459e-9d90-358d4f6c8024]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5cbf5083-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:31:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594169, 'reachable_time': 20019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238357, 'error': None, 'target': 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.097 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a6c9a0-db65-439b-b308-e42a7800afc9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:3134'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594169, 'tstamp': 594169}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238358, 'error': None, 'target': 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.114 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[35ed69f2-4cbc-4e3e-81c3-0a313cae9c75]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5cbf5083-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:31:34'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594169, 'reachable_time': 20019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238360, 'error': None, 'target': 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.145 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c3295a-dcc9-48f2-9ce6-73e466a359ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.204 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9df08ce2-a86d-43cc-b6e2-30d3a65d358c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.205 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5cbf5083-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.206 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.206 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5cbf5083-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:21 compute-0 kernel: tap5cbf5083-80: entered promiscuous mode
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.207 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:21 compute-0 NetworkManager[55036]: <info>  [1763799201.2084] manager: (tap5cbf5083-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.209 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.210 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5cbf5083-80, col_values=(('external_ids', {'iface-id': 'c7997624-ca02-4f3d-814b-acdd3ec0189c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.211 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:21 compute-0 ovn_controller[94843]: 2025-11-22T08:13:21Z|00627|binding|INFO|Releasing lport c7997624-ca02-4f3d-814b-acdd3ec0189c from this chassis (sb_readonly=0)
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.222 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.223 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5cbf5083-8d50-44bd-b6ba-93e507a8654e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5cbf5083-8d50-44bd-b6ba-93e507a8654e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.224 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0351dd-7157-42e7-a8d9-50b4c50d57dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.224 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-5cbf5083-8d50-44bd-b6ba-93e507a8654e
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/5cbf5083-8d50-44bd-b6ba-93e507a8654e.pid.haproxy
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 5cbf5083-8d50-44bd-b6ba-93e507a8654e
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.225 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'env', 'PROCESS_TAG=haproxy-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5cbf5083-8d50-44bd-b6ba-93e507a8654e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.225 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799201.2233639, 64848f5c-64c9-41ed-9c0d-c2ef3839d5de => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.226 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] VM Started (Lifecycle Event)
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.255 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.261 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799201.2235794, 64848f5c-64c9-41ed-9c0d-c2ef3839d5de => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.262 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] VM Paused (Lifecycle Event)
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.284 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.288 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.306 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.373 186548 DEBUG nova.compute.manager [req-2a87d519-0da7-402d-9725-9da33fe57209 req-e0bb2af9-d6a6-4c4d-89ea-2f903a9de032 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received event network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.373 186548 DEBUG oslo_concurrency.lockutils [req-2a87d519-0da7-402d-9725-9da33fe57209 req-e0bb2af9-d6a6-4c4d-89ea-2f903a9de032 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.374 186548 DEBUG oslo_concurrency.lockutils [req-2a87d519-0da7-402d-9725-9da33fe57209 req-e0bb2af9-d6a6-4c4d-89ea-2f903a9de032 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.374 186548 DEBUG oslo_concurrency.lockutils [req-2a87d519-0da7-402d-9725-9da33fe57209 req-e0bb2af9-d6a6-4c4d-89ea-2f903a9de032 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.374 186548 DEBUG nova.compute.manager [req-2a87d519-0da7-402d-9725-9da33fe57209 req-e0bb2af9-d6a6-4c4d-89ea-2f903a9de032 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Processing event network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.375 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.391 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799201.3909616, 64848f5c-64c9-41ed-9c0d-c2ef3839d5de => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.392 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] VM Resumed (Lifecycle Event)
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.394 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.399 186548 INFO nova.virt.libvirt.driver [-] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Instance spawned successfully.
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.400 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.420 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.431 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.437 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.438 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.438 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.439 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.439 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.440 186548 DEBUG nova.virt.libvirt.driver [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.448 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.510 186548 INFO nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Took 8.17 seconds to spawn the instance on the hypervisor.
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.511 186548 DEBUG nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.572 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.572 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.594 186548 INFO nova.compute.manager [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Took 8.74 seconds to build instance.
Nov 22 08:13:21 compute-0 nova_compute[186544]: 2025-11-22 08:13:21.617 186548 DEBUG oslo_concurrency.lockutils [None req-22c711ab-59e1-4aba-b4bd-1c18b9fedffc 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.886s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:21 compute-0 podman[238398]: 2025-11-22 08:13:21.573016854 +0000 UTC m=+0.023779169 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:13:21 compute-0 podman[238398]: 2025-11-22 08:13:21.745408532 +0000 UTC m=+0.196170827 container create 33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 08:13:21 compute-0 systemd[1]: Started libpod-conmon-33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a.scope.
Nov 22 08:13:21 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:13:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8129f6348dfb45d4b21587bf3427b4e472929aa3252894f18839520057cb53d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:13:21 compute-0 podman[238398]: 2025-11-22 08:13:21.826937 +0000 UTC m=+0.277699325 container init 33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:13:21 compute-0 podman[238398]: 2025-11-22 08:13:21.832150269 +0000 UTC m=+0.282912564 container start 33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 08:13:21 compute-0 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[238414]: [NOTICE]   (238418) : New worker (238420) forked
Nov 22 08:13:21 compute-0 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[238414]: [NOTICE]   (238418) : Loading success.
Nov 22 08:13:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:21.895 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:13:22 compute-0 nova_compute[186544]: 2025-11-22 08:13:22.019 186548 DEBUG nova.network.neutron [req-bd68067a-3edc-42f4-b87b-acaf82c7f091 req-5aeb6798-ac63-4bf1-ac3f-dfc1848f7586 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updated VIF entry in instance network info cache for port 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:13:22 compute-0 nova_compute[186544]: 2025-11-22 08:13:22.020 186548 DEBUG nova.network.neutron [req-bd68067a-3edc-42f4-b87b-acaf82c7f091 req-5aeb6798-ac63-4bf1-ac3f-dfc1848f7586 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updating instance_info_cache with network_info: [{"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:13:22 compute-0 nova_compute[186544]: 2025-11-22 08:13:22.034 186548 DEBUG oslo_concurrency.lockutils [req-bd68067a-3edc-42f4-b87b-acaf82c7f091 req-5aeb6798-ac63-4bf1-ac3f-dfc1848f7586 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:13:22 compute-0 nova_compute[186544]: 2025-11-22 08:13:22.249 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799187.247728, b8c444a0-4593-4343-9e2c-99f3110cceca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:13:22 compute-0 nova_compute[186544]: 2025-11-22 08:13:22.249 186548 INFO nova.compute.manager [-] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] VM Stopped (Lifecycle Event)
Nov 22 08:13:22 compute-0 nova_compute[186544]: 2025-11-22 08:13:22.273 186548 DEBUG nova.compute.manager [None req-9582ecd5-0367-415c-9554-f6908e70539c - - - - - -] [instance: b8c444a0-4593-4343-9e2c-99f3110cceca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:23 compute-0 nova_compute[186544]: 2025-11-22 08:13:23.489 186548 DEBUG nova.compute.manager [req-683f5a30-66ac-4aaa-950c-52aa323bfdf0 req-06b43b5b-d22d-458e-9e13-332236db0708 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received event network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:23 compute-0 nova_compute[186544]: 2025-11-22 08:13:23.490 186548 DEBUG oslo_concurrency.lockutils [req-683f5a30-66ac-4aaa-950c-52aa323bfdf0 req-06b43b5b-d22d-458e-9e13-332236db0708 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:23 compute-0 nova_compute[186544]: 2025-11-22 08:13:23.490 186548 DEBUG oslo_concurrency.lockutils [req-683f5a30-66ac-4aaa-950c-52aa323bfdf0 req-06b43b5b-d22d-458e-9e13-332236db0708 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:23 compute-0 nova_compute[186544]: 2025-11-22 08:13:23.490 186548 DEBUG oslo_concurrency.lockutils [req-683f5a30-66ac-4aaa-950c-52aa323bfdf0 req-06b43b5b-d22d-458e-9e13-332236db0708 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:23 compute-0 nova_compute[186544]: 2025-11-22 08:13:23.491 186548 DEBUG nova.compute.manager [req-683f5a30-66ac-4aaa-950c-52aa323bfdf0 req-06b43b5b-d22d-458e-9e13-332236db0708 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] No waiting events found dispatching network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:13:23 compute-0 nova_compute[186544]: 2025-11-22 08:13:23.491 186548 WARNING nova.compute.manager [req-683f5a30-66ac-4aaa-950c-52aa323bfdf0 req-06b43b5b-d22d-458e-9e13-332236db0708 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received unexpected event network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 for instance with vm_state active and task_state None.
Nov 22 08:13:24 compute-0 podman[238429]: 2025-11-22 08:13:24.412230869 +0000 UTC m=+0.056410117 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 08:13:24 compute-0 nova_compute[186544]: 2025-11-22 08:13:24.831 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:24 compute-0 NetworkManager[55036]: <info>  [1763799204.8320] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Nov 22 08:13:24 compute-0 NetworkManager[55036]: <info>  [1763799204.8328] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/292)
Nov 22 08:13:24 compute-0 nova_compute[186544]: 2025-11-22 08:13:24.916 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:24 compute-0 ovn_controller[94843]: 2025-11-22T08:13:24Z|00628|binding|INFO|Releasing lport c7997624-ca02-4f3d-814b-acdd3ec0189c from this chassis (sb_readonly=0)
Nov 22 08:13:24 compute-0 nova_compute[186544]: 2025-11-22 08:13:24.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:24 compute-0 nova_compute[186544]: 2025-11-22 08:13:24.990 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:25 compute-0 nova_compute[186544]: 2025-11-22 08:13:25.624 186548 DEBUG nova.compute.manager [req-0812b0b6-a51a-4324-8cef-09fe95768669 req-d71cf26e-b340-417d-9321-b9314ed70fa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received event network-changed-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:13:25 compute-0 nova_compute[186544]: 2025-11-22 08:13:25.625 186548 DEBUG nova.compute.manager [req-0812b0b6-a51a-4324-8cef-09fe95768669 req-d71cf26e-b340-417d-9321-b9314ed70fa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Refreshing instance network info cache due to event network-changed-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:13:25 compute-0 nova_compute[186544]: 2025-11-22 08:13:25.625 186548 DEBUG oslo_concurrency.lockutils [req-0812b0b6-a51a-4324-8cef-09fe95768669 req-d71cf26e-b340-417d-9321-b9314ed70fa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:13:25 compute-0 nova_compute[186544]: 2025-11-22 08:13:25.625 186548 DEBUG oslo_concurrency.lockutils [req-0812b0b6-a51a-4324-8cef-09fe95768669 req-d71cf26e-b340-417d-9321-b9314ed70fa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:13:25 compute-0 nova_compute[186544]: 2025-11-22 08:13:25.625 186548 DEBUG nova.network.neutron [req-0812b0b6-a51a-4324-8cef-09fe95768669 req-d71cf26e-b340-417d-9321-b9314ed70fa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Refreshing network info cache for port 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:13:25 compute-0 nova_compute[186544]: 2025-11-22 08:13:25.966 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.000 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.163 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.164 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.164 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.164 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.165 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.165 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.201 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.201 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Image id eb6eb4ac-7956-4021-b3a0-d612ae61d38c yields fingerprint 169b85625b85d2ad681b52460a0c196a18b2a726 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.201 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] image eb6eb4ac-7956-4021-b3a0-d612ae61d38c at (/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726): checking
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.202 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] image eb6eb4ac-7956-4021-b3a0-d612ae61d38c at (/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.203 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.204 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.204 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.205 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.266 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.267 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 64848f5c-64c9-41ed-9c0d-c2ef3839d5de is backed by 169b85625b85d2ad681b52460a0c196a18b2a726 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.267 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.268 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.268 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.268 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.268 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.269 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.269 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Active base files: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.269 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Removable base files: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1 /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3 /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5 /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.270 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.270 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.270 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.270 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.271 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.271 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.271 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.271 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.271 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 22 08:13:26 compute-0 ovn_controller[94843]: 2025-11-22T08:13:26Z|00629|binding|INFO|Releasing lport c7997624-ca02-4f3d-814b-acdd3ec0189c from this chassis (sb_readonly=0)
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.438 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.865 186548 DEBUG nova.network.neutron [req-0812b0b6-a51a-4324-8cef-09fe95768669 req-d71cf26e-b340-417d-9321-b9314ed70fa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updated VIF entry in instance network info cache for port 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:13:26 compute-0 nova_compute[186544]: 2025-11-22 08:13:26.866 186548 DEBUG nova.network.neutron [req-0812b0b6-a51a-4324-8cef-09fe95768669 req-d71cf26e-b340-417d-9321-b9314ed70fa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updating instance_info_cache with network_info: [{"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:13:27 compute-0 nova_compute[186544]: 2025-11-22 08:13:27.026 186548 DEBUG oslo_concurrency.lockutils [req-0812b0b6-a51a-4324-8cef-09fe95768669 req-d71cf26e-b340-417d-9321-b9314ed70fa3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:13:28 compute-0 podman[238452]: 2025-11-22 08:13:28.400860788 +0000 UTC m=+0.045484048 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:13:28 compute-0 podman[238453]: 2025-11-22 08:13:28.41956143 +0000 UTC m=+0.060036107 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter)
Nov 22 08:13:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:28.897 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:13:29 compute-0 nova_compute[186544]: 2025-11-22 08:13:29.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:30 compute-0 nova_compute[186544]: 2025-11-22 08:13:30.770 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:30 compute-0 nova_compute[186544]: 2025-11-22 08:13:30.967 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:34 compute-0 nova_compute[186544]: 2025-11-22 08:13:34.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:35 compute-0 nova_compute[186544]: 2025-11-22 08:13:35.969 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:37.340 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:37.341 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:37.341 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:38 compute-0 ovn_controller[94843]: 2025-11-22T08:13:38Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:d2:ad 10.100.0.4
Nov 22 08:13:38 compute-0 ovn_controller[94843]: 2025-11-22T08:13:38Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:d2:ad 10.100.0.4
Nov 22 08:13:39 compute-0 nova_compute[186544]: 2025-11-22 08:13:39.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:40 compute-0 nova_compute[186544]: 2025-11-22 08:13:40.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:43 compute-0 nova_compute[186544]: 2025-11-22 08:13:43.734 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:44 compute-0 podman[238513]: 2025-11-22 08:13:44.40545905 +0000 UTC m=+0.054673565 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 08:13:44 compute-0 podman[238514]: 2025-11-22 08:13:44.410001133 +0000 UTC m=+0.055808753 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:13:44 compute-0 podman[238512]: 2025-11-22 08:13:44.410417193 +0000 UTC m=+0.063852862 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:13:44 compute-0 podman[238520]: 2025-11-22 08:13:44.441867041 +0000 UTC m=+0.085040295 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:13:44 compute-0 nova_compute[186544]: 2025-11-22 08:13:44.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:45 compute-0 nova_compute[186544]: 2025-11-22 08:13:45.973 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:48 compute-0 nova_compute[186544]: 2025-11-22 08:13:48.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:50 compute-0 nova_compute[186544]: 2025-11-22 08:13:49.999 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:50 compute-0 nova_compute[186544]: 2025-11-22 08:13:50.975 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:52 compute-0 nova_compute[186544]: 2025-11-22 08:13:52.723 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:54.297 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:a2:f4 2001:db8:0:1:f816:3eff:fe7e:a2f4 2001:db8::f816:3eff:fe7e:a2f4'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe7e:a2f4/64 2001:db8::f816:3eff:fe7e:a2f4/64', 'neutron:device_id': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f6eb0a2-d476-48e9-8756-79e6bbc84c15, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=433cf940-3b59-425c-aeb8-689a57de46c2) old=Port_Binding(mac=['fa:16:3e:7e:a2:f4 2001:db8::f816:3eff:fe7e:a2f4'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe7e:a2f4/64', 'neutron:device_id': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:13:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:54.299 103805 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 433cf940-3b59-425c-aeb8-689a57de46c2 in datapath 6c0a2255-6426-43c4-abc3-5c1857ba0a79 updated
Nov 22 08:13:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:54.300 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c0a2255-6426-43c4-abc3-5c1857ba0a79, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:13:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:54.301 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[55cf1355-380c-498a-8234-bb1514ea3cca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:13:55 compute-0 nova_compute[186544]: 2025-11-22 08:13:55.001 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:55 compute-0 podman[238596]: 2025-11-22 08:13:55.41550051 +0000 UTC m=+0.065466582 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 08:13:55 compute-0 nova_compute[186544]: 2025-11-22 08:13:55.979 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:58 compute-0 nova_compute[186544]: 2025-11-22 08:13:58.266 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:58.691 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:13:58 compute-0 nova_compute[186544]: 2025-11-22 08:13:58.691 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:13:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:13:58.692 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.197 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.283 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:59 compute-0 podman[238615]: 2025-11-22 08:13:59.311120506 +0000 UTC m=+0.059683698 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:13:59 compute-0 podman[238617]: 2025-11-22 08:13:59.318123899 +0000 UTC m=+0.063774799 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.348 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.349 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.411 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.639 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.641 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5567MB free_disk=73.10724639892578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.642 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.642 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.748 186548 DEBUG nova.compute.manager [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.826 186548 INFO nova.compute.manager [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] instance snapshotting
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.903 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 64848f5c-64c9-41ed-9c0d-c2ef3839d5de actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.904 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:13:59 compute-0 nova_compute[186544]: 2025-11-22 08:13:59.904 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:14:00 compute-0 nova_compute[186544]: 2025-11-22 08:14:00.003 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:00 compute-0 nova_compute[186544]: 2025-11-22 08:14:00.027 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:14:00 compute-0 nova_compute[186544]: 2025-11-22 08:14:00.045 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:14:00 compute-0 nova_compute[186544]: 2025-11-22 08:14:00.173 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:14:00 compute-0 nova_compute[186544]: 2025-11-22 08:14:00.174 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:00.694 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:00 compute-0 nova_compute[186544]: 2025-11-22 08:14:00.978 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.008 186548 INFO nova.virt.libvirt.driver [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Beginning live snapshot process
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.174 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.175 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.175 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:14:01 compute-0 virtqemud[186092]: invalid argument: disk vda does not have an active block job
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.184 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.205 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.205 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 64848f5c-64c9-41ed-9c0d-c2ef3839d5de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.243 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.244 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.314 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json -f qcow2" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.329 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.389 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.390 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpjoujd27g/8a139d4efa6443819a3b43a4e1c99880.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.458 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpjoujd27g/8a139d4efa6443819a3b43a4e1c99880.delta 1073741824" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.459 186548 INFO nova.virt.libvirt.driver [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 22 08:14:01 compute-0 nova_compute[186544]: 2025-11-22 08:14:01.526 186548 DEBUG nova.virt.libvirt.guest [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.030 186548 DEBUG nova.virt.libvirt.guest [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] COPY block job progress, current cursor: 21626880 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.180 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.180 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.201 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.335 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.335 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.342 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.343 186548 INFO nova.compute.claims [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.528 186548 DEBUG nova.compute.provider_tree [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.533 186548 DEBUG nova.virt.libvirt.guest [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.536 186548 INFO nova.virt.libvirt.driver [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.544 186548 DEBUG nova.scheduler.client.report [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.565 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.566 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.623 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.624 186548 DEBUG nova.network.neutron [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.630 186548 DEBUG nova.privsep.utils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.631 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpjoujd27g/8a139d4efa6443819a3b43a4e1c99880.delta /var/lib/nova/instances/snapshots/tmpjoujd27g/8a139d4efa6443819a3b43a4e1c99880 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.654 186548 INFO nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.672 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.769 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.771 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.771 186548 INFO nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Creating image(s)
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.772 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.772 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.773 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.789 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.861 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.862 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.863 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.876 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.905 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updating instance_info_cache with network_info: [{"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.920 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.921 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.921 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.922 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.922 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.924 186548 DEBUG nova.policy [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.940 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:02 compute-0 nova_compute[186544]: 2025-11-22 08:14:02.940 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.072 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk 1073741824" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.073 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.074 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.137 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.138 186548 DEBUG nova.virt.disk.api [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.138 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.202 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.203 186548 DEBUG nova.virt.disk.api [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.203 186548 DEBUG nova.objects.instance [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid ddf94a98-572f-4116-87fb-d46dc5f72174 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.215 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.216 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Ensure instance console log exists: /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.217 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.217 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.217 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.457 186548 DEBUG oslo_concurrency.processutils [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpjoujd27g/8a139d4efa6443819a3b43a4e1c99880.delta /var/lib/nova/instances/snapshots/tmpjoujd27g/8a139d4efa6443819a3b43a4e1c99880" returned: 0 in 0.826s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:03 compute-0 nova_compute[186544]: 2025-11-22 08:14:03.463 186548 INFO nova.virt.libvirt.driver [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Snapshot extracted, beginning image upload
Nov 22 08:14:04 compute-0 nova_compute[186544]: 2025-11-22 08:14:04.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:04 compute-0 nova_compute[186544]: 2025-11-22 08:14:04.216 186548 DEBUG nova.network.neutron [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Successfully created port: f7a27b23-e94c-4aef-9b08-6d0ddb100265 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:14:04 compute-0 nova_compute[186544]: 2025-11-22 08:14:04.802 186548 DEBUG nova.network.neutron [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Successfully created port: d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:14:05 compute-0 nova_compute[186544]: 2025-11-22 08:14:05.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:05 compute-0 nova_compute[186544]: 2025-11-22 08:14:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:05 compute-0 nova_compute[186544]: 2025-11-22 08:14:05.919 186548 DEBUG nova.network.neutron [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Successfully updated port: f7a27b23-e94c-4aef-9b08-6d0ddb100265 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:14:05 compute-0 nova_compute[186544]: 2025-11-22 08:14:05.981 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:06 compute-0 nova_compute[186544]: 2025-11-22 08:14:06.607 186548 INFO nova.virt.libvirt.driver [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Snapshot image upload complete
Nov 22 08:14:06 compute-0 nova_compute[186544]: 2025-11-22 08:14:06.608 186548 INFO nova.compute.manager [None req-f935ee32-9417-4f0e-98d3-e4ea90507b0f 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Took 6.77 seconds to snapshot the instance on the hypervisor.
Nov 22 08:14:06 compute-0 nova_compute[186544]: 2025-11-22 08:14:06.981 186548 DEBUG nova.network.neutron [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Successfully updated port: d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:14:06 compute-0 nova_compute[186544]: 2025-11-22 08:14:06.993 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:14:06 compute-0 nova_compute[186544]: 2025-11-22 08:14:06.994 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:14:06 compute-0 nova_compute[186544]: 2025-11-22 08:14:06.994 186548 DEBUG nova.network.neutron [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:14:07 compute-0 nova_compute[186544]: 2025-11-22 08:14:07.092 186548 DEBUG nova.compute.manager [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-changed-f7a27b23-e94c-4aef-9b08-6d0ddb100265 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:14:07 compute-0 nova_compute[186544]: 2025-11-22 08:14:07.093 186548 DEBUG nova.compute.manager [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Refreshing instance network info cache due to event network-changed-f7a27b23-e94c-4aef-9b08-6d0ddb100265. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:14:07 compute-0 nova_compute[186544]: 2025-11-22 08:14:07.093 186548 DEBUG oslo_concurrency.lockutils [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:14:07 compute-0 nova_compute[186544]: 2025-11-22 08:14:07.179 186548 DEBUG nova.network.neutron [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:14:08 compute-0 nova_compute[186544]: 2025-11-22 08:14:08.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.327 186548 DEBUG nova.network.neutron [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updating instance_info_cache with network_info: [{"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.356 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.356 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Instance network_info: |[{"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.357 186548 DEBUG oslo_concurrency.lockutils [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.357 186548 DEBUG nova.network.neutron [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Refreshing network info cache for port f7a27b23-e94c-4aef-9b08-6d0ddb100265 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.361 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Start _get_guest_xml network_info=[{"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.365 186548 WARNING nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.369 186548 DEBUG nova.virt.libvirt.host [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.370 186548 DEBUG nova.virt.libvirt.host [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.372 186548 DEBUG nova.virt.libvirt.host [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.373 186548 DEBUG nova.virt.libvirt.host [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.374 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.374 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.374 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.374 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.375 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.375 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.375 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.375 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.376 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.376 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.376 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.376 186548 DEBUG nova.virt.hardware [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.379 186548 DEBUG nova.virt.libvirt.vif [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-585014751',display_name='tempest-TestGettingAddress-server-585014751',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-585014751',id=139,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-00mzxd97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:02Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=ddf94a98-572f-4116-87fb-d46dc5f72174,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.380 186548 DEBUG nova.network.os_vif_util [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.380 186548 DEBUG nova.network.os_vif_util [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:d3:6f,bridge_name='br-int',has_traffic_filtering=True,id=f7a27b23-e94c-4aef-9b08-6d0ddb100265,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a27b23-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.381 186548 DEBUG nova.virt.libvirt.vif [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-585014751',display_name='tempest-TestGettingAddress-server-585014751',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-585014751',id=139,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-00mzxd97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:02Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=ddf94a98-572f-4116-87fb-d46dc5f72174,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.381 186548 DEBUG nova.network.os_vif_util [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.382 186548 DEBUG nova.network.os_vif_util [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:7f:07,bridge_name='br-int',has_traffic_filtering=True,id=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd867b5ab-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.383 186548 DEBUG nova.objects.instance [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid ddf94a98-572f-4116-87fb-d46dc5f72174 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.398 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <uuid>ddf94a98-572f-4116-87fb-d46dc5f72174</uuid>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <name>instance-0000008b</name>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <nova:name>tempest-TestGettingAddress-server-585014751</nova:name>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:14:09</nova:creationTime>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:port uuid="f7a27b23-e94c-4aef-9b08-6d0ddb100265">
Nov 22 08:14:09 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         <nova:port uuid="d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3">
Nov 22 08:14:09 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe04:7f07" ipVersion="6"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe04:7f07" ipVersion="6"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <system>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <entry name="serial">ddf94a98-572f-4116-87fb-d46dc5f72174</entry>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <entry name="uuid">ddf94a98-572f-4116-87fb-d46dc5f72174</entry>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </system>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <os>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   </os>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <features>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   </features>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk.config"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:1a:d3:6f"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <target dev="tapf7a27b23-e9"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:04:7f:07"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <target dev="tapd867b5ab-df"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/console.log" append="off"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <video>
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </video>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:14:09 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:14:09 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:14:09 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:14:09 compute-0 nova_compute[186544]: </domain>
Nov 22 08:14:09 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.399 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Preparing to wait for external event network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.399 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.399 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.400 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.400 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Preparing to wait for external event network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.400 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.400 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.400 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.401 186548 DEBUG nova.virt.libvirt.vif [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-585014751',display_name='tempest-TestGettingAddress-server-585014751',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-585014751',id=139,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-00mzxd97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:02Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=ddf94a98-572f-4116-87fb-d46dc5f72174,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.401 186548 DEBUG nova.network.os_vif_util [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.402 186548 DEBUG nova.network.os_vif_util [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:d3:6f,bridge_name='br-int',has_traffic_filtering=True,id=f7a27b23-e94c-4aef-9b08-6d0ddb100265,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a27b23-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.402 186548 DEBUG os_vif [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:d3:6f,bridge_name='br-int',has_traffic_filtering=True,id=f7a27b23-e94c-4aef-9b08-6d0ddb100265,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a27b23-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.403 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.403 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.407 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a27b23-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.407 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7a27b23-e9, col_values=(('external_ids', {'iface-id': 'f7a27b23-e94c-4aef-9b08-6d0ddb100265', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1a:d3:6f', 'vm-uuid': 'ddf94a98-572f-4116-87fb-d46dc5f72174'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:09 compute-0 NetworkManager[55036]: <info>  [1763799249.4094] manager: (tapf7a27b23-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.411 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.415 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.416 186548 INFO os_vif [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:d3:6f,bridge_name='br-int',has_traffic_filtering=True,id=f7a27b23-e94c-4aef-9b08-6d0ddb100265,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a27b23-e9')
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.416 186548 DEBUG nova.virt.libvirt.vif [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:13:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-585014751',display_name='tempest-TestGettingAddress-server-585014751',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-585014751',id=139,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-00mzxd97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:14:02Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=ddf94a98-572f-4116-87fb-d46dc5f72174,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.417 186548 DEBUG nova.network.os_vif_util [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.417 186548 DEBUG nova.network.os_vif_util [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:7f:07,bridge_name='br-int',has_traffic_filtering=True,id=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd867b5ab-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.418 186548 DEBUG os_vif [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:7f:07,bridge_name='br-int',has_traffic_filtering=True,id=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd867b5ab-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.418 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.418 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.419 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.420 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.421 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd867b5ab-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.421 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd867b5ab-df, col_values=(('external_ids', {'iface-id': 'd867b5ab-dfbe-4189-8dbb-ed5c62d72ad3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:7f:07', 'vm-uuid': 'ddf94a98-572f-4116-87fb-d46dc5f72174'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:09 compute-0 NetworkManager[55036]: <info>  [1763799249.4232] manager: (tapd867b5ab-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.425 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.428 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.429 186548 INFO os_vif [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:7f:07,bridge_name='br-int',has_traffic_filtering=True,id=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd867b5ab-df')
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.551 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.551 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.552 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:1a:d3:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.552 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:04:7f:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:14:09 compute-0 nova_compute[186544]: 2025-11-22 08:14:09.552 186548 INFO nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Using config drive
Nov 22 08:14:10 compute-0 nova_compute[186544]: 2025-11-22 08:14:10.907 186548 INFO nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Creating config drive at /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk.config
Nov 22 08:14:10 compute-0 nova_compute[186544]: 2025-11-22 08:14:10.912 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7qgut2_q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:14:10 compute-0 nova_compute[186544]: 2025-11-22 08:14:10.983 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.035 186548 DEBUG oslo_concurrency.processutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7qgut2_q" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:14:11 compute-0 kernel: tapf7a27b23-e9: entered promiscuous mode
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.0909] manager: (tapf7a27b23-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/295)
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00630|binding|INFO|Claiming lport f7a27b23-e94c-4aef-9b08-6d0ddb100265 for this chassis.
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00631|binding|INFO|f7a27b23-e94c-4aef-9b08-6d0ddb100265: Claiming fa:16:3e:1a:d3:6f 10.100.0.14
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.109 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:d3:6f 10.100.0.14'], port_security=['fa:16:3e:1a:d3:6f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e573664-04ba-4ce5-994a-9fb9483a2400', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bc613044-b796-41b5-a7b0-c508f998d641', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da796c20-96a3-420c-a9ae-3320426db7c7, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=f7a27b23-e94c-4aef-9b08-6d0ddb100265) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.110 103805 INFO neutron.agent.ovn.metadata.agent [-] Port f7a27b23-e94c-4aef-9b08-6d0ddb100265 in datapath 7e573664-04ba-4ce5-994a-9fb9483a2400 bound to our chassis
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.1121] manager: (tapd867b5ab-df): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Nov 22 08:14:11 compute-0 kernel: tapd867b5ab-df: entered promiscuous mode
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.112 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7e573664-04ba-4ce5-994a-9fb9483a2400
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00632|binding|INFO|Setting lport f7a27b23-e94c-4aef-9b08-6d0ddb100265 ovn-installed in OVS
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00633|binding|INFO|Setting lport f7a27b23-e94c-4aef-9b08-6d0ddb100265 up in Southbound
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00634|if_status|INFO|Not updating pb chassis for d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 now as sb is readonly
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.122 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9b3516-d30e-4264-b529-70000e54cc07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.123 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7e573664-01 in ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:14:11 compute-0 systemd-udevd[238742]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.126 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7e573664-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.126 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[62a0a89e-7dd4-42e5-a4fa-f9919fa086c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 systemd-udevd[238743]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.127 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c891d285-88b3-4021-ac21-8689aff9ae26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00635|binding|INFO|Claiming lport d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 for this chassis.
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00636|binding|INFO|d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3: Claiming fa:16:3e:04:7f:07 2001:db8:0:1:f816:3eff:fe04:7f07 2001:db8::f816:3eff:fe04:7f07
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00637|binding|INFO|Setting lport d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 ovn-installed in OVS
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.131 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.1408] device (tapf7a27b23-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00638|binding|INFO|Setting lport d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 up in Southbound
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.1419] device (tapf7a27b23-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.141 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[65bf4526-cedc-49a1-92a1-124e72a359b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.1428] device (tapd867b5ab-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.1439] device (tapd867b5ab-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.147 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:7f:07 2001:db8:0:1:f816:3eff:fe04:7f07 2001:db8::f816:3eff:fe04:7f07'], port_security=['fa:16:3e:04:7f:07 2001:db8:0:1:f816:3eff:fe04:7f07 2001:db8::f816:3eff:fe04:7f07'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe04:7f07/64 2001:db8::f816:3eff:fe04:7f07/64', 'neutron:device_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bc613044-b796-41b5-a7b0-c508f998d641', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f6eb0a2-d476-48e9-8756-79e6bbc84c15, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:14:11 compute-0 systemd-machined[152872]: New machine qemu-78-instance-0000008b.
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.154 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9788f775-3aa5-4f0c-aca5-8b713b0b99d0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 systemd[1]: Started Virtual Machine qemu-78-instance-0000008b.
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.183 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e11b076b-2e96-4fce-abf3-83c15a877eda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.1899] manager: (tap7e573664-00): new Veth device (/org/freedesktop/NetworkManager/Devices/297)
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.189 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[84631385-451f-44c5-bf24-3d7309a7b814]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.219 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a6eae5af-b470-43d0-bb13-71af9eb3e13a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.223 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa8416c-9f1f-4712-8234-535e4476bab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.2453] device (tap7e573664-00): carrier: link connected
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.253 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3e865e-a7ab-41e9-81df-cf86cb6dd22e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.270 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[708f1d77-bc64-4da9-bbdc-82eaa45d8585]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e573664-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:03:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599188, 'reachable_time': 16462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238778, 'error': None, 'target': 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.287 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[549a36c1-a75b-495e-988b-31533fa1609e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:312'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 599188, 'tstamp': 599188}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238779, 'error': None, 'target': 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.303 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1abdc055-558a-4b9e-895b-ab85f82fc6bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7e573664-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:03:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599188, 'reachable_time': 16462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238780, 'error': None, 'target': 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.334 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa62d0f-fc3d-41d9-adfb-951784287333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.391 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8ba85e-abaa-4bbc-847c-9cbbb210cec6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.392 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e573664-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.392 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.393 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e573664-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:11 compute-0 NetworkManager[55036]: <info>  [1763799251.3958] manager: (tap7e573664-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.395 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 kernel: tap7e573664-00: entered promiscuous mode
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.398 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7e573664-00, col_values=(('external_ids', {'iface-id': '32debc95-1c60-4d8b-9d74-79ae74c8f38f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:11 compute-0 ovn_controller[94843]: 2025-11-22T08:14:11Z|00639|binding|INFO|Releasing lport 32debc95-1c60-4d8b-9d74-79ae74c8f38f from this chassis (sb_readonly=0)
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.399 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.401 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.402 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7e573664-04ba-4ce5-994a-9fb9483a2400.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7e573664-04ba-4ce5-994a-9fb9483a2400.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.403 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a1329e6a-000d-42d0-b97b-df6f3d6a8ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.404 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-7e573664-04ba-4ce5-994a-9fb9483a2400
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/7e573664-04ba-4ce5-994a-9fb9483a2400.pid.haproxy
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 7e573664-04ba-4ce5-994a-9fb9483a2400
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:14:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:11.404 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'env', 'PROCESS_TAG=haproxy-7e573664-04ba-4ce5-994a-9fb9483a2400', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7e573664-04ba-4ce5-994a-9fb9483a2400.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.413 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.485 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799251.484236, ddf94a98-572f-4116-87fb-d46dc5f72174 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.485 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] VM Started (Lifecycle Event)
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.521 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.525 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799251.4845638, ddf94a98-572f-4116-87fb-d46dc5f72174 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.525 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] VM Paused (Lifecycle Event)
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.566 186548 DEBUG nova.compute.manager [req-e7972b6b-63a7-481b-9db9-07a5feb50bb2 req-a845307c-972c-4796-839f-2ddbcc272840 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.566 186548 DEBUG oslo_concurrency.lockutils [req-e7972b6b-63a7-481b-9db9-07a5feb50bb2 req-a845307c-972c-4796-839f-2ddbcc272840 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.567 186548 DEBUG oslo_concurrency.lockutils [req-e7972b6b-63a7-481b-9db9-07a5feb50bb2 req-a845307c-972c-4796-839f-2ddbcc272840 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.567 186548 DEBUG oslo_concurrency.lockutils [req-e7972b6b-63a7-481b-9db9-07a5feb50bb2 req-a845307c-972c-4796-839f-2ddbcc272840 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.567 186548 DEBUG nova.compute.manager [req-e7972b6b-63a7-481b-9db9-07a5feb50bb2 req-a845307c-972c-4796-839f-2ddbcc272840 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Processing event network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.578 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.581 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.594 186548 DEBUG nova.compute.manager [req-f6d4871a-7879-4ac6-a986-d7376afe5a3b req-25575660-1b54-4dfd-89f1-5826b53c2ad1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.594 186548 DEBUG oslo_concurrency.lockutils [req-f6d4871a-7879-4ac6-a986-d7376afe5a3b req-25575660-1b54-4dfd-89f1-5826b53c2ad1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.594 186548 DEBUG oslo_concurrency.lockutils [req-f6d4871a-7879-4ac6-a986-d7376afe5a3b req-25575660-1b54-4dfd-89f1-5826b53c2ad1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.594 186548 DEBUG oslo_concurrency.lockutils [req-f6d4871a-7879-4ac6-a986-d7376afe5a3b req-25575660-1b54-4dfd-89f1-5826b53c2ad1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.595 186548 DEBUG nova.compute.manager [req-f6d4871a-7879-4ac6-a986-d7376afe5a3b req-25575660-1b54-4dfd-89f1-5826b53c2ad1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Processing event network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.595 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.598 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.604 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.604 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799251.5978136, ddf94a98-572f-4116-87fb-d46dc5f72174 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.605 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] VM Resumed (Lifecycle Event)
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.608 186548 INFO nova.virt.libvirt.driver [-] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Instance spawned successfully.
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.609 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.617 186548 DEBUG nova.network.neutron [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updated VIF entry in instance network info cache for port f7a27b23-e94c-4aef-9b08-6d0ddb100265. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.617 186548 DEBUG nova.network.neutron [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updating instance_info_cache with network_info: [{"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.623 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.625 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.672 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.673 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.673 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.673 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.674 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.674 186548 DEBUG nova.virt.libvirt.driver [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.734 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.748 186548 DEBUG oslo_concurrency.lockutils [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.748 186548 DEBUG nova.compute.manager [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-changed-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.748 186548 DEBUG nova.compute.manager [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Refreshing instance network info cache due to event network-changed-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.749 186548 DEBUG oslo_concurrency.lockutils [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.749 186548 DEBUG oslo_concurrency.lockutils [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.749 186548 DEBUG nova.network.neutron [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Refreshing network info cache for port d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:14:11 compute-0 podman[238821]: 2025-11-22 08:14:11.738550314 +0000 UTC m=+0.023196455 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.866 186548 INFO nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Took 9.10 seconds to spawn the instance on the hypervisor.
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.867 186548 DEBUG nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:14:11 compute-0 podman[238821]: 2025-11-22 08:14:11.887514322 +0000 UTC m=+0.172160443 container create 4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 08:14:11 compute-0 systemd[1]: Started libpod-conmon-4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b.scope.
Nov 22 08:14:11 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:14:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8272f7621a95407b159d2d2e869c23e3a1a1d2942646f9986573be1d6048143c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:14:11 compute-0 nova_compute[186544]: 2025-11-22 08:14:11.993 186548 INFO nova.compute.manager [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Took 9.69 seconds to build instance.
Nov 22 08:14:12 compute-0 podman[238821]: 2025-11-22 08:14:12.006568269 +0000 UTC m=+0.291214420 container init 4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 08:14:12 compute-0 podman[238821]: 2025-11-22 08:14:12.014197058 +0000 UTC m=+0.298843179 container start 4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:14:12 compute-0 nova_compute[186544]: 2025-11-22 08:14:12.026 186548 DEBUG oslo_concurrency.lockutils [None req-a3870303-0b5a-4b4b-8f17-997063b1c1a3 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:12 compute-0 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[238837]: [NOTICE]   (238841) : New worker (238843) forked
Nov 22 08:14:12 compute-0 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[238837]: [NOTICE]   (238841) : Loading success.
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.083 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 in datapath 6c0a2255-6426-43c4-abc3-5c1857ba0a79 unbound from our chassis
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.085 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c0a2255-6426-43c4-abc3-5c1857ba0a79
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.095 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1859648a-f62c-4cf9-9a5c-200e0b6092f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.096 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c0a2255-61 in ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.097 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c0a2255-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.097 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3673a0bc-8648-477b-8aa2-02a054a31863]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.098 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[23be670e-faa6-461f-b47d-a2ec7e2260c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.109 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[1649ec36-c572-4eaf-9c1a-b8769f50e847]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.134 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2470fb88-c23a-47a1-9831-e7b126db37a8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.159 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[796248a1-0e3f-4c89-b9d7-9f2582a8db93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 NetworkManager[55036]: <info>  [1763799252.1679] manager: (tap6c0a2255-60): new Veth device (/org/freedesktop/NetworkManager/Devices/299)
Nov 22 08:14:12 compute-0 systemd-udevd[238764]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.169 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[45aa930d-c5d4-46f6-94bb-770a32bef021]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.199 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a350dd-785a-4267-9dc3-a6269c677bf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.203 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[700b7a5a-6a5d-4dc8-b5ac-85213677d35b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 NetworkManager[55036]: <info>  [1763799252.2249] device (tap6c0a2255-60): carrier: link connected
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.229 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[68034427-335c-4eb7-9a8f-3c94497eea46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.245 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[00689475-64b8-4358-ab12-cf3bc8374b73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c0a2255-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:a2:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 197], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599286, 'reachable_time': 21644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238862, 'error': None, 'target': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.260 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e6435f2d-52b2-4bdd-ab29-2adc16640d5e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:a2f4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 599286, 'tstamp': 599286}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238863, 'error': None, 'target': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.277 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f12dd526-f42b-49bf-9d45-cb9012475204]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c0a2255-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:a2:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 197], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599286, 'reachable_time': 21644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238864, 'error': None, 'target': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.305 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[65dc8c24-743e-46f0-8631-dd824e04c0a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.335 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3e51445d-7901-489f-ba81-c6b053998ee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.336 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c0a2255-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.337 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.337 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c0a2255-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:12 compute-0 nova_compute[186544]: 2025-11-22 08:14:12.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:12 compute-0 kernel: tap6c0a2255-60: entered promiscuous mode
Nov 22 08:14:12 compute-0 NetworkManager[55036]: <info>  [1763799252.3399] manager: (tap6c0a2255-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.342 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c0a2255-60, col_values=(('external_ids', {'iface-id': '433cf940-3b59-425c-aeb8-689a57de46c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:12 compute-0 nova_compute[186544]: 2025-11-22 08:14:12.342 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:12 compute-0 nova_compute[186544]: 2025-11-22 08:14:12.345 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:12 compute-0 ovn_controller[94843]: 2025-11-22T08:14:12Z|00640|binding|INFO|Releasing lport 433cf940-3b59-425c-aeb8-689a57de46c2 from this chassis (sb_readonly=0)
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.345 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c0a2255-6426-43c4-abc3-5c1857ba0a79.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c0a2255-6426-43c4-abc3-5c1857ba0a79.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.347 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd8af1f-cb27-41d3-888e-9327c56fb050]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.348 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-6c0a2255-6426-43c4-abc3-5c1857ba0a79
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/6c0a2255-6426-43c4-abc3-5c1857ba0a79.pid.haproxy
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 6c0a2255-6426-43c4-abc3-5c1857ba0a79
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:14:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:12.348 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'env', 'PROCESS_TAG=haproxy-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c0a2255-6426-43c4-abc3-5c1857ba0a79.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:14:12 compute-0 nova_compute[186544]: 2025-11-22 08:14:12.360 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:12 compute-0 podman[238894]: 2025-11-22 08:14:12.701959293 +0000 UTC m=+0.026299602 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:14:12 compute-0 podman[238894]: 2025-11-22 08:14:12.842824031 +0000 UTC m=+0.167164300 container create e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:14:12 compute-0 systemd[1]: Started libpod-conmon-e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b.scope.
Nov 22 08:14:12 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:14:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/285b94d6409d77da5bd4c59a325022e9369d30a0acb0556c1047f3c47b9967fe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:14:12 compute-0 podman[238894]: 2025-11-22 08:14:12.957844888 +0000 UTC m=+0.282185187 container init e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 08:14:12 compute-0 podman[238894]: 2025-11-22 08:14:12.964068962 +0000 UTC m=+0.288409231 container start e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 08:14:12 compute-0 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[238909]: [NOTICE]   (238913) : New worker (238915) forked
Nov 22 08:14:12 compute-0 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[238909]: [NOTICE]   (238913) : Loading success.
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.718 186548 DEBUG nova.compute.manager [req-434e6cf8-ee5a-43d6-84b9-c80bcc3cbacf req-e4203cc7-b3e4-4a28-b93d-fa2a836f2345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.718 186548 DEBUG oslo_concurrency.lockutils [req-434e6cf8-ee5a-43d6-84b9-c80bcc3cbacf req-e4203cc7-b3e4-4a28-b93d-fa2a836f2345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.718 186548 DEBUG oslo_concurrency.lockutils [req-434e6cf8-ee5a-43d6-84b9-c80bcc3cbacf req-e4203cc7-b3e4-4a28-b93d-fa2a836f2345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.719 186548 DEBUG oslo_concurrency.lockutils [req-434e6cf8-ee5a-43d6-84b9-c80bcc3cbacf req-e4203cc7-b3e4-4a28-b93d-fa2a836f2345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.719 186548 DEBUG nova.compute.manager [req-434e6cf8-ee5a-43d6-84b9-c80bcc3cbacf req-e4203cc7-b3e4-4a28-b93d-fa2a836f2345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] No waiting events found dispatching network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.719 186548 WARNING nova.compute.manager [req-434e6cf8-ee5a-43d6-84b9-c80bcc3cbacf req-e4203cc7-b3e4-4a28-b93d-fa2a836f2345 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received unexpected event network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 for instance with vm_state active and task_state None.
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.790 186548 DEBUG nova.compute.manager [req-a3661697-b296-4d88-9d88-4ad8c5a84b60 req-a71f535e-06cb-466a-9785-6d9300bff8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.790 186548 DEBUG oslo_concurrency.lockutils [req-a3661697-b296-4d88-9d88-4ad8c5a84b60 req-a71f535e-06cb-466a-9785-6d9300bff8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.791 186548 DEBUG oslo_concurrency.lockutils [req-a3661697-b296-4d88-9d88-4ad8c5a84b60 req-a71f535e-06cb-466a-9785-6d9300bff8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.791 186548 DEBUG oslo_concurrency.lockutils [req-a3661697-b296-4d88-9d88-4ad8c5a84b60 req-a71f535e-06cb-466a-9785-6d9300bff8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.791 186548 DEBUG nova.compute.manager [req-a3661697-b296-4d88-9d88-4ad8c5a84b60 req-a71f535e-06cb-466a-9785-6d9300bff8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] No waiting events found dispatching network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:14:13 compute-0 nova_compute[186544]: 2025-11-22 08:14:13.791 186548 WARNING nova.compute.manager [req-a3661697-b296-4d88-9d88-4ad8c5a84b60 req-a71f535e-06cb-466a-9785-6d9300bff8fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received unexpected event network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 for instance with vm_state active and task_state None.
Nov 22 08:14:14 compute-0 nova_compute[186544]: 2025-11-22 08:14:14.423 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:15 compute-0 podman[238925]: 2025-11-22 08:14:15.4140206 +0000 UTC m=+0.055077954 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 08:14:15 compute-0 podman[238926]: 2025-11-22 08:14:15.424198672 +0000 UTC m=+0.061956304 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:14:15 compute-0 podman[238924]: 2025-11-22 08:14:15.431130564 +0000 UTC m=+0.075875099 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:14:15 compute-0 podman[238927]: 2025-11-22 08:14:15.463069264 +0000 UTC m=+0.095471774 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:14:15 compute-0 nova_compute[186544]: 2025-11-22 08:14:15.986 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:19 compute-0 nova_compute[186544]: 2025-11-22 08:14:19.137 186548 DEBUG nova.network.neutron [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updated VIF entry in instance network info cache for port d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:14:19 compute-0 nova_compute[186544]: 2025-11-22 08:14:19.137 186548 DEBUG nova.network.neutron [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updating instance_info_cache with network_info: [{"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:14:19 compute-0 nova_compute[186544]: 2025-11-22 08:14:19.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:19 compute-0 nova_compute[186544]: 2025-11-22 08:14:19.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:14:19 compute-0 nova_compute[186544]: 2025-11-22 08:14:19.184 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:14:19 compute-0 nova_compute[186544]: 2025-11-22 08:14:19.427 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:19 compute-0 nova_compute[186544]: 2025-11-22 08:14:19.511 186548 DEBUG oslo_concurrency.lockutils [req-d0252520-5ac1-44f9-adb1-fb77ddea5e27 req-e41080f3-e865-44de-ae4a-e6f76174e062 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:14:20 compute-0 nova_compute[186544]: 2025-11-22 08:14:20.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:20 compute-0 nova_compute[186544]: 2025-11-22 08:14:20.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:14:20 compute-0 nova_compute[186544]: 2025-11-22 08:14:20.987 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:21 compute-0 nova_compute[186544]: 2025-11-22 08:14:21.180 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:24 compute-0 nova_compute[186544]: 2025-11-22 08:14:24.431 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:25 compute-0 nova_compute[186544]: 2025-11-22 08:14:25.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:25 compute-0 nova_compute[186544]: 2025-11-22 08:14:25.447 186548 DEBUG nova.compute.manager [req-7083162c-ef0b-4887-bc88-71d13617d4de req-ffb77848-01e5-4e5a-a73b-96e40f61e9c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-changed-f7a27b23-e94c-4aef-9b08-6d0ddb100265 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:14:25 compute-0 nova_compute[186544]: 2025-11-22 08:14:25.448 186548 DEBUG nova.compute.manager [req-7083162c-ef0b-4887-bc88-71d13617d4de req-ffb77848-01e5-4e5a-a73b-96e40f61e9c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Refreshing instance network info cache due to event network-changed-f7a27b23-e94c-4aef-9b08-6d0ddb100265. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:14:25 compute-0 nova_compute[186544]: 2025-11-22 08:14:25.449 186548 DEBUG oslo_concurrency.lockutils [req-7083162c-ef0b-4887-bc88-71d13617d4de req-ffb77848-01e5-4e5a-a73b-96e40f61e9c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:14:25 compute-0 nova_compute[186544]: 2025-11-22 08:14:25.449 186548 DEBUG oslo_concurrency.lockutils [req-7083162c-ef0b-4887-bc88-71d13617d4de req-ffb77848-01e5-4e5a-a73b-96e40f61e9c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:14:25 compute-0 nova_compute[186544]: 2025-11-22 08:14:25.449 186548 DEBUG nova.network.neutron [req-7083162c-ef0b-4887-bc88-71d13617d4de req-ffb77848-01e5-4e5a-a73b-96e40f61e9c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Refreshing network info cache for port f7a27b23-e94c-4aef-9b08-6d0ddb100265 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:14:25 compute-0 nova_compute[186544]: 2025-11-22 08:14:25.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:26 compute-0 podman[239023]: 2025-11-22 08:14:26.401023843 +0000 UTC m=+0.049570318 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:14:27 compute-0 ovn_controller[94843]: 2025-11-22T08:14:27Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1a:d3:6f 10.100.0.14
Nov 22 08:14:27 compute-0 ovn_controller[94843]: 2025-11-22T08:14:27Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1a:d3:6f 10.100.0.14
Nov 22 08:14:29 compute-0 nova_compute[186544]: 2025-11-22 08:14:29.217 186548 DEBUG nova.network.neutron [req-7083162c-ef0b-4887-bc88-71d13617d4de req-ffb77848-01e5-4e5a-a73b-96e40f61e9c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updated VIF entry in instance network info cache for port f7a27b23-e94c-4aef-9b08-6d0ddb100265. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:14:29 compute-0 nova_compute[186544]: 2025-11-22 08:14:29.219 186548 DEBUG nova.network.neutron [req-7083162c-ef0b-4887-bc88-71d13617d4de req-ffb77848-01e5-4e5a-a73b-96e40f61e9c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updating instance_info_cache with network_info: [{"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:14:29 compute-0 podman[239043]: 2025-11-22 08:14:29.404140025 +0000 UTC m=+0.052759347 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:14:29 compute-0 nova_compute[186544]: 2025-11-22 08:14:29.436 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:29 compute-0 podman[239044]: 2025-11-22 08:14:29.437493601 +0000 UTC m=+0.082349430 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 08:14:29 compute-0 nova_compute[186544]: 2025-11-22 08:14:29.931 186548 DEBUG oslo_concurrency.lockutils [req-7083162c-ef0b-4887-bc88-71d13617d4de req-ffb77848-01e5-4e5a-a73b-96e40f61e9c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:14:30 compute-0 nova_compute[186544]: 2025-11-22 08:14:30.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:34 compute-0 nova_compute[186544]: 2025-11-22 08:14:34.439 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:35 compute-0 nova_compute[186544]: 2025-11-22 08:14:35.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.600 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'name': 'tempest-TestGettingAddress-server-585014751', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.602 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'name': 'tempest-TestSnapshotPattern-server-2002217618', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000088', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5c9016c6b616412fa2db0983e23a8150', 'user_id': '72df4512d7f245118018df81223ce5ff', 'hostId': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.602 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.606 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ddf94a98-572f-4116-87fb-d46dc5f72174 / tapf7a27b23-e9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.606 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ddf94a98-572f-4116-87fb-d46dc5f72174 / tapd867b5ab-df inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.607 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.607 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.609 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 64848f5c-64c9-41ed-9c0d-c2ef3839d5de / tap56b64f59-c3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.610 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aeb1316a-595e-4087-a506-cef4ce534592', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.602865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48e0ac40-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'e8047548d8c67577c1697538944f3099459ef4837ea850c2b5ef405788209772'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.602865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48e0b974-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'c44ebb826f14bb852ca247c5e0e9a2b4ec4b617e771ba252482868611eca0431'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.602865', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48e11c0c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': 'a1ac6f31462dd72f543e48433b0a27ae7e2a82a328905715626fbcf50938d09b'}]}, 'timestamp': '2025-11-22 08:14:36.610444', '_unique_id': '12ab158e0dc2412b9e8d96fed361c3b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.611 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.613 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.637 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.read.latency volume: 2153098447 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.637 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.read.latency volume: 238495518 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.662 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.read.latency volume: 1165892824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.663 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.read.latency volume: 67610460 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8aca5333-b3e5-4978-8a89-b9459568ab8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2153098447, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.613446', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48e545d4-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': '11b5e903ce656c93add2802d06ba355606b70a6f128f8dc36c6ecd4de5b4f340'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 238495518, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.613446', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48e555ba-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': '44c6333b40090533f76f872b2b6740eb8391562c79a6b3403572254d2f301108'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1165892824, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.613446', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48e92e38-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': 'd652c50f1cf5c405d18a3a3f64d30694d957535791c233babee18cd8f041481d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 67610460, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.613446', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48e94134-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': 'bcb5160066816a0f589660b4aaf45083b7dab19a55eebc8c71812979f12ecd29'}]}, 'timestamp': '2025-11-22 08:14:36.663840', '_unique_id': '4e7f978dd1f042dbb7accedba4c996be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.666 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.666 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.read.bytes volume: 31844864 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.666 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.667 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.read.bytes volume: 31668736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.667 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '196cb967-363c-42eb-8cfd-8c7032e2db46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31844864, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.666596', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48e9bd58-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': 'd3b7f7b34d3b43b39a13a2944f6a1e04f8ca966303c7bb90e09cf6b24f4203f6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.666596', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48e9c992-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': 'c8f342131115bac0b64235f37dbaff0ee2466a8879e6718b47a278b52a624822'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31668736, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.666596', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48e9d82e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': '7a28e0c9660adcb0cefb5d90468aa7a91c5a16a63b80379bc126071150fecc1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.666596', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48e9e210-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': '4014c914aa24d090e9a4b869c1095de69228823a94d9cbc55cca2d1aa41a2061'}]}, 'timestamp': '2025-11-22 08:14:36.667853', '_unique_id': '8e0d22011d0643448c83b83042a2a055'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.669 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.670 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.bytes volume: 2194 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.670 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.outgoing.bytes volume: 8228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04a3647a-51cb-4a6b-95ff-99a1159b1eda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.669705', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48ea351c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '48e420cfc1bacf0a4de9999581f8ee4ca583e691a4e95fe5c48d887b20621ffd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2194, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.669705', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48ea4138-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '0d9ae0b807e7f63c40fe38fdb22e1fb6afe27c5ac256e7c2b46eb9c44aff2b15'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8228, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.669705', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 
'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48ea4bec-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': 'ed714d1dfce6246fdb794cee47a26d942c5e483f16b7c1cb71ac129a8dbbf07e'}]}, 'timestamp': '2025-11-22 08:14:36.670567', '_unique_id': '40108b694f744f709f960e425382c481'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.685 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/memory.usage volume: 43.828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.696 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/memory.usage volume: 46.70703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ad3662c-6210-46ef-b1f8-ea164f3b3cbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.828125, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'timestamp': '2025-11-22T08:14:36.672447', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '48eca04a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.377067654, 'message_signature': '92ab79c7b47dfb1c9d7f7848f3718c51e1499e2ce3341594c50e9494e9870aaa'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.70703125, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 
'timestamp': '2025-11-22T08:14:36.672447', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '48ee59da-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.388531288, 'message_signature': '94c89c9bf13f9498857c424a73e64581ec1583b45f4f269fed9562e9fbfd20ff'}]}, 'timestamp': '2025-11-22 08:14:36.697231', '_unique_id': '098c1f0a668944ceba856e1ad6730512'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.699 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.699 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.699 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.699 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.outgoing.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef6642ad-b247-44b8-b7cb-90537fc7ab20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.699427', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48eebe3e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '3cfbbd7c7bcea4c584735b598dccc3474c11bf1c84213792ae1a482461256407'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.699427', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48eec82a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '00bdf4881ff3ac4c6d635f37717346fde640405d75b431158b1fba5bb15a3a86'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.699427', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 
'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48eed2d4-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': '6d403bbcccb4a49d7e63325787052a609b48e7854d3165be416873508c4c22b3'}]}, 'timestamp': '2025-11-22 08:14:36.700338', '_unique_id': '5793b1b9b50147419616701bb3862654'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.702 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.702 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.702 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-585014751>, <NovaLikeServer: tempest-TestSnapshotPattern-server-2002217618>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-585014751>, <NovaLikeServer: tempest-TestSnapshotPattern-server-2002217618>]
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.702 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.710 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.710 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.718 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.718 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5978e1bc-c078-44e0-863a-ed9f191831a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.702648', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f0710c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.394434964, 'message_signature': 'a3bf845d9b46c2fe555453fac3d076758c41604b167e97b113b63b394b775afe'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.702648', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f07b20-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.394434964, 'message_signature': '890af01a6e5a86569b306d608df1a7326204c19bb0a66014c327f55b8634cf70'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.702648', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f19e38-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.402841882, 'message_signature': 'c6d608a8c911a1e862cf11ed467edba17f1fbb245f1b92b590149a2b950e3ebc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.702648', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f1aa22-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.402841882, 'message_signature': '796f441dafab0b440ebc6d98b19f1bbec27934b8d7bcc186c3c796c2d9864e00'}]}, 'timestamp': '2025-11-22 08:14:36.718856', '_unique_id': 'a3011147070c4b478fbc34719fa7198c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.719 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.720 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.720 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.721 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.721 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.721 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '778bb9c8-134c-40e7-9d9a-5e4b948f3a0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.720920', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f2056c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.394434964, 'message_signature': '24aa2c9551c07e0a6912f24effc84a41f1a58c6a1cc740690ec48b3725787dbd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.720920', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f20f94-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.394434964, 'message_signature': 'f68729357b9c17ba98c27785e5501350330bb7a15e92e25b4cccf23d24e82c43'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.720920', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 
'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f218cc-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.402841882, 'message_signature': '246ef8ffa0a310180c091708085122f5e144e3d0aea823354619c01b3bbb895c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.720920', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f229b6-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.402841882, 'message_signature': '5452afc69cc8e080c08e70828a97de313671459084c07c5986dd9c6e62284e65'}]}, 'timestamp': '2025-11-22 08:14:36.722113', '_unique_id': 'de75c4192f834e459e7562a8042c955f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.723 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.723 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.723 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.724 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ddd6e63-530d-4819-8c72-6509cf77dad1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.723682', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48f27114-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'c9900706061b8695ff2eb254d5af6eea410199f32cdd9e78bdf996aa182c969b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.723682', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48f27cae-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'ca600ac6e08b70df80024468d2fc02b21bb9c1394555128bef2685fe3366d524'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.723682', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 
'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48f287e4-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': '01f22afdde21a3b1a1aad4366747fa029ae42eed9d5255f5aa6917e7b3c64f6f'}]}, 'timestamp': '2025-11-22 08:14:36.724533', '_unique_id': '428869c3539f473ea776e22484426c4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.725 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.726 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.726 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.726 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.incoming.packets volume: 59 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07033931-d9f7-41a5-b392-db108cc0432d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.726082', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48f2cede-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'e1af3e7c8c7bb93687bc2fa1bc1c3581ee40f4c91a7bc80cb7bb6de10fcafb3f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.726082', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48f2d942-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '7359dedafebd0097a3ae25ca1cec39323b8340bb2c9eacf7960f0f5ea2935a37'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 59, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.726082', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 
'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48f2e2c0-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': 'af02f02479246d558dd6e17da40bb16209d365673a8010142076ffb5f698f8ac'}]}, 'timestamp': '2025-11-22 08:14:36.726856', '_unique_id': '83f91d4553f841a9ad5acabf7b643858'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.727 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.728 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.728 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.728 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.728 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.write.bytes volume: 73175040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64af2999-10af-4d84-ae50-794281e11da4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.728433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f32a6e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': '99d6279820b40077c6f64367a42e41aebf64ec2160c318dc186c89c7a4faa6c5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.728433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f3339c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': '4c07aaedf67b37dc0f1e2b04054907476802400c3d6076ced76f29f587cc74fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73175040, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.728433', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f33c7a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': '32bbbe76540c19ea5a9dd963c215bf6191085d23a90889f6f3c7eb3dcd847686'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.728433', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f3472e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': '307c0ead4f582d8d1e1951a3fb6d6cac55a3b96820eaa8316184d8e4fcc3e4b9'}]}, 'timestamp': '2025-11-22 08:14:36.729424', '_unique_id': '81296009cb8c48c2ae665203b2524b4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.729 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.730 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.731 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.731 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.bytes volume: 772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.731 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.incoming.bytes volume: 10463 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4077e328-9edf-48d8-b36e-16238c149583', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.731044', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48f39152-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'c7371edf16f08a85c747fc3f64b3d378e1c8876e64ba58f6a5606890bf4423ba'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 772, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.731044', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48f39daa-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'cde1ea4d26dc7aced72631cc7f31cc89842ff89130cc4250caddbfad8a6c9f2f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10463, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.731044', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 
'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48f3a728-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': 'b4a775ae928f47f2088b41726b9169a429bd1ac22fb329540773a510b2eb7f9d'}]}, 'timestamp': '2025-11-22 08:14:36.731885', '_unique_id': '5462292e379b435585210ad0d161fca8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.733 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.733 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.733 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.allocation volume: 31006720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a70e1c7-d7c9-4036-9742-80303031aac6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.733439', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f3ee40-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.394434964, 'message_signature': '466e161def7c21a8cb99a91421df4bb677663eef14b3460ba9ade0ee740a2852'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.733439', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f3f7a0-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.394434964, 'message_signature': '8c394316f995bb45796377e35c16cc103aa7f16c2b85ea7f95813f33ae100cf6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31006720, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.733439', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f400c4-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.402841882, 'message_signature': '7efb661972ef0132d5f06514187c5ad93c88da71a20c6b5ae550923c789a316b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.733439', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f40a6a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.402841882, 'message_signature': 'f5b9ba3bccb756f5e0b3d1a77575e208ff2898e70bf87c4c342af623c4779e9d'}]}, 'timestamp': '2025-11-22 08:14:36.734464', '_unique_id': '04d4e5c94b5d4722bc1dd96e65c22508'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.734 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.735 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.736 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.736 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-585014751>, <NovaLikeServer: tempest-TestSnapshotPattern-server-2002217618>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-585014751>, <NovaLikeServer: tempest-TestSnapshotPattern-server-2002217618>]
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.736 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.736 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-585014751>, <NovaLikeServer: tempest-TestSnapshotPattern-server-2002217618>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-585014751>, <NovaLikeServer: tempest-TestSnapshotPattern-server-2002217618>]
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.736 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.737 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.737 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '387029d2-48b7-491b-ba7d-1ef33589f682', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.736874', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48f47414-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'b8353eca012abcc0598806dbdce77a7fba9e883181ad4057a2882f0bbeded198'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.736874', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48f47e46-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '6a4fdcd0fbc786476c4f4204b42d155a1868e0a1f9c6ef0ee270280746d46f9e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.736874', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48f48882-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': '4e107a1b82573baa29090ac242884348b79cc41ee8c20a6e2bee7622e85fa095'}]}, 'timestamp': '2025-11-22 08:14:36.737655', '_unique_id': 'a440e924eadd4277912079af3ab0c83d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.738 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.739 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.739 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/cpu volume: 13780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.739 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/cpu volume: 13490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e32d694-61a3-40b2-a8fd-dcc2c028bde9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13780000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'timestamp': '2025-11-22T08:14:36.739522', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '48f4dbc0-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.377067654, 'message_signature': '4c4b3186ef7e2337796a711be48476a8828256beedf39f354db30851ae994d8b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13490000000, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 
'64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'timestamp': '2025-11-22T08:14:36.739522', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '48f4e520-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.388531288, 'message_signature': '1ce34dabcb559e6eecfb85a9aa926665e34ef9f8ac943ccba744cd5a6019951c'}]}, 'timestamp': '2025-11-22 08:14:36.740019', '_unique_id': 'a345c3ec723f460e813cf7d7f8dcfff1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.741 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.741 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.741 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.741 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c1d2383-6555-44bc-ad4f-5c27032607de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.741442', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48f526c0-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '98a3f1405f4478349e7546a23d13f2bdcd99e2ff6b140696adcd253fc5a51eeb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.741442', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48f5305c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '4aec761b8e883eb6b896e5825a6fa17d3af1b90ac1684cc3cca60962bb4c9794'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.741442', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 
'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48f5399e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': '7c4a626cbc4936b8077ab22f37b39cd1f902b8841c4ef6a8770b5fd1d95c402d'}]}, 'timestamp': '2025-11-22 08:14:36.742186', '_unique_id': '7361ef9422bf46a3b650eda46466d346'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.742 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.743 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.743 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.write.requests volume: 274 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.744 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.744 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.write.requests volume: 347 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.744 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c4db7a6-2389-46f1-ab25-1131a928d93f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 274, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.743793', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f5828c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': '4796ce0ba7eca24960f1c466ae56bfc598583021b695fe1837e52f0133a84427'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.743793', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f58bce-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': '0fd4a378c08337b2c8f6e6825bdfcc457ff8b70a9caed9b021bf8f54b5c66bb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 347, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.743793', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f59588-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': '40239b689821305085cdc05098b056973bd1fe7dfebc6e292a654bb5ace69a12'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.743793', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f59e52-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': '14d0192ecb05b0695ac3337d250196921b7f08d34988c4692d2e882dd8913e17'}]}, 'timestamp': '2025-11-22 08:14:36.744756', '_unique_id': '9387146f19c94a74a39a14ad1e09951e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.746 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.746 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.read.requests volume: 1166 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.746 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.746 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.read.requests volume: 1163 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8f858a6-7bd9-4b59-bd69-d7532e1c2a7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1166, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.746319', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f5e54c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': 'dc9dfc110afd2964d48d85029bfc54630e6ccf68ba2552f4d48a8f923360e3f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.746319', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f5ee98-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': 'c2c86e65fe28251f594e4561331196e1cc477d8de00b3c1fb6734df6c0e62633'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1163, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.746319', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f5f780-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': '3d47a2457081e8ec7fd7cfb233c8ae733b1f8c13be6908d979ea468197584642'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.746319', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f60040-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': 'f514d070cea73a020114a29475e3cc9a59aa7c426ab9c277b5758ed188d7e1d9'}]}, 'timestamp': '2025-11-22 08:14:36.747308', '_unique_id': '5ab16386a7734210aaa838a8b374007b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.748 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.748 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.write.latency volume: 26431458918 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.749 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.749 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.write.latency volume: 101329685503 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.749 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e44b47f-1331-443d-a213-900c34aa4f03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26431458918, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-vda', 'timestamp': '2025-11-22T08:14:36.748801', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f6462c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': 'ae947bd75af083eadf77b88e965a742d4941edefb7535fd165c66bb96e43d3f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174-sda', 'timestamp': '2025-11-22T08:14:36.748801', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'instance-0000008b', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f64f50-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.305287117, 'message_signature': '1ab531c7849503e8b98ea309b71db998f7a3f4648d4b33fe808940fd209bad13'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101329685503, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-vda', 'timestamp': '2025-11-22T08:14:36.748801', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48f6590a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': '85055a48fb5ac0bc224a7bc24bfbbd1504a01ad819c50d075b0e047f2320c152'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de-sda', 'timestamp': '2025-11-22T08:14:36.748801', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'instance-00000088', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48f661fc-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.329849096, 'message_signature': 'b3ec94539b6215cc4274f32f0dcd46091c15481d8748a06a7f041ad9557b48e3'}]}, 'timestamp': '2025-11-22 08:14:36.749765', '_unique_id': '1b2ab699ba92423396a40444a0572f9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.750 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.751 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.751 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.751 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f16024af-dfef-44ab-aab3-d0e205e34619', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.751229', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48f6a64e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '7a3fc67d46c2a267310d41df93162c8aba05b45385b5d8d2ea76e2df1613f3fd'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.751229', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48f6afd6-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '0af79dcf920e298e62891e6694be398f8595f8c81ae38d6738606f009c12452b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.751229', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 
'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48f6b904-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': 'db909d4a8503f342549935bcbc44f606ac15684b1575e45cd709193ea957e73b'}]}, 'timestamp': '2025-11-22 08:14:36.752001', '_unique_id': '1fa74968b3a3476a840b5e63668acc0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.753 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.753 12 DEBUG ceilometer.compute.pollsters [-] ddf94a98-572f-4116-87fb-d46dc5f72174/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.753 12 DEBUG ceilometer.compute.pollsters [-] 64848f5c-64c9-41ed-9c0d-c2ef3839d5de/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '134a0532-ba8d-46a7-b9e7-97e7b3d22f1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapf7a27b23-e9', 'timestamp': '2025-11-22T08:14:36.753473', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapf7a27b23-e9', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1a:d3:6f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a27b23-e9'}, 'message_id': '48f6fcac-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': '58a3cf25ba54b3803ff274dd5b8b518c149b5da38b215e3d6a3b33d72cda692f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000008b-ddf94a98-572f-4116-87fb-d46dc5f72174-tapd867b5ab-df', 'timestamp': '2025-11-22T08:14:36.753473', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-585014751', 'name': 'tapd867b5ab-df', 'instance_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:04:7f:07', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd867b5ab-df'}, 'message_id': '48f7062a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.294645464, 'message_signature': 'ed766841dc7fc28afe82fbb6f7193f4c92fba8453d8063ce38642342a481636d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '72df4512d7f245118018df81223ce5ff', 'user_name': None, 'project_id': '5c9016c6b616412fa2db0983e23a8150', 'project_name': None, 'resource_id': 'instance-00000088-64848f5c-64c9-41ed-9c0d-c2ef3839d5de-tap56b64f59-c3', 'timestamp': '2025-11-22T08:14:36.753473', 'resource_metadata': {'display_name': 'tempest-TestSnapshotPattern-server-2002217618', 'name': 'tap56b64f59-c3', 'instance_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'instance_type': 'm1.nano', 'host': 'e48e638f02bb08d71c5a9c32c3d0531ffa50e5e68534f2d3ee242f98', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:d2:ad', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56b64f59-c3'}, 'message_id': '48f70f4e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6017.299607296, 'message_signature': 'c9bbe9d8d86cea5ece3cc41d06fcaef021c190e3ac741f022b1eacbba1ad08cb'}]}, 'timestamp': '2025-11-22 08:14:36.754209', '_unique_id': 'f22ba6c4ceba45d0bd24a4398577b4c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.754 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.755 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:14:36.755 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-585014751>, <NovaLikeServer: tempest-TestSnapshotPattern-server-2002217618>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-585014751>, <NovaLikeServer: tempest-TestSnapshotPattern-server-2002217618>]
Nov 22 08:14:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:37.342 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:37.342 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:37.343 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:39 compute-0 nova_compute[186544]: 2025-11-22 08:14:39.444 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:40 compute-0 nova_compute[186544]: 2025-11-22 08:14:40.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.166 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.185 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid 64848f5c-64c9-41ed-9c0d-c2ef3839d5de _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.186 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid ddf94a98-572f-4116-87fb-d46dc5f72174 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.224 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.226 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:14:44 compute-0 nova_compute[186544]: 2025-11-22 08:14:44.447 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:45 compute-0 nova_compute[186544]: 2025-11-22 08:14:45.998 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:46 compute-0 podman[239087]: 2025-11-22 08:14:46.411786107 +0000 UTC m=+0.053457974 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 08:14:46 compute-0 podman[239086]: 2025-11-22 08:14:46.438647362 +0000 UTC m=+0.083135428 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:14:46 compute-0 podman[239088]: 2025-11-22 08:14:46.438855967 +0000 UTC m=+0.067650025 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:14:46 compute-0 podman[239089]: 2025-11-22 08:14:46.456092855 +0000 UTC m=+0.090672606 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:14:49 compute-0 nova_compute[186544]: 2025-11-22 08:14:49.450 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:50 compute-0 nova_compute[186544]: 2025-11-22 08:14:50.802 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:50.803 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:14:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:50.805 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:14:51 compute-0 nova_compute[186544]: 2025-11-22 08:14:51.000 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:14:51.807 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:14:54 compute-0 nova_compute[186544]: 2025-11-22 08:14:54.453 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:56 compute-0 nova_compute[186544]: 2025-11-22 08:14:56.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:14:57 compute-0 podman[239174]: 2025-11-22 08:14:57.424203317 +0000 UTC m=+0.060609121 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:14:59 compute-0 nova_compute[186544]: 2025-11-22 08:14:59.178 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:14:59 compute-0 nova_compute[186544]: 2025-11-22 08:14:59.455 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:00 compute-0 nova_compute[186544]: 2025-11-22 08:15:00.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:15:00 compute-0 nova_compute[186544]: 2025-11-22 08:15:00.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:15:00 compute-0 nova_compute[186544]: 2025-11-22 08:15:00.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:15:00 compute-0 podman[239195]: 2025-11-22 08:15:00.39342468 +0000 UTC m=+0.042729350 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:15:00 compute-0 podman[239196]: 2025-11-22 08:15:00.426204191 +0000 UTC m=+0.069395189 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git)
Nov 22 08:15:00 compute-0 nova_compute[186544]: 2025-11-22 08:15:00.570 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:15:00 compute-0 nova_compute[186544]: 2025-11-22 08:15:00.570 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:15:00 compute-0 nova_compute[186544]: 2025-11-22 08:15:00.570 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:15:00 compute-0 nova_compute[186544]: 2025-11-22 08:15:00.571 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 64848f5c-64c9-41ed-9c0d-c2ef3839d5de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:15:01 compute-0 nova_compute[186544]: 2025-11-22 08:15:01.004 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:04 compute-0 nova_compute[186544]: 2025-11-22 08:15:04.457 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:06 compute-0 nova_compute[186544]: 2025-11-22 08:15:06.013 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.767 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updating instance_info_cache with network_info: [{"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.784 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.785 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.785 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.785 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.786 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.786 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.786 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.786 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.786 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.815 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.816 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.816 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.816 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.934 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.997 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:15:08 compute-0 nova_compute[186544]: 2025-11-22 08:15:08.998 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.058 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.064 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.122 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.123 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.176 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.358 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.359 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5335MB free_disk=73.07846450805664GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.359 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.360 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.657 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 64848f5c-64c9-41ed-9c0d-c2ef3839d5de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.657 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance ddf94a98-572f-4116-87fb-d46dc5f72174 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.657 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.657 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.800 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.836 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.874 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:15:09 compute-0 nova_compute[186544]: 2025-11-22 08:15:09.874 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:11 compute-0 nova_compute[186544]: 2025-11-22 08:15:11.014 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:11 compute-0 nova_compute[186544]: 2025-11-22 08:15:11.252 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:15:14 compute-0 nova_compute[186544]: 2025-11-22 08:15:14.466 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:16 compute-0 nova_compute[186544]: 2025-11-22 08:15:16.016 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:17 compute-0 podman[239254]: 2025-11-22 08:15:17.421141049 +0000 UTC m=+0.070437995 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:15:17 compute-0 podman[239255]: 2025-11-22 08:15:17.439574406 +0000 UTC m=+0.086221896 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:15:17 compute-0 podman[239256]: 2025-11-22 08:15:17.441643077 +0000 UTC m=+0.083051057 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:15:17 compute-0 podman[239257]: 2025-11-22 08:15:17.448007805 +0000 UTC m=+0.086085123 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 08:15:19 compute-0 nova_compute[186544]: 2025-11-22 08:15:19.470 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:21 compute-0 nova_compute[186544]: 2025-11-22 08:15:21.018 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:24 compute-0 nova_compute[186544]: 2025-11-22 08:15:24.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.638 186548 DEBUG nova.compute.manager [req-2377b514-879c-40d4-9406-e7deaaca0fdd req-f56ce99a-e9e3-4539-b8a1-d77965e6042a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received event network-changed-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.639 186548 DEBUG nova.compute.manager [req-2377b514-879c-40d4-9406-e7deaaca0fdd req-f56ce99a-e9e3-4539-b8a1-d77965e6042a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Refreshing instance network info cache due to event network-changed-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.639 186548 DEBUG oslo_concurrency.lockutils [req-2377b514-879c-40d4-9406-e7deaaca0fdd req-f56ce99a-e9e3-4539-b8a1-d77965e6042a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.640 186548 DEBUG oslo_concurrency.lockutils [req-2377b514-879c-40d4-9406-e7deaaca0fdd req-f56ce99a-e9e3-4539-b8a1-d77965e6042a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.640 186548 DEBUG nova.network.neutron [req-2377b514-879c-40d4-9406-e7deaaca0fdd req-f56ce99a-e9e3-4539-b8a1-d77965e6042a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Refreshing network info cache for port 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.859 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.859 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.859 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.860 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.860 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.867 186548 INFO nova.compute.manager [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Terminating instance
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.872 186548 DEBUG nova.compute.manager [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:15:25 compute-0 kernel: tap56b64f59-c3 (unregistering): left promiscuous mode
Nov 22 08:15:25 compute-0 NetworkManager[55036]: <info>  [1763799325.8964] device (tap56b64f59-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:15:25 compute-0 ovn_controller[94843]: 2025-11-22T08:15:25Z|00641|binding|INFO|Releasing lport 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 from this chassis (sb_readonly=0)
Nov 22 08:15:25 compute-0 ovn_controller[94843]: 2025-11-22T08:15:25Z|00642|binding|INFO|Setting lport 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 down in Southbound
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.905 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:25 compute-0 ovn_controller[94843]: 2025-11-22T08:15:25Z|00643|binding|INFO|Removing iface tap56b64f59-c3 ovn-installed in OVS
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.909 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:25 compute-0 nova_compute[186544]: 2025-11-22 08:15:25.919 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:25.935 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:d2:ad 10.100.0.4'], port_security=['fa:16:3e:d3:d2:ad 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '64848f5c-64c9-41ed-9c0d-c2ef3839d5de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c9016c6b616412fa2db0983e23a8150', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7420c781-e9c7-4653-97a5-92e76e44aa71', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e7964e9-a04c-4b66-8053-f482dcbb2cee, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:15:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:25.994 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 in datapath 5cbf5083-8d50-44bd-b6ba-93e507a8654e unbound from our chassis
Nov 22 08:15:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:25.996 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5cbf5083-8d50-44bd-b6ba-93e507a8654e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:15:25 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000088.scope: Deactivated successfully.
Nov 22 08:15:25 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000088.scope: Consumed 19.278s CPU time.
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:25.998 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[70a8b229-c1f4-4fc7-b2d8-acca78ee9e11]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.002 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e namespace which is not needed anymore
Nov 22 08:15:26 compute-0 systemd-machined[152872]: Machine qemu-77-instance-00000088 terminated.
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.020 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.102 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.143 186548 INFO nova.virt.libvirt.driver [-] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Instance destroyed successfully.
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.145 186548 DEBUG nova.objects.instance [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lazy-loading 'resources' on Instance uuid 64848f5c-64c9-41ed-9c0d-c2ef3839d5de obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:15:26 compute-0 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[238414]: [NOTICE]   (238418) : haproxy version is 2.8.14-c23fe91
Nov 22 08:15:26 compute-0 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[238414]: [NOTICE]   (238418) : path to executable is /usr/sbin/haproxy
Nov 22 08:15:26 compute-0 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[238414]: [WARNING]  (238418) : Exiting Master process...
Nov 22 08:15:26 compute-0 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[238414]: [WARNING]  (238418) : Exiting Master process...
Nov 22 08:15:26 compute-0 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[238414]: [ALERT]    (238418) : Current worker (238420) exited with code 143 (Terminated)
Nov 22 08:15:26 compute-0 neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e[238414]: [WARNING]  (238418) : All workers exited. Exiting... (0)
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.159 186548 DEBUG nova.virt.libvirt.vif [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:13:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-2002217618',display_name='tempest-TestSnapshotPattern-server-2002217618',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-2002217618',id=136,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO3zqwL5oCAVcYUK4UfRxRlwiLCpXhyrVibiQXfDMPSmEzdCg2weZeJjjoUlK1vs2o/ZsP7kK+r7TBW2xEMw9M43RfSbbpgfpmDe3/3E/PZ1RgVY0zy+sKDgo7g8yf0esA==',key_name='tempest-TestSnapshotPattern-653067273',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:13:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c9016c6b616412fa2db0983e23a8150',ramdisk_id='',reservation_id='r-kg583xko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-1254822391',owner_user_name='tempest-TestSnapshotPattern-1254822391-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:14:06Z,user_data=None,user_id='72df4512d7f245118018df81223ce5ff',uuid=64848f5c-64c9-41ed-9c0d-c2ef3839d5de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.159 186548 DEBUG nova.network.os_vif_util [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converting VIF {"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.160 186548 DEBUG nova.network.os_vif_util [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:d2:ad,bridge_name='br-int',has_traffic_filtering=True,id=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56b64f59-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.161 186548 DEBUG os_vif [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:d2:ad,bridge_name='br-int',has_traffic_filtering=True,id=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56b64f59-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.162 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.162 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56b64f59-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:15:26 compute-0 systemd[1]: libpod-33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a.scope: Deactivated successfully.
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.166 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:15:26 compute-0 podman[239364]: 2025-11-22 08:15:26.168603069 +0000 UTC m=+0.064681161 container died 33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.168 186548 INFO os_vif [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:d2:ad,bridge_name='br-int',has_traffic_filtering=True,id=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7,network=Network(5cbf5083-8d50-44bd-b6ba-93e507a8654e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56b64f59-c3')
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.169 186548 INFO nova.virt.libvirt.driver [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Deleting instance files /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de_del
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.170 186548 INFO nova.virt.libvirt.driver [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Deletion of /var/lib/nova/instances/64848f5c-64c9-41ed-9c0d-c2ef3839d5de_del complete
Nov 22 08:15:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a-userdata-shm.mount: Deactivated successfully.
Nov 22 08:15:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-8129f6348dfb45d4b21587bf3427b4e472929aa3252894f18839520057cb53d2-merged.mount: Deactivated successfully.
Nov 22 08:15:26 compute-0 podman[239364]: 2025-11-22 08:15:26.226096943 +0000 UTC m=+0.122175035 container cleanup 33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:15:26 compute-0 systemd[1]: libpod-conmon-33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a.scope: Deactivated successfully.
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.239 186548 INFO nova.compute.manager [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.240 186548 DEBUG oslo.service.loopingcall [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.241 186548 DEBUG nova.compute.manager [-] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.241 186548 DEBUG nova.network.neutron [-] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:15:26 compute-0 podman[239407]: 2025-11-22 08:15:26.2926233 +0000 UTC m=+0.046502713 container remove 33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.297 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[adf5f2bc-4187-439d-b926-113c20c844ac]: (4, ('Sat Nov 22 08:15:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e (33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a)\n33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a\nSat Nov 22 08:15:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e (33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a)\n33ab42803c85de2a341cd9ab489cee45a4ad5e2a8cb4d2a1819f30fca662e93a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.299 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1eeb64ee-2008-4cc0-9df9-f725241afa9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.300 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5cbf5083-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:26 compute-0 kernel: tap5cbf5083-80: left promiscuous mode
Nov 22 08:15:26 compute-0 nova_compute[186544]: 2025-11-22 08:15:26.320 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.324 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c0fbf50f-47f8-4a74-9703-4b864a039cbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.344 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ac308931-115d-4ca6-b29a-f658403b3ac8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.346 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ea688ed4-f1b8-4762-9153-902013456809]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.361 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0bce962d-feb4-4292-b422-1e17710755e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594162, 'reachable_time': 31310, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239422, 'error': None, 'target': 'ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.364 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5cbf5083-8d50-44bd-b6ba-93e507a8654e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:15:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:26.364 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2c85f5-0ad9-42dd-a342-5ea92c53ad84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d5cbf5083\x2d8d50\x2d44bd\x2db6ba\x2d93e507a8654e.mount: Deactivated successfully.
Nov 22 08:15:28 compute-0 podman[239423]: 2025-11-22 08:15:28.407451192 +0000 UTC m=+0.054366577 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.276 186548 DEBUG nova.compute.manager [req-3798e0ce-c8d0-40eb-b44f-402bbc52d1a6 req-a7c66477-744c-42ea-bdec-34a460b1ec40 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received event network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.277 186548 DEBUG oslo_concurrency.lockutils [req-3798e0ce-c8d0-40eb-b44f-402bbc52d1a6 req-a7c66477-744c-42ea-bdec-34a460b1ec40 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.278 186548 DEBUG oslo_concurrency.lockutils [req-3798e0ce-c8d0-40eb-b44f-402bbc52d1a6 req-a7c66477-744c-42ea-bdec-34a460b1ec40 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.278 186548 DEBUG oslo_concurrency.lockutils [req-3798e0ce-c8d0-40eb-b44f-402bbc52d1a6 req-a7c66477-744c-42ea-bdec-34a460b1ec40 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.278 186548 DEBUG nova.compute.manager [req-3798e0ce-c8d0-40eb-b44f-402bbc52d1a6 req-a7c66477-744c-42ea-bdec-34a460b1ec40 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] No waiting events found dispatching network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.279 186548 WARNING nova.compute.manager [req-3798e0ce-c8d0-40eb-b44f-402bbc52d1a6 req-a7c66477-744c-42ea-bdec-34a460b1ec40 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received unexpected event network-vif-plugged-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 for instance with vm_state active and task_state deleting.
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.609 186548 DEBUG nova.network.neutron [-] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.667 186548 DEBUG nova.compute.manager [req-90d68e05-69b9-45fc-b147-d8fe086cdf87 req-88872a26-95f0-4322-a28f-0e570d49badc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Received event network-vif-deleted-56b64f59-c3ee-40e4-8a5a-42d53b2d04b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.667 186548 INFO nova.compute.manager [req-90d68e05-69b9-45fc-b147-d8fe086cdf87 req-88872a26-95f0-4322-a28f-0e570d49badc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Neutron deleted interface 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7; detaching it from the instance and deleting it from the info cache
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.667 186548 DEBUG nova.network.neutron [req-90d68e05-69b9-45fc-b147-d8fe086cdf87 req-88872a26-95f0-4322-a28f-0e570d49badc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.670 186548 INFO nova.compute.manager [-] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Took 3.43 seconds to deallocate network for instance.
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.683 186548 DEBUG nova.compute.manager [req-90d68e05-69b9-45fc-b147-d8fe086cdf87 req-88872a26-95f0-4322-a28f-0e570d49badc 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Detach interface failed, port_id=56b64f59-c3ee-40e4-8a5a-42d53b2d04b7, reason: Instance 64848f5c-64c9-41ed-9c0d-c2ef3839d5de could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.803 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.803 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.885 186548 DEBUG nova.compute.provider_tree [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.912 186548 DEBUG nova.scheduler.client.report [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.934 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:29 compute-0 nova_compute[186544]: 2025-11-22 08:15:29.966 186548 INFO nova.scheduler.client.report [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Deleted allocations for instance 64848f5c-64c9-41ed-9c0d-c2ef3839d5de
Nov 22 08:15:30 compute-0 nova_compute[186544]: 2025-11-22 08:15:30.032 186548 DEBUG oslo_concurrency.lockutils [None req-07bfd88e-2722-4473-b6f7-702751eaa1e2 72df4512d7f245118018df81223ce5ff 5c9016c6b616412fa2db0983e23a8150 - - default default] Lock "64848f5c-64c9-41ed-9c0d-c2ef3839d5de" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:30 compute-0 nova_compute[186544]: 2025-11-22 08:15:30.291 186548 DEBUG nova.network.neutron [req-2377b514-879c-40d4-9406-e7deaaca0fdd req-f56ce99a-e9e3-4539-b8a1-d77965e6042a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updated VIF entry in instance network info cache for port 56b64f59-c3ee-40e4-8a5a-42d53b2d04b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:15:30 compute-0 nova_compute[186544]: 2025-11-22 08:15:30.291 186548 DEBUG nova.network.neutron [req-2377b514-879c-40d4-9406-e7deaaca0fdd req-f56ce99a-e9e3-4539-b8a1-d77965e6042a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Updating instance_info_cache with network_info: [{"id": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "address": "fa:16:3e:d3:d2:ad", "network": {"id": "5cbf5083-8d50-44bd-b6ba-93e507a8654e", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-722250133-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9016c6b616412fa2db0983e23a8150", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56b64f59-c3", "ovs_interfaceid": "56b64f59-c3ee-40e4-8a5a-42d53b2d04b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:15:30 compute-0 nova_compute[186544]: 2025-11-22 08:15:30.320 186548 DEBUG oslo_concurrency.lockutils [req-2377b514-879c-40d4-9406-e7deaaca0fdd req-f56ce99a-e9e3-4539-b8a1-d77965e6042a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-64848f5c-64c9-41ed-9c0d-c2ef3839d5de" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:15:31 compute-0 nova_compute[186544]: 2025-11-22 08:15:31.022 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:31 compute-0 nova_compute[186544]: 2025-11-22 08:15:31.165 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:31 compute-0 podman[239444]: 2025-11-22 08:15:31.405003226 +0000 UTC m=+0.051595949 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:15:31 compute-0 podman[239445]: 2025-11-22 08:15:31.41567705 +0000 UTC m=+0.060619892 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 08:15:36 compute-0 nova_compute[186544]: 2025-11-22 08:15:36.025 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:36 compute-0 nova_compute[186544]: 2025-11-22 08:15:36.166 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:36 compute-0 ovn_controller[94843]: 2025-11-22T08:15:36Z|00644|binding|INFO|Releasing lport 32debc95-1c60-4d8b-9d74-79ae74c8f38f from this chassis (sb_readonly=0)
Nov 22 08:15:36 compute-0 ovn_controller[94843]: 2025-11-22T08:15:36Z|00645|binding|INFO|Releasing lport 433cf940-3b59-425c-aeb8-689a57de46c2 from this chassis (sb_readonly=0)
Nov 22 08:15:36 compute-0 nova_compute[186544]: 2025-11-22 08:15:36.794 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:37.342 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:37.342 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:37.343 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.027 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.141 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799326.139821, 64848f5c-64c9-41ed-9c0d-c2ef3839d5de => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.141 186548 INFO nova.compute.manager [-] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] VM Stopped (Lifecycle Event)
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.160 186548 DEBUG nova.compute.manager [None req-debdf086-53fc-49c0-8c96-b7e1855ceace - - - - - -] [instance: 64848f5c-64c9-41ed-9c0d-c2ef3839d5de] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.168 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.897 186548 DEBUG nova.compute.manager [req-66c9422d-23bb-4f69-b19d-3fcd0bc6d972 req-1044096a-7ff4-4feb-83b4-c55ca1b98379 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-changed-f7a27b23-e94c-4aef-9b08-6d0ddb100265 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.897 186548 DEBUG nova.compute.manager [req-66c9422d-23bb-4f69-b19d-3fcd0bc6d972 req-1044096a-7ff4-4feb-83b4-c55ca1b98379 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Refreshing instance network info cache due to event network-changed-f7a27b23-e94c-4aef-9b08-6d0ddb100265. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.898 186548 DEBUG oslo_concurrency.lockutils [req-66c9422d-23bb-4f69-b19d-3fcd0bc6d972 req-1044096a-7ff4-4feb-83b4-c55ca1b98379 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.899 186548 DEBUG oslo_concurrency.lockutils [req-66c9422d-23bb-4f69-b19d-3fcd0bc6d972 req-1044096a-7ff4-4feb-83b4-c55ca1b98379 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:15:41 compute-0 nova_compute[186544]: 2025-11-22 08:15:41.899 186548 DEBUG nova.network.neutron [req-66c9422d-23bb-4f69-b19d-3fcd0bc6d972 req-1044096a-7ff4-4feb-83b4-c55ca1b98379 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Refreshing network info cache for port f7a27b23-e94c-4aef-9b08-6d0ddb100265 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.106 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.106 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.107 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.107 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.107 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.118 186548 INFO nova.compute.manager [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Terminating instance
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.130 186548 DEBUG nova.compute.manager [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:15:42 compute-0 kernel: tapf7a27b23-e9 (unregistering): left promiscuous mode
Nov 22 08:15:42 compute-0 NetworkManager[55036]: <info>  [1763799342.1631] device (tapf7a27b23-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.180 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 ovn_controller[94843]: 2025-11-22T08:15:42Z|00646|binding|INFO|Releasing lport f7a27b23-e94c-4aef-9b08-6d0ddb100265 from this chassis (sb_readonly=0)
Nov 22 08:15:42 compute-0 ovn_controller[94843]: 2025-11-22T08:15:42Z|00647|binding|INFO|Setting lport f7a27b23-e94c-4aef-9b08-6d0ddb100265 down in Southbound
Nov 22 08:15:42 compute-0 ovn_controller[94843]: 2025-11-22T08:15:42Z|00648|binding|INFO|Removing iface tapf7a27b23-e9 ovn-installed in OVS
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.184 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 kernel: tapd867b5ab-df (unregistering): left promiscuous mode
Nov 22 08:15:42 compute-0 NetworkManager[55036]: <info>  [1763799342.1989] device (tapd867b5ab-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 ovn_controller[94843]: 2025-11-22T08:15:42Z|00649|binding|INFO|Releasing lport d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 from this chassis (sb_readonly=1)
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.215 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 ovn_controller[94843]: 2025-11-22T08:15:42Z|00650|binding|INFO|Removing iface tapd867b5ab-df ovn-installed in OVS
Nov 22 08:15:42 compute-0 ovn_controller[94843]: 2025-11-22T08:15:42Z|00651|if_status|INFO|Not setting lport d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 down as sb is readonly
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.216 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.240 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Nov 22 08:15:42 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000008b.scope: Consumed 18.682s CPU time.
Nov 22 08:15:42 compute-0 systemd-machined[152872]: Machine qemu-78-instance-0000008b terminated.
Nov 22 08:15:42 compute-0 ovn_controller[94843]: 2025-11-22T08:15:42Z|00652|binding|INFO|Setting lport d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 down in Southbound
Nov 22 08:15:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:42.276 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:d3:6f 10.100.0.14'], port_security=['fa:16:3e:1a:d3:6f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e573664-04ba-4ce5-994a-9fb9483a2400', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bc613044-b796-41b5-a7b0-c508f998d641', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da796c20-96a3-420c-a9ae-3320426db7c7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=f7a27b23-e94c-4aef-9b08-6d0ddb100265) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:15:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:42.277 103805 INFO neutron.agent.ovn.metadata.agent [-] Port f7a27b23-e94c-4aef-9b08-6d0ddb100265 in datapath 7e573664-04ba-4ce5-994a-9fb9483a2400 unbound from our chassis
Nov 22 08:15:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:42.278 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e573664-04ba-4ce5-994a-9fb9483a2400, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:15:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:42.280 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ca35eca5-9a6a-4100-843b-addf3e23869d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:42.280 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 namespace which is not needed anymore
Nov 22 08:15:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:42.297 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:7f:07 2001:db8:0:1:f816:3eff:fe04:7f07 2001:db8::f816:3eff:fe04:7f07'], port_security=['fa:16:3e:04:7f:07 2001:db8:0:1:f816:3eff:fe04:7f07 2001:db8::f816:3eff:fe04:7f07'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe04:7f07/64 2001:db8::f816:3eff:fe04:7f07/64', 'neutron:device_id': 'ddf94a98-572f-4116-87fb-d46dc5f72174', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bc613044-b796-41b5-a7b0-c508f998d641', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f6eb0a2-d476-48e9-8756-79e6bbc84c15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:15:42 compute-0 NetworkManager[55036]: <info>  [1763799342.3694] manager: (tapd867b5ab-df): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.423 186548 INFO nova.virt.libvirt.driver [-] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Instance destroyed successfully.
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.424 186548 DEBUG nova.objects.instance [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid ddf94a98-572f-4116-87fb-d46dc5f72174 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.437 186548 DEBUG nova.virt.libvirt.vif [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:13:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-585014751',display_name='tempest-TestGettingAddress-server-585014751',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-585014751',id=139,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:14:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-00mzxd97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:14:11Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=ddf94a98-572f-4116-87fb-d46dc5f72174,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.438 186548 DEBUG nova.network.os_vif_util [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.438 186548 DEBUG nova.network.os_vif_util [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1a:d3:6f,bridge_name='br-int',has_traffic_filtering=True,id=f7a27b23-e94c-4aef-9b08-6d0ddb100265,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a27b23-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.439 186548 DEBUG os_vif [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1a:d3:6f,bridge_name='br-int',has_traffic_filtering=True,id=f7a27b23-e94c-4aef-9b08-6d0ddb100265,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a27b23-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.440 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.440 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a27b23-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.442 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.445 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.447 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.449 186548 INFO os_vif [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1a:d3:6f,bridge_name='br-int',has_traffic_filtering=True,id=f7a27b23-e94c-4aef-9b08-6d0ddb100265,network=Network(7e573664-04ba-4ce5-994a-9fb9483a2400),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a27b23-e9')
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.450 186548 DEBUG nova.virt.libvirt.vif [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:13:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-585014751',display_name='tempest-TestGettingAddress-server-585014751',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-585014751',id=139,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3h1SGyV3gSWS5sQ4nylu2AAOZq0lzcQgQV1fi/afTfgHNAebinpbyavWAHUC3BTYwehM8YAaeM76WaxgrKeLAhyjYG3qrO7DzWZz90S7erIXCzT/UdxFEeIFnV62ADrw==',key_name='tempest-TestGettingAddress-2002370226',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:14:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-00mzxd97',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:14:11Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=ddf94a98-572f-4116-87fb-d46dc5f72174,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.451 186548 DEBUG nova.network.os_vif_util [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.452 186548 DEBUG nova.network.os_vif_util [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:7f:07,bridge_name='br-int',has_traffic_filtering=True,id=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd867b5ab-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.452 186548 DEBUG os_vif [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:7f:07,bridge_name='br-int',has_traffic_filtering=True,id=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd867b5ab-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.453 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.453 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd867b5ab-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.455 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.461 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.463 186548 INFO os_vif [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:7f:07,bridge_name='br-int',has_traffic_filtering=True,id=d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3,network=Network(6c0a2255-6426-43c4-abc3-5c1857ba0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd867b5ab-df')
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.664 186548 INFO nova.virt.libvirt.driver [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Deleting instance files /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174_del
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.752 186548 INFO nova.virt.libvirt.driver [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Deletion of /var/lib/nova/instances/ddf94a98-572f-4116-87fb-d46dc5f72174_del complete
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.873 186548 DEBUG nova.compute.manager [req-1362769e-d619-4290-aa77-846e62ac983c req-e036e4ff-4669-456e-bb7e-074feed1534a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-unplugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.873 186548 DEBUG oslo_concurrency.lockutils [req-1362769e-d619-4290-aa77-846e62ac983c req-e036e4ff-4669-456e-bb7e-074feed1534a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.874 186548 DEBUG oslo_concurrency.lockutils [req-1362769e-d619-4290-aa77-846e62ac983c req-e036e4ff-4669-456e-bb7e-074feed1534a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.874 186548 DEBUG oslo_concurrency.lockutils [req-1362769e-d619-4290-aa77-846e62ac983c req-e036e4ff-4669-456e-bb7e-074feed1534a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.874 186548 DEBUG nova.compute.manager [req-1362769e-d619-4290-aa77-846e62ac983c req-e036e4ff-4669-456e-bb7e-074feed1534a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] No waiting events found dispatching network-vif-unplugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.875 186548 DEBUG nova.compute.manager [req-1362769e-d619-4290-aa77-846e62ac983c req-e036e4ff-4669-456e-bb7e-074feed1534a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-unplugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.941 186548 DEBUG nova.compute.manager [req-41f57575-7578-4afd-9670-15360b2d920d req-589406fc-c14d-4c7f-a935-0488426ccdf7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-unplugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.942 186548 DEBUG oslo_concurrency.lockutils [req-41f57575-7578-4afd-9670-15360b2d920d req-589406fc-c14d-4c7f-a935-0488426ccdf7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.942 186548 DEBUG oslo_concurrency.lockutils [req-41f57575-7578-4afd-9670-15360b2d920d req-589406fc-c14d-4c7f-a935-0488426ccdf7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.943 186548 DEBUG oslo_concurrency.lockutils [req-41f57575-7578-4afd-9670-15360b2d920d req-589406fc-c14d-4c7f-a935-0488426ccdf7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.943 186548 DEBUG nova.compute.manager [req-41f57575-7578-4afd-9670-15360b2d920d req-589406fc-c14d-4c7f-a935-0488426ccdf7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] No waiting events found dispatching network-vif-unplugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:15:42 compute-0 nova_compute[186544]: 2025-11-22 08:15:42.944 186548 DEBUG nova.compute.manager [req-41f57575-7578-4afd-9670-15360b2d920d req-589406fc-c14d-4c7f-a935-0488426ccdf7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-unplugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:15:43 compute-0 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[238837]: [NOTICE]   (238841) : haproxy version is 2.8.14-c23fe91
Nov 22 08:15:43 compute-0 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[238837]: [NOTICE]   (238841) : path to executable is /usr/sbin/haproxy
Nov 22 08:15:43 compute-0 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[238837]: [WARNING]  (238841) : Exiting Master process...
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.053 186548 INFO nova.compute.manager [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Took 0.92 seconds to destroy the instance on the hypervisor.
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.053 186548 DEBUG oslo.service.loopingcall [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.054 186548 DEBUG nova.compute.manager [-] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.054 186548 DEBUG nova.network.neutron [-] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:15:43 compute-0 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[238837]: [ALERT]    (238841) : Current worker (238843) exited with code 143 (Terminated)
Nov 22 08:15:43 compute-0 neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400[238837]: [WARNING]  (238841) : All workers exited. Exiting... (0)
Nov 22 08:15:43 compute-0 systemd[1]: libpod-4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b.scope: Deactivated successfully.
Nov 22 08:15:43 compute-0 podman[239527]: 2025-11-22 08:15:43.063391436 +0000 UTC m=+0.677786159 container died 4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:15:43 compute-0 ovn_controller[94843]: 2025-11-22T08:15:43Z|00653|binding|INFO|Releasing lport 32debc95-1c60-4d8b-9d74-79ae74c8f38f from this chassis (sb_readonly=0)
Nov 22 08:15:43 compute-0 ovn_controller[94843]: 2025-11-22T08:15:43Z|00654|binding|INFO|Releasing lport 433cf940-3b59-425c-aeb8-689a57de46c2 from this chassis (sb_readonly=0)
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.132 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:43 compute-0 ovn_controller[94843]: 2025-11-22T08:15:43Z|00655|binding|INFO|Releasing lport 32debc95-1c60-4d8b-9d74-79ae74c8f38f from this chassis (sb_readonly=0)
Nov 22 08:15:43 compute-0 ovn_controller[94843]: 2025-11-22T08:15:43Z|00656|binding|INFO|Releasing lport 433cf940-3b59-425c-aeb8-689a57de46c2 from this chassis (sb_readonly=0)
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.332 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b-userdata-shm.mount: Deactivated successfully.
Nov 22 08:15:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-8272f7621a95407b159d2d2e869c23e3a1a1d2942646f9986573be1d6048143c-merged.mount: Deactivated successfully.
Nov 22 08:15:43 compute-0 podman[239527]: 2025-11-22 08:15:43.578032986 +0000 UTC m=+1.192427719 container cleanup 4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:15:43 compute-0 systemd[1]: libpod-conmon-4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b.scope: Deactivated successfully.
Nov 22 08:15:43 compute-0 podman[239575]: 2025-11-22 08:15:43.852346307 +0000 UTC m=+0.249463916 container remove 4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.861 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[24edeee1-c7c3-4a10-9aa1-3b246f768cfe]: (4, ('Sat Nov 22 08:15:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 (4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b)\n4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b\nSat Nov 22 08:15:43 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 (4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b)\n4e6a0d95662b598132081efffea470175fb5f1d3d1c89f2bf21e0e496aab5d8b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.863 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5f177a2b-6160-446a-9a73-254108935e6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.865 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e573664-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.866 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:43 compute-0 kernel: tap7e573664-00: left promiscuous mode
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.868 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.873 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c979dbc-2f55-4799-b277-15f5ab5dc0ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:43 compute-0 nova_compute[186544]: 2025-11-22 08:15:43.881 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.888 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5d921e85-4fbc-4a80-815f-d28e07929dab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.890 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b0e450-2259-4fcf-a9fd-03a6349133fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.912 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ec32e09b-22ce-4775-94a4-675ab45b1b24]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599181, 'reachable_time': 31463, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239590, 'error': None, 'target': 'ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.915 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7e573664-04ba-4ce5-994a-9fb9483a2400 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.915 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[4e02da6d-8228-4e61-bca3-22ed1aed0e1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.916 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 in datapath 6c0a2255-6426-43c4-abc3-5c1857ba0a79 unbound from our chassis
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.917 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c0a2255-6426-43c4-abc3-5c1857ba0a79, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:15:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d7e573664\x2d04ba\x2d4ce5\x2d994a\x2d9fb9483a2400.mount: Deactivated successfully.
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.917 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[72ee43b0-23d8-4ec3-92df-8c0fa8c35d13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:43.918 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 namespace which is not needed anymore
Nov 22 08:15:44 compute-0 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[238909]: [NOTICE]   (238913) : haproxy version is 2.8.14-c23fe91
Nov 22 08:15:44 compute-0 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[238909]: [NOTICE]   (238913) : path to executable is /usr/sbin/haproxy
Nov 22 08:15:44 compute-0 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[238909]: [WARNING]  (238913) : Exiting Master process...
Nov 22 08:15:44 compute-0 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[238909]: [ALERT]    (238913) : Current worker (238915) exited with code 143 (Terminated)
Nov 22 08:15:44 compute-0 neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79[238909]: [WARNING]  (238913) : All workers exited. Exiting... (0)
Nov 22 08:15:44 compute-0 systemd[1]: libpod-e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b.scope: Deactivated successfully.
Nov 22 08:15:44 compute-0 podman[239606]: 2025-11-22 08:15:44.242374012 +0000 UTC m=+0.213717822 container died e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:15:44 compute-0 nova_compute[186544]: 2025-11-22 08:15:44.375 186548 DEBUG nova.network.neutron [req-66c9422d-23bb-4f69-b19d-3fcd0bc6d972 req-1044096a-7ff4-4feb-83b4-c55ca1b98379 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updated VIF entry in instance network info cache for port f7a27b23-e94c-4aef-9b08-6d0ddb100265. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:15:44 compute-0 nova_compute[186544]: 2025-11-22 08:15:44.377 186548 DEBUG nova.network.neutron [req-66c9422d-23bb-4f69-b19d-3fcd0bc6d972 req-1044096a-7ff4-4feb-83b4-c55ca1b98379 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updating instance_info_cache with network_info: [{"id": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "address": "fa:16:3e:1a:d3:6f", "network": {"id": "7e573664-04ba-4ce5-994a-9fb9483a2400", "bridge": "br-int", "label": "tempest-network-smoke--447291836", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a27b23-e9", "ovs_interfaceid": "f7a27b23-e94c-4aef-9b08-6d0ddb100265", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "address": "fa:16:3e:04:7f:07", "network": {"id": "6c0a2255-6426-43c4-abc3-5c1857ba0a79", "bridge": "br-int", "label": "tempest-network-smoke--277865928", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe04:7f07", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd867b5ab-df", "ovs_interfaceid": "d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:15:44 compute-0 nova_compute[186544]: 2025-11-22 08:15:44.438 186548 DEBUG oslo_concurrency.lockutils [req-66c9422d-23bb-4f69-b19d-3fcd0bc6d972 req-1044096a-7ff4-4feb-83b4-c55ca1b98379 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ddf94a98-572f-4116-87fb-d46dc5f72174" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:15:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b-userdata-shm.mount: Deactivated successfully.
Nov 22 08:15:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-285b94d6409d77da5bd4c59a325022e9369d30a0acb0556c1047f3c47b9967fe-merged.mount: Deactivated successfully.
Nov 22 08:15:44 compute-0 podman[239606]: 2025-11-22 08:15:44.742637105 +0000 UTC m=+0.713980915 container cleanup e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:15:44 compute-0 systemd[1]: libpod-conmon-e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b.scope: Deactivated successfully.
Nov 22 08:15:45 compute-0 podman[239635]: 2025-11-22 08:15:45.077555916 +0000 UTC m=+0.315344567 container remove e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.086 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[918c6314-74c8-4afb-b525-57bec108d868]: (4, ('Sat Nov 22 08:15:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 (e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b)\ne6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b\nSat Nov 22 08:15:44 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 (e6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b)\ne6a5dc09a7ea385e3a95c5a3bfa05781d93aece7934879f01774d555c042899b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.088 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7cace481-e7d8-4206-ad29-9af5bc41f7b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.090 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c0a2255-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:15:45 compute-0 nova_compute[186544]: 2025-11-22 08:15:45.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:45 compute-0 kernel: tap6c0a2255-60: left promiscuous mode
Nov 22 08:15:45 compute-0 nova_compute[186544]: 2025-11-22 08:15:45.103 186548 DEBUG nova.compute.manager [req-ebac8eb5-e36a-47f2-bd27-e7292b3fd0bd req-5decc60a-c8f7-456a-afe7-b529d2d871a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:45 compute-0 nova_compute[186544]: 2025-11-22 08:15:45.103 186548 DEBUG oslo_concurrency.lockutils [req-ebac8eb5-e36a-47f2-bd27-e7292b3fd0bd req-5decc60a-c8f7-456a-afe7-b529d2d871a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:45 compute-0 nova_compute[186544]: 2025-11-22 08:15:45.104 186548 DEBUG oslo_concurrency.lockutils [req-ebac8eb5-e36a-47f2-bd27-e7292b3fd0bd req-5decc60a-c8f7-456a-afe7-b529d2d871a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:45 compute-0 nova_compute[186544]: 2025-11-22 08:15:45.104 186548 DEBUG oslo_concurrency.lockutils [req-ebac8eb5-e36a-47f2-bd27-e7292b3fd0bd req-5decc60a-c8f7-456a-afe7-b529d2d871a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:45 compute-0 nova_compute[186544]: 2025-11-22 08:15:45.104 186548 DEBUG nova.compute.manager [req-ebac8eb5-e36a-47f2-bd27-e7292b3fd0bd req-5decc60a-c8f7-456a-afe7-b529d2d871a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] No waiting events found dispatching network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:15:45 compute-0 nova_compute[186544]: 2025-11-22 08:15:45.104 186548 WARNING nova.compute.manager [req-ebac8eb5-e36a-47f2-bd27-e7292b3fd0bd req-5decc60a-c8f7-456a-afe7-b529d2d871a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received unexpected event network-vif-plugged-f7a27b23-e94c-4aef-9b08-6d0ddb100265 for instance with vm_state active and task_state deleting.
Nov 22 08:15:45 compute-0 nova_compute[186544]: 2025-11-22 08:15:45.105 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.108 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[392ab8f4-dbf2-4227-a116-7f284bef4639]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.125 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[89905b3d-6c54-497f-8273-1ba866f34a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.127 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a58746ae-e4c7-4cb1-9cb6-4f828d31969e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.151 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ef3ded7a-3dab-4d5b-93e1-eea07fd5039d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599279, 'reachable_time': 20212, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239651, 'error': None, 'target': 'ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.155 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c0a2255-6426-43c4-abc3-5c1857ba0a79 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:15:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:45.155 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[29685fcc-ecb6-463e-a454-cdc9c8496d42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:15:45 compute-0 systemd[1]: run-netns-ovnmeta\x2d6c0a2255\x2d6426\x2d43c4\x2dabc3\x2d5c1857ba0a79.mount: Deactivated successfully.
Nov 22 08:15:46 compute-0 nova_compute[186544]: 2025-11-22 08:15:46.030 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:47 compute-0 nova_compute[186544]: 2025-11-22 08:15:47.455 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:48 compute-0 podman[239653]: 2025-11-22 08:15:48.435187534 +0000 UTC m=+0.067482461 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:15:48 compute-0 podman[239654]: 2025-11-22 08:15:48.444220539 +0000 UTC m=+0.066511048 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:15:48 compute-0 podman[239652]: 2025-11-22 08:15:48.446483764 +0000 UTC m=+0.083529778 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 08:15:48 compute-0 podman[239660]: 2025-11-22 08:15:48.476109437 +0000 UTC m=+0.096228692 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 22 08:15:49 compute-0 nova_compute[186544]: 2025-11-22 08:15:49.228 186548 DEBUG nova.network.neutron [-] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:15:49 compute-0 nova_compute[186544]: 2025-11-22 08:15:49.421 186548 INFO nova.compute.manager [-] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Took 6.37 seconds to deallocate network for instance.
Nov 22 08:15:49 compute-0 nova_compute[186544]: 2025-11-22 08:15:49.735 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:49 compute-0 nova_compute[186544]: 2025-11-22 08:15:49.736 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.007 186548 DEBUG nova.compute.provider_tree [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.027 186548 DEBUG nova.scheduler.client.report [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.086 186548 DEBUG nova.compute.manager [req-03972e13-27c8-4319-b84c-65f9b13bb27e req-9fabb990-dee4-4d2a-8181-6e2d8fe86c5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.086 186548 DEBUG oslo_concurrency.lockutils [req-03972e13-27c8-4319-b84c-65f9b13bb27e req-9fabb990-dee4-4d2a-8181-6e2d8fe86c5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.087 186548 DEBUG oslo_concurrency.lockutils [req-03972e13-27c8-4319-b84c-65f9b13bb27e req-9fabb990-dee4-4d2a-8181-6e2d8fe86c5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.087 186548 DEBUG oslo_concurrency.lockutils [req-03972e13-27c8-4319-b84c-65f9b13bb27e req-9fabb990-dee4-4d2a-8181-6e2d8fe86c5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.087 186548 DEBUG nova.compute.manager [req-03972e13-27c8-4319-b84c-65f9b13bb27e req-9fabb990-dee4-4d2a-8181-6e2d8fe86c5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] No waiting events found dispatching network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.087 186548 WARNING nova.compute.manager [req-03972e13-27c8-4319-b84c-65f9b13bb27e req-9fabb990-dee4-4d2a-8181-6e2d8fe86c5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received unexpected event network-vif-plugged-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 for instance with vm_state deleted and task_state None.
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.088 186548 DEBUG nova.compute.manager [req-03972e13-27c8-4319-b84c-65f9b13bb27e req-9fabb990-dee4-4d2a-8181-6e2d8fe86c5e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-deleted-f7a27b23-e94c-4aef-9b08-6d0ddb100265 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.219 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.429 186548 INFO nova.scheduler.client.report [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance ddf94a98-572f-4116-87fb-d46dc5f72174
Nov 22 08:15:50 compute-0 nova_compute[186544]: 2025-11-22 08:15:50.735 186548 DEBUG oslo_concurrency.lockutils [None req-50d9be94-f02a-4bf6-ae29-0bb54bae0603 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "ddf94a98-572f-4116-87fb-d46dc5f72174" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:15:51 compute-0 nova_compute[186544]: 2025-11-22 08:15:51.031 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:51.806 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:15:51 compute-0 nova_compute[186544]: 2025-11-22 08:15:51.807 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:51.807 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:15:52 compute-0 nova_compute[186544]: 2025-11-22 08:15:52.239 186548 DEBUG nova.compute.manager [req-532ac7df-c37f-4276-ab72-49217bc99c99 req-bfb1ee08-09f9-4f65-8ec3-ac7be23a7226 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Received event network-vif-deleted-d867b5ab-dfbe-4189-8dbb-ed5c62d72ad3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:15:52 compute-0 nova_compute[186544]: 2025-11-22 08:15:52.457 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:15:52.809 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:15:56 compute-0 nova_compute[186544]: 2025-11-22 08:15:56.033 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:57 compute-0 nova_compute[186544]: 2025-11-22 08:15:57.422 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799342.4210508, ddf94a98-572f-4116-87fb-d46dc5f72174 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:15:57 compute-0 nova_compute[186544]: 2025-11-22 08:15:57.422 186548 INFO nova.compute.manager [-] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] VM Stopped (Lifecycle Event)
Nov 22 08:15:57 compute-0 nova_compute[186544]: 2025-11-22 08:15:57.459 186548 DEBUG nova.compute.manager [None req-6675b1d7-67bd-4ef2-820f-6b9676bc15e8 - - - - - -] [instance: ddf94a98-572f-4116-87fb-d46dc5f72174] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:15:57 compute-0 nova_compute[186544]: 2025-11-22 08:15:57.460 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:15:59 compute-0 podman[239741]: 2025-11-22 08:15:59.432164884 +0000 UTC m=+0.077538360 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=multipathd)
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.191 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.403 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.405 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5718MB free_disk=73.1367301940918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.405 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:16:00 compute-0 nova_compute[186544]: 2025-11-22 08:16:00.405 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:16:01 compute-0 nova_compute[186544]: 2025-11-22 08:16:01.036 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:01 compute-0 anacron[30858]: Job `cron.monthly' started
Nov 22 08:16:02 compute-0 podman[239765]: 2025-11-22 08:16:02.449292153 +0000 UTC m=+0.076430062 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:16:02 compute-0 podman[239766]: 2025-11-22 08:16:02.456668066 +0000 UTC m=+0.081001316 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.expose-services=, config_id=edpm, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 22 08:16:02 compute-0 nova_compute[186544]: 2025-11-22 08:16:02.461 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:03 compute-0 nova_compute[186544]: 2025-11-22 08:16:03.764 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:16:03 compute-0 nova_compute[186544]: 2025-11-22 08:16:03.765 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.077 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.329 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.330 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.345 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.372 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.399 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.420 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.516 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:16:04 compute-0 nova_compute[186544]: 2025-11-22 08:16:04.517 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:16:04 compute-0 anacron[30858]: Job `cron.monthly' terminated
Nov 22 08:16:04 compute-0 anacron[30858]: Normal exit (3 jobs run)
Nov 22 08:16:05 compute-0 nova_compute[186544]: 2025-11-22 08:16:05.512 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:05 compute-0 nova_compute[186544]: 2025-11-22 08:16:05.513 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:05 compute-0 nova_compute[186544]: 2025-11-22 08:16:05.513 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:16:05 compute-0 nova_compute[186544]: 2025-11-22 08:16:05.529 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:16:05 compute-0 nova_compute[186544]: 2025-11-22 08:16:05.530 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:05 compute-0 nova_compute[186544]: 2025-11-22 08:16:05.530 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:05 compute-0 nova_compute[186544]: 2025-11-22 08:16:05.530 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:16:06 compute-0 nova_compute[186544]: 2025-11-22 08:16:06.037 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:07 compute-0 nova_compute[186544]: 2025-11-22 08:16:07.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:08 compute-0 nova_compute[186544]: 2025-11-22 08:16:08.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:08 compute-0 nova_compute[186544]: 2025-11-22 08:16:08.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:10 compute-0 nova_compute[186544]: 2025-11-22 08:16:10.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:10 compute-0 nova_compute[186544]: 2025-11-22 08:16:10.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:11 compute-0 nova_compute[186544]: 2025-11-22 08:16:11.038 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:12 compute-0 nova_compute[186544]: 2025-11-22 08:16:12.463 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:16 compute-0 nova_compute[186544]: 2025-11-22 08:16:16.039 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:17 compute-0 nova_compute[186544]: 2025-11-22 08:16:17.465 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:19 compute-0 podman[239809]: 2025-11-22 08:16:19.419747795 +0000 UTC m=+0.066826706 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 08:16:19 compute-0 podman[239816]: 2025-11-22 08:16:19.442390485 +0000 UTC m=+0.068322803 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:16:19 compute-0 podman[239810]: 2025-11-22 08:16:19.442398495 +0000 UTC m=+0.073063279 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:16:19 compute-0 podman[239817]: 2025-11-22 08:16:19.474239534 +0000 UTC m=+0.101682479 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 22 08:16:21 compute-0 nova_compute[186544]: 2025-11-22 08:16:21.040 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:22 compute-0 nova_compute[186544]: 2025-11-22 08:16:22.467 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:26 compute-0 nova_compute[186544]: 2025-11-22 08:16:26.043 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:26 compute-0 nova_compute[186544]: 2025-11-22 08:16:26.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:16:27 compute-0 nova_compute[186544]: 2025-11-22 08:16:27.469 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:30 compute-0 podman[239894]: 2025-11-22 08:16:30.429587472 +0000 UTC m=+0.083577390 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:16:31 compute-0 nova_compute[186544]: 2025-11-22 08:16:31.044 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:32 compute-0 nova_compute[186544]: 2025-11-22 08:16:32.471 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:33 compute-0 podman[239916]: 2025-11-22 08:16:33.396114749 +0000 UTC m=+0.045669392 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:16:33 compute-0 podman[239917]: 2025-11-22 08:16:33.40704195 +0000 UTC m=+0.052036380 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public)
Nov 22 08:16:34 compute-0 nova_compute[186544]: 2025-11-22 08:16:34.920 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:16:34.923 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:16:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:16:34.924 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:16:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:16:34.924 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:16:36 compute-0 nova_compute[186544]: 2025-11-22 08:16:36.046 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.599 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:16:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:16:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:16:37.343 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:16:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:16:37.343 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:16:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:16:37.344 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:16:37 compute-0 nova_compute[186544]: 2025-11-22 08:16:37.473 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:41 compute-0 nova_compute[186544]: 2025-11-22 08:16:41.049 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:42 compute-0 nova_compute[186544]: 2025-11-22 08:16:42.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:46 compute-0 nova_compute[186544]: 2025-11-22 08:16:46.052 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:47 compute-0 nova_compute[186544]: 2025-11-22 08:16:47.478 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:50 compute-0 podman[239964]: 2025-11-22 08:16:50.405958404 +0000 UTC m=+0.055437763 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 22 08:16:50 compute-0 podman[239965]: 2025-11-22 08:16:50.406088658 +0000 UTC m=+0.050663516 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:16:50 compute-0 podman[239963]: 2025-11-22 08:16:50.424182565 +0000 UTC m=+0.069352207 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 08:16:50 compute-0 podman[239966]: 2025-11-22 08:16:50.437582578 +0000 UTC m=+0.080194516 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 22 08:16:51 compute-0 nova_compute[186544]: 2025-11-22 08:16:51.054 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:52 compute-0 nova_compute[186544]: 2025-11-22 08:16:52.480 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:53 compute-0 nova_compute[186544]: 2025-11-22 08:16:53.720 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:16:53 compute-0 nova_compute[186544]: 2025-11-22 08:16:53.720 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:16:53 compute-0 nova_compute[186544]: 2025-11-22 08:16:53.759 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:16:53 compute-0 nova_compute[186544]: 2025-11-22 08:16:53.872 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:16:53 compute-0 nova_compute[186544]: 2025-11-22 08:16:53.872 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:16:53 compute-0 nova_compute[186544]: 2025-11-22 08:16:53.878 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:16:53 compute-0 nova_compute[186544]: 2025-11-22 08:16:53.878 186548 INFO nova.compute.claims [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.029 186548 DEBUG nova.compute.provider_tree [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.054 186548 DEBUG nova.scheduler.client.report [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.081 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.082 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.145 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.145 186548 DEBUG nova.network.neutron [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.161 186548 INFO nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.178 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.298 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.299 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.299 186548 INFO nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Creating image(s)
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.300 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.300 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.301 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.313 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.369 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.370 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.371 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.385 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.440 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.441 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.461 186548 DEBUG nova.policy [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.750 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk 1073741824" returned: 0 in 0.308s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.751 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.751 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.803 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.804 186548 DEBUG nova.virt.disk.api [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.805 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.862 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.863 186548 DEBUG nova.virt.disk.api [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.864 186548 DEBUG nova.objects.instance [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 042b3132-f4d0-455f-a591-712ce4fd2839 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.876 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.876 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Ensure instance console log exists: /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.877 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.877 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:16:54 compute-0 nova_compute[186544]: 2025-11-22 08:16:54.877 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:16:56 compute-0 nova_compute[186544]: 2025-11-22 08:16:56.056 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:56 compute-0 nova_compute[186544]: 2025-11-22 08:16:56.162 186548 DEBUG nova.network.neutron [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Successfully created port: 328a2b4f-54d4-42e2-b943-54d2e4ec2bae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:16:56 compute-0 nova_compute[186544]: 2025-11-22 08:16:56.893 186548 DEBUG nova.network.neutron [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Successfully created port: 46205a33-4ef2-438e-9ea8-e4918a0d6e9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:16:57 compute-0 nova_compute[186544]: 2025-11-22 08:16:57.482 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:16:58 compute-0 nova_compute[186544]: 2025-11-22 08:16:58.103 186548 DEBUG nova.network.neutron [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Successfully updated port: 328a2b4f-54d4-42e2-b943-54d2e4ec2bae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:16:58 compute-0 nova_compute[186544]: 2025-11-22 08:16:58.439 186548 DEBUG nova.compute.manager [req-0e5b235e-f26c-4059-9f19-91fcd3697881 req-c98b233d-bac1-44b2-8df7-7054b4b3d79d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-changed-328a2b4f-54d4-42e2-b943-54d2e4ec2bae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:16:58 compute-0 nova_compute[186544]: 2025-11-22 08:16:58.440 186548 DEBUG nova.compute.manager [req-0e5b235e-f26c-4059-9f19-91fcd3697881 req-c98b233d-bac1-44b2-8df7-7054b4b3d79d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Refreshing instance network info cache due to event network-changed-328a2b4f-54d4-42e2-b943-54d2e4ec2bae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:16:58 compute-0 nova_compute[186544]: 2025-11-22 08:16:58.440 186548 DEBUG oslo_concurrency.lockutils [req-0e5b235e-f26c-4059-9f19-91fcd3697881 req-c98b233d-bac1-44b2-8df7-7054b4b3d79d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:16:58 compute-0 nova_compute[186544]: 2025-11-22 08:16:58.441 186548 DEBUG oslo_concurrency.lockutils [req-0e5b235e-f26c-4059-9f19-91fcd3697881 req-c98b233d-bac1-44b2-8df7-7054b4b3d79d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:16:58 compute-0 nova_compute[186544]: 2025-11-22 08:16:58.441 186548 DEBUG nova.network.neutron [req-0e5b235e-f26c-4059-9f19-91fcd3697881 req-c98b233d-bac1-44b2-8df7-7054b4b3d79d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Refreshing network info cache for port 328a2b4f-54d4-42e2-b943-54d2e4ec2bae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:16:58 compute-0 nova_compute[186544]: 2025-11-22 08:16:58.685 186548 DEBUG nova.network.neutron [req-0e5b235e-f26c-4059-9f19-91fcd3697881 req-c98b233d-bac1-44b2-8df7-7054b4b3d79d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:16:59 compute-0 nova_compute[186544]: 2025-11-22 08:16:59.238 186548 DEBUG nova.network.neutron [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Successfully updated port: 46205a33-4ef2-438e-9ea8-e4918a0d6e9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:16:59 compute-0 nova_compute[186544]: 2025-11-22 08:16:59.257 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:16:59 compute-0 nova_compute[186544]: 2025-11-22 08:16:59.323 186548 DEBUG nova.network.neutron [req-0e5b235e-f26c-4059-9f19-91fcd3697881 req-c98b233d-bac1-44b2-8df7-7054b4b3d79d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:16:59 compute-0 nova_compute[186544]: 2025-11-22 08:16:59.341 186548 DEBUG oslo_concurrency.lockutils [req-0e5b235e-f26c-4059-9f19-91fcd3697881 req-c98b233d-bac1-44b2-8df7-7054b4b3d79d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:16:59 compute-0 nova_compute[186544]: 2025-11-22 08:16:59.341 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:16:59 compute-0 nova_compute[186544]: 2025-11-22 08:16:59.341 186548 DEBUG nova.network.neutron [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:16:59 compute-0 nova_compute[186544]: 2025-11-22 08:16:59.553 186548 DEBUG nova.network.neutron [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:17:00 compute-0 nova_compute[186544]: 2025-11-22 08:17:00.734 186548 DEBUG nova.compute.manager [req-d80c6ea1-d440-4ff1-9bf2-2b010f49f99a req-3c549790-f248-4468-95e3-4694b5312d85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-changed-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:17:00 compute-0 nova_compute[186544]: 2025-11-22 08:17:00.735 186548 DEBUG nova.compute.manager [req-d80c6ea1-d440-4ff1-9bf2-2b010f49f99a req-3c549790-f248-4468-95e3-4694b5312d85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Refreshing instance network info cache due to event network-changed-46205a33-4ef2-438e-9ea8-e4918a0d6e9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:17:00 compute-0 nova_compute[186544]: 2025-11-22 08:17:00.735 186548 DEBUG oslo_concurrency.lockutils [req-d80c6ea1-d440-4ff1-9bf2-2b010f49f99a req-3c549790-f248-4468-95e3-4694b5312d85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:17:01 compute-0 nova_compute[186544]: 2025-11-22 08:17:01.058 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:01 compute-0 nova_compute[186544]: 2025-11-22 08:17:01.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:01 compute-0 nova_compute[186544]: 2025-11-22 08:17:01.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:17:01 compute-0 nova_compute[186544]: 2025-11-22 08:17:01.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:17:01 compute-0 nova_compute[186544]: 2025-11-22 08:17:01.185 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 08:17:01 compute-0 nova_compute[186544]: 2025-11-22 08:17:01.186 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:17:01 compute-0 podman[240064]: 2025-11-22 08:17:01.400241524 +0000 UTC m=+0.047709279 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.188 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.188 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.346 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.347 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5711MB free_disk=73.13653564453125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.348 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.348 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.427 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 042b3132-f4d0-455f-a591-712ce4fd2839 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.427 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.428 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.465 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.479 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.483 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.526 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:17:02 compute-0 nova_compute[186544]: 2025-11-22 08:17:02.527 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.527 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.527 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.528 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.858 186548 DEBUG nova.network.neutron [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updating instance_info_cache with network_info: [{"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.923 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.924 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Instance network_info: |[{"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.924 186548 DEBUG oslo_concurrency.lockutils [req-d80c6ea1-d440-4ff1-9bf2-2b010f49f99a req-3c549790-f248-4468-95e3-4694b5312d85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.924 186548 DEBUG nova.network.neutron [req-d80c6ea1-d440-4ff1-9bf2-2b010f49f99a req-3c549790-f248-4468-95e3-4694b5312d85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Refreshing network info cache for port 46205a33-4ef2-438e-9ea8-e4918a0d6e9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.927 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Start _get_guest_xml network_info=[{"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.931 186548 WARNING nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.936 186548 DEBUG nova.virt.libvirt.host [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.936 186548 DEBUG nova.virt.libvirt.host [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.942 186548 DEBUG nova.virt.libvirt.host [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.942 186548 DEBUG nova.virt.libvirt.host [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.944 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.944 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.944 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.944 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.945 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.945 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.945 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.945 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.945 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.946 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.946 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.946 186548 DEBUG nova.virt.hardware [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.949 186548 DEBUG nova.virt.libvirt.vif [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-332737681',display_name='tempest-TestGettingAddress-server-332737681',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-332737681',id=147,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-fgzl09bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:16:54Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=042b3132-f4d0-455f-a591-712ce4fd2839,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.949 186548 DEBUG nova.network.os_vif_util [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.950 186548 DEBUG nova.network.os_vif_util [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:97:79,bridge_name='br-int',has_traffic_filtering=True,id=328a2b4f-54d4-42e2-b943-54d2e4ec2bae,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap328a2b4f-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.951 186548 DEBUG nova.virt.libvirt.vif [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-332737681',display_name='tempest-TestGettingAddress-server-332737681',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-332737681',id=147,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-fgzl09bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:16:54Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=042b3132-f4d0-455f-a591-712ce4fd2839,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.951 186548 DEBUG nova.network.os_vif_util [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.951 186548 DEBUG nova.network.os_vif_util [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:20:0a,bridge_name='br-int',has_traffic_filtering=True,id=46205a33-4ef2-438e-9ea8-e4918a0d6e9b,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46205a33-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.952 186548 DEBUG nova.objects.instance [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 042b3132-f4d0-455f-a591-712ce4fd2839 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.965 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <uuid>042b3132-f4d0-455f-a591-712ce4fd2839</uuid>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <name>instance-00000093</name>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <nova:name>tempest-TestGettingAddress-server-332737681</nova:name>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:17:03</nova:creationTime>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:port uuid="328a2b4f-54d4-42e2-b943-54d2e4ec2bae">
Nov 22 08:17:03 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         <nova:port uuid="46205a33-4ef2-438e-9ea8-e4918a0d6e9b">
Nov 22 08:17:03 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fed5:200a" ipVersion="6"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <system>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <entry name="serial">042b3132-f4d0-455f-a591-712ce4fd2839</entry>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <entry name="uuid">042b3132-f4d0-455f-a591-712ce4fd2839</entry>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </system>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <os>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   </os>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <features>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   </features>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk.config"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:82:97:79"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <target dev="tap328a2b4f-54"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:d5:20:0a"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <target dev="tap46205a33-4e"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/console.log" append="off"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <video>
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </video>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:17:03 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:17:03 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:17:03 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:17:03 compute-0 nova_compute[186544]: </domain>
Nov 22 08:17:03 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.966 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Preparing to wait for external event network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.966 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.966 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.967 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.967 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Preparing to wait for external event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.967 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.967 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.968 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.968 186548 DEBUG nova.virt.libvirt.vif [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-332737681',display_name='tempest-TestGettingAddress-server-332737681',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-332737681',id=147,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-fgzl09bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:16:54Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=042b3132-f4d0-455f-a591-712ce4fd2839,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.968 186548 DEBUG nova.network.os_vif_util [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.969 186548 DEBUG nova.network.os_vif_util [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:97:79,bridge_name='br-int',has_traffic_filtering=True,id=328a2b4f-54d4-42e2-b943-54d2e4ec2bae,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap328a2b4f-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.971 186548 DEBUG os_vif [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:97:79,bridge_name='br-int',has_traffic_filtering=True,id=328a2b4f-54d4-42e2-b943-54d2e4ec2bae,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap328a2b4f-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.971 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.972 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.973 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.974 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap328a2b4f-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.974 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap328a2b4f-54, col_values=(('external_ids', {'iface-id': '328a2b4f-54d4-42e2-b943-54d2e4ec2bae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:97:79', 'vm-uuid': '042b3132-f4d0-455f-a591-712ce4fd2839'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.976 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:03 compute-0 NetworkManager[55036]: <info>  [1763799423.9772] manager: (tap328a2b4f-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.978 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.983 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.984 186548 INFO os_vif [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:97:79,bridge_name='br-int',has_traffic_filtering=True,id=328a2b4f-54d4-42e2-b943-54d2e4ec2bae,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap328a2b4f-54')
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.985 186548 DEBUG nova.virt.libvirt.vif [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-332737681',display_name='tempest-TestGettingAddress-server-332737681',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-332737681',id=147,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-fgzl09bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:16:54Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=042b3132-f4d0-455f-a591-712ce4fd2839,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.985 186548 DEBUG nova.network.os_vif_util [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.986 186548 DEBUG nova.network.os_vif_util [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:20:0a,bridge_name='br-int',has_traffic_filtering=True,id=46205a33-4ef2-438e-9ea8-e4918a0d6e9b,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46205a33-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.986 186548 DEBUG os_vif [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:20:0a,bridge_name='br-int',has_traffic_filtering=True,id=46205a33-4ef2-438e-9ea8-e4918a0d6e9b,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46205a33-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.986 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.987 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.987 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.989 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap46205a33-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.990 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap46205a33-4e, col_values=(('external_ids', {'iface-id': '46205a33-4ef2-438e-9ea8-e4918a0d6e9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:20:0a', 'vm-uuid': '042b3132-f4d0-455f-a591-712ce4fd2839'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.991 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:03 compute-0 NetworkManager[55036]: <info>  [1763799423.9920] manager: (tap46205a33-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.993 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.998 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:03 compute-0 nova_compute[186544]: 2025-11-22 08:17:03.999 186548 INFO os_vif [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:20:0a,bridge_name='br-int',has_traffic_filtering=True,id=46205a33-4ef2-438e-9ea8-e4918a0d6e9b,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46205a33-4e')
Nov 22 08:17:04 compute-0 nova_compute[186544]: 2025-11-22 08:17:04.176 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:17:04 compute-0 nova_compute[186544]: 2025-11-22 08:17:04.176 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:17:04 compute-0 nova_compute[186544]: 2025-11-22 08:17:04.176 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:82:97:79, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:17:04 compute-0 nova_compute[186544]: 2025-11-22 08:17:04.177 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:d5:20:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:17:04 compute-0 nova_compute[186544]: 2025-11-22 08:17:04.177 186548 INFO nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Using config drive
Nov 22 08:17:04 compute-0 podman[240088]: 2025-11-22 08:17:04.405610912 +0000 UTC m=+0.050554739 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:17:04 compute-0 podman[240089]: 2025-11-22 08:17:04.406836973 +0000 UTC m=+0.049798901 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible)
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.166 186548 INFO nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Creating config drive at /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk.config
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.171 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzh4409j9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.294 186548 DEBUG oslo_concurrency.processutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzh4409j9" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.3378] manager: (tap328a2b4f-54): new Tun device (/org/freedesktop/NetworkManager/Devices/304)
Nov 22 08:17:05 compute-0 kernel: tap328a2b4f-54: entered promiscuous mode
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00657|binding|INFO|Claiming lport 328a2b4f-54d4-42e2-b943-54d2e4ec2bae for this chassis.
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00658|binding|INFO|328a2b4f-54d4-42e2-b943-54d2e4ec2bae: Claiming fa:16:3e:82:97:79 10.100.0.4
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.341 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.3509] manager: (tap46205a33-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Nov 22 08:17:05 compute-0 kernel: tap46205a33-4e: entered promiscuous mode
Nov 22 08:17:05 compute-0 systemd-udevd[240153]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:17:05 compute-0 systemd-udevd[240152]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.3839] device (tap46205a33-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.3847] device (tap328a2b4f-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.3853] device (tap46205a33-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.3856] device (tap328a2b4f-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:17:05 compute-0 systemd-machined[152872]: New machine qemu-79-instance-00000093.
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.418 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00659|if_status|INFO|Dropped 1 log messages in last 175 seconds (most recently, 175 seconds ago) due to excessive rate
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00660|if_status|INFO|Not updating pb chassis for 46205a33-4ef2-438e-9ea8-e4918a0d6e9b now as sb is readonly
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.422 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.426 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00661|binding|INFO|Claiming lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b for this chassis.
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00662|binding|INFO|46205a33-4ef2-438e-9ea8-e4918a0d6e9b: Claiming fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.435 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:97:79 10.100.0.4'], port_security=['fa:16:3e:82:97:79 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28285a99-0933-48f9-aee6-f1e507bcd777', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ce7fc5f-5ca9-4729-bcf9-4866d6397f92, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=328a2b4f-54d4-42e2-b943-54d2e4ec2bae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00663|binding|INFO|Setting lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b ovn-installed in OVS
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00664|binding|INFO|Setting lport 328a2b4f-54d4-42e2-b943-54d2e4ec2bae ovn-installed in OVS
Nov 22 08:17:05 compute-0 systemd[1]: Started Virtual Machine qemu-79-instance-00000093.
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.436 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 328a2b4f-54d4-42e2-b943-54d2e4ec2bae in datapath 28285a99-0933-48f9-aee6-f1e507bcd777 bound to our chassis
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.437 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28285a99-0933-48f9-aee6-f1e507bcd777
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.446 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b17ee0-520e-4511-a2e7-415a0fef5e0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.447 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap28285a99-01 in ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.448 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap28285a99-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.448 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb4528b-b520-4a82-9b24-0ba40ccfb341]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.448 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a5336f45-2fa9-40b6-aa3d-51cdbf707470]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.458 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[5fba7f3a-918f-4964-b977-8a4967d001b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.480 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[47a2db6b-0888-4602-8e63-cf04e7c2dd7b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.510 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[54509dc2-4a35-43e3-b161-9f9a05417999]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.5173] manager: (tap28285a99-00): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.517 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0e41a6a3-2280-49a2-beab-205eab322664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00665|binding|INFO|Setting lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b up in Southbound
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00666|binding|INFO|Setting lport 328a2b4f-54d4-42e2-b943-54d2e4ec2bae up in Southbound
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.545 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a'], port_security=['fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed5:200a/64', 'neutron:device_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-206a04da-ce2f-48ff-99c7-e70706547580', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=450e32a6-ae0a-4cd4-b338-c697096c146f, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=46205a33-4ef2-438e-9ea8-e4918a0d6e9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.549 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[bc83ed06-0510-4d89-8ecb-bf70f0a7cf38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.552 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[80db171c-dcd3-4115-af62-9536b4a15209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.5738] device (tap28285a99-00): carrier: link connected
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.578 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[44ea13a1-0ce6-4f38-9c76-c6f575b0d43b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.595 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[504fa3da-48c9-42c5-bdbe-a07ea44ebf91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28285a99-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:93:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616621, 'reachable_time': 28655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240189, 'error': None, 'target': 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.614 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[653bcb1b-bc5f-4d13-b00c-79fc7e345739]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:938c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616621, 'tstamp': 616621}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240190, 'error': None, 'target': 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.629 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[68254fd1-0d76-4aee-8ed2-5257168e844d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28285a99-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:93:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 203], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616621, 'reachable_time': 28655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240195, 'error': None, 'target': 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.661 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5e496f-9fb4-4144-b5da-b0827629dbf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.709 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799425.7087305, 042b3132-f4d0-455f-a591-712ce4fd2839 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.709 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] VM Started (Lifecycle Event)
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.720 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9262a1-106b-49c1-a50a-e5e7f15dd727]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.721 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28285a99-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.722 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.722 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28285a99-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:05 compute-0 kernel: tap28285a99-00: entered promiscuous mode
Nov 22 08:17:05 compute-0 NetworkManager[55036]: <info>  [1763799425.7250] manager: (tap28285a99-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.725 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.729 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28285a99-00, col_values=(('external_ids', {'iface-id': '44730bfb-6390-4f63-a416-89f912674e46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:05 compute-0 ovn_controller[94843]: 2025-11-22T08:17:05Z|00667|binding|INFO|Releasing lport 44730bfb-6390-4f63-a416-89f912674e46 from this chassis (sb_readonly=0)
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.730 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.732 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/28285a99-0933-48f9-aee6-f1e507bcd777.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/28285a99-0933-48f9-aee6-f1e507bcd777.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.732 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[06824132-46a5-48b1-ab66-028c6f5f01c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.733 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-28285a99-0933-48f9-aee6-f1e507bcd777
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/28285a99-0933-48f9-aee6-f1e507bcd777.pid.haproxy
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 28285a99-0933-48f9-aee6-f1e507bcd777
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:17:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:05.734 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'env', 'PROCESS_TAG=haproxy-28285a99-0933-48f9-aee6-f1e507bcd777', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/28285a99-0933-48f9-aee6-f1e507bcd777.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.741 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.750 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.754 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799425.7096653, 042b3132-f4d0-455f-a591-712ce4fd2839 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.754 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] VM Paused (Lifecycle Event)
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.784 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.788 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:17:05 compute-0 nova_compute[186544]: 2025-11-22 08:17:05.827 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.009 186548 DEBUG nova.compute.manager [req-ddafe272-cd1c-498c-9a7b-2601823248da req-65c17df1-3200-4713-90dd-b5b4927b1f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.010 186548 DEBUG oslo_concurrency.lockutils [req-ddafe272-cd1c-498c-9a7b-2601823248da req-65c17df1-3200-4713-90dd-b5b4927b1f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.011 186548 DEBUG oslo_concurrency.lockutils [req-ddafe272-cd1c-498c-9a7b-2601823248da req-65c17df1-3200-4713-90dd-b5b4927b1f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.011 186548 DEBUG oslo_concurrency.lockutils [req-ddafe272-cd1c-498c-9a7b-2601823248da req-65c17df1-3200-4713-90dd-b5b4927b1f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.011 186548 DEBUG nova.compute.manager [req-ddafe272-cd1c-498c-9a7b-2601823248da req-65c17df1-3200-4713-90dd-b5b4927b1f68 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Processing event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.061 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:06 compute-0 podman[240230]: 2025-11-22 08:17:06.06836221 +0000 UTC m=+0.021042630 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.183 186548 DEBUG nova.network.neutron [req-d80c6ea1-d440-4ff1-9bf2-2b010f49f99a req-3c549790-f248-4468-95e3-4694b5312d85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updated VIF entry in instance network info cache for port 46205a33-4ef2-438e-9ea8-e4918a0d6e9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.184 186548 DEBUG nova.network.neutron [req-d80c6ea1-d440-4ff1-9bf2-2b010f49f99a req-3c549790-f248-4468-95e3-4694b5312d85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updating instance_info_cache with network_info: [{"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:17:06 compute-0 nova_compute[186544]: 2025-11-22 08:17:06.200 186548 DEBUG oslo_concurrency.lockutils [req-d80c6ea1-d440-4ff1-9bf2-2b010f49f99a req-3c549790-f248-4468-95e3-4694b5312d85 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:17:07 compute-0 podman[240230]: 2025-11-22 08:17:07.310758836 +0000 UTC m=+1.263439256 container create 55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:17:07 compute-0 systemd[1]: Started libpod-conmon-55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3.scope.
Nov 22 08:17:07 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:17:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b699a2d790f67f47bc6c195e1b05787655db1ed29646cc959b56b690863a0fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:17:07 compute-0 podman[240230]: 2025-11-22 08:17:07.824511425 +0000 UTC m=+1.777191905 container init 55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 08:17:07 compute-0 podman[240230]: 2025-11-22 08:17:07.829733174 +0000 UTC m=+1.782413564 container start 55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:17:07 compute-0 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[240245]: [NOTICE]   (240249) : New worker (240251) forked
Nov 22 08:17:07 compute-0 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[240245]: [NOTICE]   (240249) : Loading success.
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.125 186548 DEBUG nova.compute.manager [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.127 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.128 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.128 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.129 186548 DEBUG nova.compute.manager [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No event matching network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b in dict_keys([('network-vif-plugged', '328a2b4f-54d4-42e2-b943-54d2e4ec2bae')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.130 186548 WARNING nova.compute.manager [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received unexpected event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b for instance with vm_state building and task_state spawning.
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.130 186548 DEBUG nova.compute.manager [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.131 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.132 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.132 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.133 186548 DEBUG nova.compute.manager [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Processing event network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.134 186548 DEBUG nova.compute.manager [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.134 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.135 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.136 186548 DEBUG oslo_concurrency.lockutils [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.136 186548 DEBUG nova.compute.manager [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.137 186548 WARNING nova.compute.manager [req-e923413d-b4da-475e-b3ba-d3a5d2ed2055 req-3f572b46-f08e-4153-b297-9494f2441497 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received unexpected event network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae for instance with vm_state building and task_state spawning.
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.138 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.144 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799428.1444418, 042b3132-f4d0-455f-a591-712ce4fd2839 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.145 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] VM Resumed (Lifecycle Event)
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.148 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.157 186548 INFO nova.virt.libvirt.driver [-] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Instance spawned successfully.
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.158 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.167 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.179 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.186 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 46205a33-4ef2-438e-9ea8-e4918a0d6e9b in datapath 206a04da-ce2f-48ff-99c7-e70706547580 unbound from our chassis
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.188 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 206a04da-ce2f-48ff-99c7-e70706547580
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.188 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.190 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.191 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.192 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.193 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.194 186548 DEBUG nova.virt.libvirt.driver [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.203 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0f810d-888a-4503-84f9-b040282ad3eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.204 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap206a04da-c1 in ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.208 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap206a04da-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.208 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b8abf50e-e8a1-4a63-aee8-c6a64f9211e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.210 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e7bf8451-4ff2-498c-a4e3-2bac6981a577]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.222 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.227 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[81a8a456-14a7-4349-9683-3bb7b06ccf9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.245 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8d406ba2-7e47-4137-90ea-b8a7da7fcfb4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.280 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6305cc25-0c68-49cd-917d-b40166d10d02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 NetworkManager[55036]: <info>  [1763799428.2952] manager: (tap206a04da-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/308)
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.296 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e3932844-6a3f-46b5-a8d0-cb3905136481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.336 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb7dc71-b9b2-4b9d-806f-0d0ffbe32f0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.340 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[73ce84b9-3e35-4df1-bd72-d2c2dfbe6c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 NetworkManager[55036]: <info>  [1763799428.3667] device (tap206a04da-c0): carrier: link connected
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.372 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5eaa4498-359d-4fde-ad5d-3a1ac0df7965]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.391 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e95d4917-2fbf-4984-8130-de4a69048a53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap206a04da-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:fc:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616900, 'reachable_time': 21448, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240272, 'error': None, 'target': 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.404 186548 INFO nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Took 14.11 seconds to spawn the instance on the hypervisor.
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.405 186548 DEBUG nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.408 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e50517-782b-4770-b856-2ff6d7dab605]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:fcb7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 616900, 'tstamp': 616900}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240273, 'error': None, 'target': 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.428 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4fc9a2-947f-4f24-893b-afe3ff09e9a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap206a04da-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:fc:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616900, 'reachable_time': 21448, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240274, 'error': None, 'target': 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.464 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[532e02e0-3d7e-4754-a741-77477d42b20b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.508 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[98102f7d-d08a-4d68-bf29-64901f7a0042]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.509 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap206a04da-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.510 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.510 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap206a04da-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.512 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:08 compute-0 NetworkManager[55036]: <info>  [1763799428.5127] manager: (tap206a04da-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Nov 22 08:17:08 compute-0 kernel: tap206a04da-c0: entered promiscuous mode
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.515 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap206a04da-c0, col_values=(('external_ids', {'iface-id': '9e35917e-9d7c-4228-9823-4967f6df52f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:17:08 compute-0 ovn_controller[94843]: 2025-11-22T08:17:08Z|00668|binding|INFO|Releasing lport 9e35917e-9d7c-4228-9823-4967f6df52f0 from this chassis (sb_readonly=0)
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.530 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.531 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/206a04da-ce2f-48ff-99c7-e70706547580.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/206a04da-ce2f-48ff-99c7-e70706547580.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.532 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b03a8120-5619-4efa-a05d-c53e4e7e3d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.533 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-206a04da-ce2f-48ff-99c7-e70706547580
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/206a04da-ce2f-48ff-99c7-e70706547580.pid.haproxy
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 206a04da-ce2f-48ff-99c7-e70706547580
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:17:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:08.534 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'env', 'PROCESS_TAG=haproxy-206a04da-ce2f-48ff-99c7-e70706547580', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/206a04da-ce2f-48ff-99c7-e70706547580.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.614 186548 INFO nova.compute.manager [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Took 14.79 seconds to build instance.
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.681 186548 DEBUG oslo_concurrency.lockutils [None req-29b88054-f49e-4ff4-9057-3cc5952932ed 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.960s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:08 compute-0 podman[240305]: 2025-11-22 08:17:08.854754391 +0000 UTC m=+0.021611876 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:17:08 compute-0 nova_compute[186544]: 2025-11-22 08:17:08.991 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:09 compute-0 podman[240305]: 2025-11-22 08:17:09.518775611 +0000 UTC m=+0.685633086 container create cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:17:09 compute-0 systemd[1]: Started libpod-conmon-cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9.scope.
Nov 22 08:17:09 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:17:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc5c1ccef575014a85809801db318ddb91049dbba867fb51a31d06b8ddb05e8c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:17:09 compute-0 podman[240305]: 2025-11-22 08:17:09.922862972 +0000 UTC m=+1.089720467 container init cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:17:09 compute-0 podman[240305]: 2025-11-22 08:17:09.929793033 +0000 UTC m=+1.096650508 container start cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:17:09 compute-0 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[240321]: [NOTICE]   (240325) : New worker (240327) forked
Nov 22 08:17:09 compute-0 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[240321]: [NOTICE]   (240325) : Loading success.
Nov 22 08:17:11 compute-0 nova_compute[186544]: 2025-11-22 08:17:11.063 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:12 compute-0 nova_compute[186544]: 2025-11-22 08:17:12.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:12 compute-0 nova_compute[186544]: 2025-11-22 08:17:12.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:17:13 compute-0 nova_compute[186544]: 2025-11-22 08:17:13.993 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:14 compute-0 ovn_controller[94843]: 2025-11-22T08:17:14Z|00669|binding|INFO|Releasing lport 44730bfb-6390-4f63-a416-89f912674e46 from this chassis (sb_readonly=0)
Nov 22 08:17:14 compute-0 ovn_controller[94843]: 2025-11-22T08:17:14Z|00670|binding|INFO|Releasing lport 9e35917e-9d7c-4228-9823-4967f6df52f0 from this chassis (sb_readonly=0)
Nov 22 08:17:14 compute-0 nova_compute[186544]: 2025-11-22 08:17:14.830 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:14 compute-0 NetworkManager[55036]: <info>  [1763799434.8312] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Nov 22 08:17:14 compute-0 NetworkManager[55036]: <info>  [1763799434.8333] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Nov 22 08:17:14 compute-0 ovn_controller[94843]: 2025-11-22T08:17:14Z|00671|binding|INFO|Releasing lport 44730bfb-6390-4f63-a416-89f912674e46 from this chassis (sb_readonly=0)
Nov 22 08:17:14 compute-0 ovn_controller[94843]: 2025-11-22T08:17:14Z|00672|binding|INFO|Releasing lport 9e35917e-9d7c-4228-9823-4967f6df52f0 from this chassis (sb_readonly=0)
Nov 22 08:17:14 compute-0 nova_compute[186544]: 2025-11-22 08:17:14.890 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:14 compute-0 nova_compute[186544]: 2025-11-22 08:17:14.899 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:16 compute-0 nova_compute[186544]: 2025-11-22 08:17:16.064 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:17 compute-0 nova_compute[186544]: 2025-11-22 08:17:17.175 186548 DEBUG nova.compute.manager [req-21045df4-47e8-4691-ae9b-dad142df0f40 req-495b6138-628a-4b8c-b205-d4bfac70eb4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-changed-328a2b4f-54d4-42e2-b943-54d2e4ec2bae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:17:17 compute-0 nova_compute[186544]: 2025-11-22 08:17:17.176 186548 DEBUG nova.compute.manager [req-21045df4-47e8-4691-ae9b-dad142df0f40 req-495b6138-628a-4b8c-b205-d4bfac70eb4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Refreshing instance network info cache due to event network-changed-328a2b4f-54d4-42e2-b943-54d2e4ec2bae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:17:17 compute-0 nova_compute[186544]: 2025-11-22 08:17:17.176 186548 DEBUG oslo_concurrency.lockutils [req-21045df4-47e8-4691-ae9b-dad142df0f40 req-495b6138-628a-4b8c-b205-d4bfac70eb4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:17:17 compute-0 nova_compute[186544]: 2025-11-22 08:17:17.177 186548 DEBUG oslo_concurrency.lockutils [req-21045df4-47e8-4691-ae9b-dad142df0f40 req-495b6138-628a-4b8c-b205-d4bfac70eb4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:17:17 compute-0 nova_compute[186544]: 2025-11-22 08:17:17.177 186548 DEBUG nova.network.neutron [req-21045df4-47e8-4691-ae9b-dad142df0f40 req-495b6138-628a-4b8c-b205-d4bfac70eb4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Refreshing network info cache for port 328a2b4f-54d4-42e2-b943-54d2e4ec2bae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:17:18 compute-0 nova_compute[186544]: 2025-11-22 08:17:18.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:21 compute-0 nova_compute[186544]: 2025-11-22 08:17:21.066 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:21 compute-0 podman[240340]: 2025-11-22 08:17:21.420372218 +0000 UTC m=+0.064906374 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:17:21 compute-0 podman[240341]: 2025-11-22 08:17:21.431453062 +0000 UTC m=+0.069423155 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:17:21 compute-0 podman[240342]: 2025-11-22 08:17:21.431867633 +0000 UTC m=+0.067696704 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:17:21 compute-0 podman[240343]: 2025-11-22 08:17:21.465206245 +0000 UTC m=+0.101570979 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:17:22 compute-0 nova_compute[186544]: 2025-11-22 08:17:22.966 186548 DEBUG nova.network.neutron [req-21045df4-47e8-4691-ae9b-dad142df0f40 req-495b6138-628a-4b8c-b205-d4bfac70eb4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updated VIF entry in instance network info cache for port 328a2b4f-54d4-42e2-b943-54d2e4ec2bae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:17:22 compute-0 nova_compute[186544]: 2025-11-22 08:17:22.967 186548 DEBUG nova.network.neutron [req-21045df4-47e8-4691-ae9b-dad142df0f40 req-495b6138-628a-4b8c-b205-d4bfac70eb4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updating instance_info_cache with network_info: [{"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:17:23 compute-0 nova_compute[186544]: 2025-11-22 08:17:23.336 186548 DEBUG oslo_concurrency.lockutils [req-21045df4-47e8-4691-ae9b-dad142df0f40 req-495b6138-628a-4b8c-b205-d4bfac70eb4f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:17:23 compute-0 nova_compute[186544]: 2025-11-22 08:17:23.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:26 compute-0 nova_compute[186544]: 2025-11-22 08:17:26.069 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:28 compute-0 nova_compute[186544]: 2025-11-22 08:17:28.999 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:31 compute-0 nova_compute[186544]: 2025-11-22 08:17:31.072 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:32 compute-0 podman[240437]: 2025-11-22 08:17:32.439091958 +0000 UTC m=+0.085189646 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:17:34 compute-0 nova_compute[186544]: 2025-11-22 08:17:34.000 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:35 compute-0 podman[240462]: 2025-11-22 08:17:35.452588418 +0000 UTC m=+0.098101544 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:17:35 compute-0 podman[240463]: 2025-11-22 08:17:35.45992237 +0000 UTC m=+0.098103904 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 08:17:35 compute-0 ovn_controller[94843]: 2025-11-22T08:17:35Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:97:79 10.100.0.4
Nov 22 08:17:35 compute-0 ovn_controller[94843]: 2025-11-22T08:17:35Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:97:79 10.100.0.4
Nov 22 08:17:36 compute-0 nova_compute[186544]: 2025-11-22 08:17:36.073 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:37.344 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:17:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:37.344 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:17:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:37.345 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:17:39 compute-0 nova_compute[186544]: 2025-11-22 08:17:39.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:41 compute-0 nova_compute[186544]: 2025-11-22 08:17:41.076 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:44 compute-0 nova_compute[186544]: 2025-11-22 08:17:44.004 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:44 compute-0 ovn_controller[94843]: 2025-11-22T08:17:44Z|00673|memory_trim|INFO|Detected inactivity (last active 30022 ms ago): trimming memory
Nov 22 08:17:46 compute-0 nova_compute[186544]: 2025-11-22 08:17:46.079 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:49 compute-0 nova_compute[186544]: 2025-11-22 08:17:49.007 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:51 compute-0 nova_compute[186544]: 2025-11-22 08:17:51.082 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:52 compute-0 systemd[1]: Starting dnf makecache...
Nov 22 08:17:52 compute-0 podman[240512]: 2025-11-22 08:17:52.42920211 +0000 UTC m=+0.060015334 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:17:52 compute-0 podman[240511]: 2025-11-22 08:17:52.4292112 +0000 UTC m=+0.062624357 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:17:52 compute-0 podman[240510]: 2025-11-22 08:17:52.429363654 +0000 UTC m=+0.069160759 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:17:52 compute-0 podman[240516]: 2025-11-22 08:17:52.456202668 +0000 UTC m=+0.082535160 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 22 08:17:52 compute-0 dnf[240519]: Metadata cache refreshed recently.
Nov 22 08:17:52 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 22 08:17:52 compute-0 systemd[1]: Finished dnf makecache.
Nov 22 08:17:54 compute-0 nova_compute[186544]: 2025-11-22 08:17:54.009 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:56 compute-0 nova_compute[186544]: 2025-11-22 08:17:56.084 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:58.506 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:17:58 compute-0 nova_compute[186544]: 2025-11-22 08:17:58.507 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:17:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:17:58.508 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:17:59 compute-0 nova_compute[186544]: 2025-11-22 08:17:59.012 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:01 compute-0 nova_compute[186544]: 2025-11-22 08:18:01.087 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:03 compute-0 nova_compute[186544]: 2025-11-22 08:18:03.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:03 compute-0 nova_compute[186544]: 2025-11-22 08:18:03.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:03 compute-0 nova_compute[186544]: 2025-11-22 08:18:03.162 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:18:03 compute-0 nova_compute[186544]: 2025-11-22 08:18:03.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:18:03 compute-0 podman[240597]: 2025-11-22 08:18:03.426372977 +0000 UTC m=+0.070369260 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 08:18:03 compute-0 nova_compute[186544]: 2025-11-22 08:18:03.534 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:18:03 compute-0 nova_compute[186544]: 2025-11-22 08:18:03.535 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:18:03 compute-0 nova_compute[186544]: 2025-11-22 08:18:03.536 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:18:03 compute-0 nova_compute[186544]: 2025-11-22 08:18:03.536 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 042b3132-f4d0-455f-a591-712ce4fd2839 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:18:04 compute-0 nova_compute[186544]: 2025-11-22 08:18:04.014 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:05.510 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:18:06 compute-0 nova_compute[186544]: 2025-11-22 08:18:06.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:06 compute-0 podman[240618]: 2025-11-22 08:18:06.410676545 +0000 UTC m=+0.059598253 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350)
Nov 22 08:18:06 compute-0 podman[240617]: 2025-11-22 08:18:06.425564723 +0000 UTC m=+0.075972898 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.007 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updating instance_info_cache with network_info: [{"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.038 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.038 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.039 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.039 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.039 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.039 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.060 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.061 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.061 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.061 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.133 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.190 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.191 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.265 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.432 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.433 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5534MB free_disk=73.10802841186523GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.433 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.434 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.506 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 042b3132-f4d0-455f-a591-712ce4fd2839 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.507 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.507 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.579 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.603 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.626 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:18:07 compute-0 nova_compute[186544]: 2025-11-22 08:18:07.627 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:08 compute-0 nova_compute[186544]: 2025-11-22 08:18:08.751 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:09 compute-0 nova_compute[186544]: 2025-11-22 08:18:09.016 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:10 compute-0 nova_compute[186544]: 2025-11-22 08:18:10.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:11 compute-0 nova_compute[186544]: 2025-11-22 08:18:11.090 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:13 compute-0 nova_compute[186544]: 2025-11-22 08:18:13.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:14 compute-0 nova_compute[186544]: 2025-11-22 08:18:14.019 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:14 compute-0 nova_compute[186544]: 2025-11-22 08:18:14.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:16 compute-0 nova_compute[186544]: 2025-11-22 08:18:16.093 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:18 compute-0 nova_compute[186544]: 2025-11-22 08:18:18.183 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:19 compute-0 nova_compute[186544]: 2025-11-22 08:18:19.021 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:21 compute-0 nova_compute[186544]: 2025-11-22 08:18:21.095 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:22 compute-0 nova_compute[186544]: 2025-11-22 08:18:22.009 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:23 compute-0 podman[240668]: 2025-11-22 08:18:23.416080659 +0000 UTC m=+0.063385477 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:18:23 compute-0 podman[240669]: 2025-11-22 08:18:23.426159218 +0000 UTC m=+0.065862768 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:18:23 compute-0 podman[240667]: 2025-11-22 08:18:23.441460366 +0000 UTC m=+0.089349488 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 08:18:23 compute-0 podman[240670]: 2025-11-22 08:18:23.461100291 +0000 UTC m=+0.097796647 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 22 08:18:24 compute-0 nova_compute[186544]: 2025-11-22 08:18:24.023 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:26 compute-0 nova_compute[186544]: 2025-11-22 08:18:26.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:28 compute-0 nova_compute[186544]: 2025-11-22 08:18:28.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:18:29 compute-0 nova_compute[186544]: 2025-11-22 08:18:29.026 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:31 compute-0 nova_compute[186544]: 2025-11-22 08:18:31.099 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:34 compute-0 nova_compute[186544]: 2025-11-22 08:18:34.027 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:34 compute-0 podman[240750]: 2025-11-22 08:18:34.435054791 +0000 UTC m=+0.069907926 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:18:36 compute-0 nova_compute[186544]: 2025-11-22 08:18:36.100 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.600 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'name': 'tempest-TestGettingAddress-server-332737681', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000093', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.600 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.626 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.write.requests volume: 348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.627 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b3c6ef0-d5a7-468e-a0dc-0c24e96bb45a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 348, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.601102', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7f0c816-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': 'beb49f6f2ddc6979907e420b7139876fedb0dab46899c29a79cb2af940afc2dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.601102', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7f0d5d6-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': 'a972587212c2fb736fb94ea8afa9d725c0344e586a8a832853c708877fc2251d'}]}, 'timestamp': '2025-11-22 08:18:36.627666', '_unique_id': '5fbb4802a6cd422e978ff321584921bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.628 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.629 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.read.bytes volume: 31541760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.629 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6207fa5-41c7-4df5-9e4d-0db9aa30500c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31541760, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.629615', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7f12f90-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': 'e447078c2547e30ba16afe84c5021c85b3512bce0963a9de2fcc010a998fb7cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.629615', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7f13bc0-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': 'c5a5a6d121a588662fbba122c2526454e910a8dc36b8dc6e406c097b7b21159d'}]}, 'timestamp': '2025-11-22 08:18:36.630260', '_unique_id': 'f73f0b1dda284ba88f06364bef34e0cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.631 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.632 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.write.bytes volume: 73072640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.632 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2d8c837-31c1-420f-9313-17471bff474e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73072640, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.632106', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7f1917e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': '980753bc2d6ed36453c2351e68ca45f90ae2a2a07036a1ea5bf5f09335804375'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.632106', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7f19d90-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': 'd46431a5c4ce924f775ba191061b7d503176e1ee30f312662c9773feeb201667'}]}, 'timestamp': '2025-11-22 08:18:36.632760', '_unique_id': 'a970bded2bdd4ae393bbb96a8414406a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.634 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.637 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 042b3132-f4d0-455f-a591-712ce4fd2839 / tap328a2b4f-54 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.637 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 042b3132-f4d0-455f-a591-712ce4fd2839 / tap46205a33-4e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.638 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.638 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b61da383-e172-4162-915b-541ec37fcf17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.634426', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7f27c06-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'e665bab3a47d82376f526f881dc6e8db174edda81e1b2f042c36d5f60c45e52a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.634426', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7f286f6-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'c260e559ac2b09067747ad0d9616cee719f0c1d30e2f836dd50b5de0035070a9'}]}, 'timestamp': '2025-11-22 08:18:36.638733', '_unique_id': 'e3d9a84db57045c8886215f442572d98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.640 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.640 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.640 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-332737681>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-332737681>]
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.640 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.640 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.641 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6c90acb-32e4-4ee6-b636-b1a99b556942', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.640781', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7f2e556-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '6bae67a007639f44066e8264d94b05107030615fbd5300ac0942e14f7e6126ce'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.640781', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7f2f230-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '8e9c069a33ae18b0ccae26d4d7e80a5001f49729a48a829d8e17e8a2af1768f6'}]}, 'timestamp': '2025-11-22 08:18:36.641492', '_unique_id': 'bc0a65f22f6242859a7274a0c6fdc8c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.642 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.643 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.657 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/cpu volume: 14680000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cb3f707-e024-42e4-a108-93761fa176ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14680000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'timestamp': '2025-11-22T08:18:36.643100', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd7f573a2-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.349116383, 'message_signature': '2042b398cdfc71ab8e9596662ed754f33227ae9d7109446593ce51a4178afc61'}]}, 'timestamp': '2025-11-22 08:18:36.657962', '_unique_id': '055b72f3e71f47529b608d10610a5121'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.659 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.659 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.write.latency volume: 192736785798 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.659 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd52371ba-a01b-4808-80e7-267906332a6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 192736785798, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.659553', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7f5c078-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': '70ffaa54684f19e1547b5a21cf9eaa8c15838a532cc1dccd7ba48ef9dd15d604'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.659553', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7f5ccee-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': 'bda8431dc6d97776842c7b3c72fd71f7909762b7b5e371742d98d09fcb8fa877'}]}, 'timestamp': '2025-11-22 08:18:36.660188', '_unique_id': '6d7db94387514c9a9514dd70fcb40396'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.661 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.662 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.662 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.packets.drop volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f83a8e8-708d-4016-9877-e626edfef662', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.661979', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7f61fa0-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '2ed113733d4b9ff7cd5a5937abcf4564be2d94123a0d4fb6398f2fd853564487'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.661979', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7f62d1a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'e682c4206bca464b86e756e5e8de5c6561b0894aeb01ae2eb808cfd22c147a5b'}]}, 'timestamp': '2025-11-22 08:18:36.662658', '_unique_id': '66219fe2572443a28f497e6eaaea04fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.664 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.664 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.read.latency volume: 3214808948 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.664 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.read.latency volume: 290118043 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6285d151-a620-457e-acfa-37e1e4cecdc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3214808948, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.664462', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7f6803a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': '5ae7a5366b93ec0ddd71a411a060b9955677d105b0ecee99607c31ac19bfc630'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 290118043, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.664462', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7f68cba-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': 'ec132656335c85559bcc56891590424c2c19a6fead36f6ed5c97bd9c64c572ae'}]}, 'timestamp': '2025-11-22 08:18:36.666299', '_unique_id': 'c8bdb69a7cd346548c3acd838aa741b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.668 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.668 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.bytes volume: 29731 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.668 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.bytes volume: 2146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4a87e3e-4166-47a4-acaf-31376893e9d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29731, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.668103', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7f70f1e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '17a5e4029c07c902b649de8662186899cf415aa7db033556639bbb59d1b55c79'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2146, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.668103', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7f71b62-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '5837731a05383f6de28051f2a8c07f2805cd1d905961abd5f20dd623b657932b'}]}, 'timestamp': '2025-11-22 08:18:36.668783', '_unique_id': 'ff43889338bc473eb4b611552ae6b7fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.670 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.670 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/memory.usage volume: 43.42578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80efa68d-1d09-4e30-8745-afc84cb3d0b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.42578125, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'timestamp': '2025-11-22T08:18:36.670292', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd7f763ec-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.349116383, 'message_signature': 'efe57f61b4a8891a3c605b330ecd28287e7177abdfe5c9eb9b224798e5e0955f'}]}, 'timestamp': '2025-11-22 08:18:36.670609', '_unique_id': '014ea05de76346b6abca7065b2e17110'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.672 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.672 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29497846-ad8e-4105-8f1a-e53cd2f74d17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.672240', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7f7b0a4-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '864d073ec45614c402a4f1c611d9a06dfde3b182a641e49bd62fed5eeff9d512'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.672240', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7f7bda6-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'b942dc221e298e103411ac03972da0b60f229e67699437e9b2d399b348f51c2c'}]}, 'timestamp': '2025-11-22 08:18:36.672912', '_unique_id': '1729b2026dac47098ec5b86c04c59b8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.674 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.674 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-332737681>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-332737681>]
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.674 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.674 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-332737681>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-332737681>]
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.686 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.686 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4d902c3-8b32-43a1-ab46-fc1376993037', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.675452', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7f9dd02-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.367257381, 'message_signature': '7d180c42e93604998a6243fa3c00f65a0278c9b7e91e3a1d18ce9a0761118496'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.675452', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7f9e9d2-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.367257381, 'message_signature': 'ecd00b6b4408f78d520ce4b3930e28fd0dd2422aa1db5d4001c5ff14fad445b2'}]}, 'timestamp': '2025-11-22 08:18:36.687131', '_unique_id': 'e4fd4de117854f3bbeef2ba5391df84f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.688 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6972ddf1-1ce9-4b31-b2cb-d9d2048c0151', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.688858', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7fa3798-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'e63a96dcc11e226d24e20b31f8766a596b53d52fb5aea600cea6ef318b3bde5a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.688858', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7fa3fc2-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '683ad705c2677f37bf2a7fb6145eae2d99dde4b192b69c694927a276840e94de'}]}, 'timestamp': '2025-11-22 08:18:36.689326', '_unique_id': 'da6800a669834292aff09ce4692fdf2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.690 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.bytes volume: 27110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.690 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.bytes volume: 3486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fad3b9b9-ab40-4d9c-9b31-ea87106b0f53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27110, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.690711', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7fa7fa0-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'd4de2b043943bdef7d62dd7d528847f8ff13b617d62265872c77d28e52ad8a03'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3486, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.690711', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7fa87e8-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '7209ad0933f118a64edfbe038a745ff9bb70fef5401a2b8067aa39dda30f7c56'}]}, 'timestamp': '2025-11-22 08:18:36.691142', '_unique_id': '70ccd1b2110449ba97ca4811587e4feb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.692 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.692 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-332737681>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-332737681>]
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.692 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.692 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47a62856-1e76-4d59-88bf-c0c110219570', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.692555', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7fac87a-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.367257381, 'message_signature': 'bfacb2bb2a1f95cbf2173119557458e46d410bbfe888cf73dddc3365ce9b104d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.692555', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7fad068-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.367257381, 'message_signature': '7423b5dce0ee09dab754370d6dea3edd9e0cfa14c9acc5714906e0b2793b3687'}]}, 'timestamp': '2025-11-22 08:18:36.692993', '_unique_id': '645135b150334bf09e7dc50b2f27be73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.694 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.packets volume: 172 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.694 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19af514e-b530-4990-91dc-80b9a3bb0ab7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 172, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.694096', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7fb04d4-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'bfc7ed4294a2e40d0c627a2bd72923edbaef441e67f4f66a462c608144df552c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.694096', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7fb0f10-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '37091d06a71ea6547cad7da07522426a2ae9597c698c5e885ffff322ebca9a29'}]}, 'timestamp': '2025-11-22 08:18:36.694616', '_unique_id': '6b69914b12ca48f4831b0cfebd4cdfcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.695 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '236a674e-3b77-4ecb-970c-7196722b5607', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.695852', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7fb487c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'd7854d3bcd06b9fbbdc6b46592df9a9d0e89089debb3978f3931859309ca7a5d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.695852', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7fb50c4-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '036d620f5ccdebaed6378410f0e2829c99b30798358eb36ac8bb90cb9502864d'}]}, 'timestamp': '2025-11-22 08:18:36.696307', '_unique_id': '3f4cf97c4eb94b729a721bf743a5fd2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.697 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.697 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd34cba31-e34f-4fd0-99ac-7c9d640896ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.697552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7fb8abc-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.367257381, 'message_signature': '55ea4323f1bc99674d519c4f94c8462db1307c71e3a4ab4f55d05e02a81021dd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.697552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7fb928c-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.367257381, 'message_signature': '4ed0213aac1c060cb4f3d5b29f48c1d87b9efed289d94d0eaf28474fdc923310'}]}, 'timestamp': '2025-11-22 08:18:36.697964', '_unique_id': '3e420082806940f9be0db622601b82c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.698 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.699 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.read.requests volume: 1173 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.699 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5be4d207-f360-445d-9c97-6dd1b26b8a8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1173, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-vda', 'timestamp': '2025-11-22T08:18:36.699052', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd7fbc55e-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': 'bb899aff52f6eb0fb39e66cb9e810832b8d2aaab1986913d40ffb19a0433c592'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '042b3132-f4d0-455f-a591-712ce4fd2839-sda', 'timestamp': '2025-11-22T08:18:36.699052', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'instance-00000093', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd7fbce96-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.292883894, 'message_signature': '636db762970cd41eac2f4c0156a2882d3dd656c044627e2dd14c5ea63fe83504'}]}, 'timestamp': '2025-11-22 08:18:36.699505', '_unique_id': '9861578554bb4b1d9f39db1139612d65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.packets volume: 166 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.700 12 DEBUG ceilometer.compute.pollsters [-] 042b3132-f4d0-455f-a591-712ce4fd2839/network.incoming.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b388e34c-56f9-45ea-b177-2792e384a730', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 166, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap328a2b4f-54', 'timestamp': '2025-11-22T08:18:36.700670', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap328a2b4f-54', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:97:79', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap328a2b4f-54'}, 'message_id': 'd7fc04c4-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': 'd41e9a99100b6979129fd7d60e9ab7048cacc2227e5cb4a6907c87ec20272010'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 'user_id': 
'809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-00000093-042b3132-f4d0-455f-a591-712ce4fd2839-tap46205a33-4e', 'timestamp': '2025-11-22T08:18:36.700670', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-332737681', 'name': 'tap46205a33-4e', 'instance_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:20:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap46205a33-4e'}, 'message_id': 'd7fc0cbc-c77b-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6257.326233587, 'message_signature': '680eb82a340fe6fedc035ab7f20587534b8b7d972adc5e9abb6dd320dc265582'}]}, 'timestamp': '2025-11-22 08:18:36.701118', '_unique_id': '1de92c003a16476ba3d26cb3d7306047'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:18:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:18:36.701 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:18:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:37.344 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:37.345 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:37.345 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:37 compute-0 podman[240770]: 2025-11-22 08:18:37.400967826 +0000 UTC m=+0.052834596 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:18:37 compute-0 podman[240771]: 2025-11-22 08:18:37.412430839 +0000 UTC m=+0.060122976 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41)
Nov 22 08:18:39 compute-0 nova_compute[186544]: 2025-11-22 08:18:39.029 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:41 compute-0 nova_compute[186544]: 2025-11-22 08:18:41.103 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.210 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.211 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.211 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.211 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.212 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.221 186548 INFO nova.compute.manager [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Terminating instance
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.227 186548 DEBUG nova.compute.manager [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:18:42 compute-0 kernel: tap328a2b4f-54 (unregistering): left promiscuous mode
Nov 22 08:18:42 compute-0 NetworkManager[55036]: <info>  [1763799522.2620] device (tap328a2b4f-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.271 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00674|binding|INFO|Releasing lport 328a2b4f-54d4-42e2-b943-54d2e4ec2bae from this chassis (sb_readonly=0)
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00675|binding|INFO|Setting lport 328a2b4f-54d4-42e2-b943-54d2e4ec2bae down in Southbound
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00676|binding|INFO|Removing iface tap328a2b4f-54 ovn-installed in OVS
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.276 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.289 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:97:79 10.100.0.4'], port_security=['fa:16:3e:82:97:79 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28285a99-0933-48f9-aee6-f1e507bcd777', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ce7fc5f-5ca9-4729-bcf9-4866d6397f92, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=328a2b4f-54d4-42e2-b943-54d2e4ec2bae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.289 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.290 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 328a2b4f-54d4-42e2-b943-54d2e4ec2bae in datapath 28285a99-0933-48f9-aee6-f1e507bcd777 unbound from our chassis
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.291 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28285a99-0933-48f9-aee6-f1e507bcd777, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.293 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8b117c3c-8213-481e-90af-00907669fff6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.294 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 namespace which is not needed anymore
Nov 22 08:18:42 compute-0 kernel: tap46205a33-4e (unregistering): left promiscuous mode
Nov 22 08:18:42 compute-0 NetworkManager[55036]: <info>  [1763799522.2999] device (tap46205a33-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00677|binding|INFO|Releasing lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b from this chassis (sb_readonly=0)
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00678|binding|INFO|Setting lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b down in Southbound
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00679|binding|INFO|Removing iface tap46205a33-4e ovn-installed in OVS
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.307 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.314 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a'], port_security=['fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed5:200a/64', 'neutron:device_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-206a04da-ce2f-48ff-99c7-e70706547580', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=450e32a6-ae0a-4cd4-b338-c697096c146f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=46205a33-4ef2-438e-9ea8-e4918a0d6e9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.332 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000093.scope: Deactivated successfully.
Nov 22 08:18:42 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000093.scope: Consumed 19.277s CPU time.
Nov 22 08:18:42 compute-0 systemd-machined[152872]: Machine qemu-79-instance-00000093 terminated.
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[240245]: [NOTICE]   (240249) : haproxy version is 2.8.14-c23fe91
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[240245]: [NOTICE]   (240249) : path to executable is /usr/sbin/haproxy
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[240245]: [WARNING]  (240249) : Exiting Master process...
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[240245]: [ALERT]    (240249) : Current worker (240251) exited with code 143 (Terminated)
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777[240245]: [WARNING]  (240249) : All workers exited. Exiting... (0)
Nov 22 08:18:42 compute-0 systemd[1]: libpod-55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3.scope: Deactivated successfully.
Nov 22 08:18:42 compute-0 podman[240840]: 2025-11-22 08:18:42.420499681 +0000 UTC m=+0.044260074 container died 55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 08:18:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3-userdata-shm.mount: Deactivated successfully.
Nov 22 08:18:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-9b699a2d790f67f47bc6c195e1b05787655db1ed29646cc959b56b690863a0fc-merged.mount: Deactivated successfully.
Nov 22 08:18:42 compute-0 NetworkManager[55036]: <info>  [1763799522.4513] manager: (tap328a2b4f-54): new Tun device (/org/freedesktop/NetworkManager/Devices/312)
Nov 22 08:18:42 compute-0 podman[240840]: 2025-11-22 08:18:42.457560386 +0000 UTC m=+0.081320749 container cleanup 55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:18:42 compute-0 NetworkManager[55036]: <info>  [1763799522.4625] manager: (tap46205a33-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/313)
Nov 22 08:18:42 compute-0 systemd-udevd[240816]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:18:42 compute-0 kernel: tap46205a33-4e: entered promiscuous mode
Nov 22 08:18:42 compute-0 kernel: tap46205a33-4e (unregistering): left promiscuous mode
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00680|binding|INFO|Claiming lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b for this chassis.
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00681|binding|INFO|46205a33-4ef2-438e-9ea8-e4918a0d6e9b: Claiming fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.469 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.477 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a'], port_security=['fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed5:200a/64', 'neutron:device_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-206a04da-ce2f-48ff-99c7-e70706547580', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=450e32a6-ae0a-4cd4-b338-c697096c146f, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=46205a33-4ef2-438e-9ea8-e4918a0d6e9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00682|binding|INFO|Setting lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b ovn-installed in OVS
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00683|binding|INFO|Setting lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b up in Southbound
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00684|binding|INFO|Releasing lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b from this chassis (sb_readonly=1)
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.488 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00685|if_status|INFO|Dropped 4 log messages in last 181 seconds (most recently, 181 seconds ago) due to excessive rate
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00686|if_status|INFO|Not setting lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b down as sb is readonly
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00687|binding|INFO|Removing iface tap46205a33-4e ovn-installed in OVS
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00688|binding|INFO|Releasing lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b from this chassis (sb_readonly=0)
Nov 22 08:18:42 compute-0 ovn_controller[94843]: 2025-11-22T08:18:42Z|00689|binding|INFO|Setting lport 46205a33-4ef2-438e-9ea8-e4918a0d6e9b down in Southbound
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.491 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 systemd[1]: libpod-conmon-55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3.scope: Deactivated successfully.
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.501 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.507 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a'], port_security=['fa:16:3e:d5:20:0a 2001:db8::f816:3eff:fed5:200a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed5:200a/64', 'neutron:device_id': '042b3132-f4d0-455f-a591-712ce4fd2839', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-206a04da-ce2f-48ff-99c7-e70706547580', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2f853bc3-cae6-48c5-838f-5d956d1719f7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=450e32a6-ae0a-4cd4-b338-c697096c146f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=46205a33-4ef2-438e-9ea8-e4918a0d6e9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.520 186548 DEBUG nova.compute.manager [req-d2cc6548-815d-44b9-9983-2e3830aa769f req-b8227c4c-de91-4d36-91e0-a0551b3eac42 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-unplugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.521 186548 DEBUG oslo_concurrency.lockutils [req-d2cc6548-815d-44b9-9983-2e3830aa769f req-b8227c4c-de91-4d36-91e0-a0551b3eac42 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.522 186548 DEBUG oslo_concurrency.lockutils [req-d2cc6548-815d-44b9-9983-2e3830aa769f req-b8227c4c-de91-4d36-91e0-a0551b3eac42 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.522 186548 DEBUG oslo_concurrency.lockutils [req-d2cc6548-815d-44b9-9983-2e3830aa769f req-b8227c4c-de91-4d36-91e0-a0551b3eac42 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.522 186548 DEBUG nova.compute.manager [req-d2cc6548-815d-44b9-9983-2e3830aa769f req-b8227c4c-de91-4d36-91e0-a0551b3eac42 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-unplugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.522 186548 DEBUG nova.compute.manager [req-d2cc6548-815d-44b9-9983-2e3830aa769f req-b8227c4c-de91-4d36-91e0-a0551b3eac42 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-unplugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.526 186548 INFO nova.virt.libvirt.driver [-] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Instance destroyed successfully.
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.527 186548 DEBUG nova.objects.instance [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 042b3132-f4d0-455f-a591-712ce4fd2839 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.541 186548 DEBUG nova.virt.libvirt.vif [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-332737681',display_name='tempest-TestGettingAddress-server-332737681',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-332737681',id=147,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:17:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-fgzl09bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:17:08Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=042b3132-f4d0-455f-a591-712ce4fd2839,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:18:42 compute-0 podman[240882]: 2025-11-22 08:18:42.541558931 +0000 UTC m=+0.056356883 container remove 55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.541 186548 DEBUG nova.network.os_vif_util [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "address": "fa:16:3e:82:97:79", "network": {"id": "28285a99-0933-48f9-aee6-f1e507bcd777", "bridge": "br-int", "label": "tempest-network-smoke--85194758", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap328a2b4f-54", "ovs_interfaceid": "328a2b4f-54d4-42e2-b943-54d2e4ec2bae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.542 186548 DEBUG nova.network.os_vif_util [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:97:79,bridge_name='br-int',has_traffic_filtering=True,id=328a2b4f-54d4-42e2-b943-54d2e4ec2bae,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap328a2b4f-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.542 186548 DEBUG os_vif [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:97:79,bridge_name='br-int',has_traffic_filtering=True,id=328a2b4f-54d4-42e2-b943-54d2e4ec2bae,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap328a2b4f-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.545 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.545 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap328a2b4f-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.547 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.548 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9ce825-82fe-4d77-91d4-131f69b7d63d]: (4, ('Sat Nov 22 08:18:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 (55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3)\n55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3\nSat Nov 22 08:18:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 (55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3)\n55d6e9d6f56574009abd5b1b99f0d3575081433020b87ce61308b0252cb7c4a3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.550 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.551 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[580af4ff-3af4-45a8-a39e-65686c759f0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.552 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.552 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28285a99-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.554 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 kernel: tap28285a99-00: left promiscuous mode
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.555 186548 INFO os_vif [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:97:79,bridge_name='br-int',has_traffic_filtering=True,id=328a2b4f-54d4-42e2-b943-54d2e4ec2bae,network=Network(28285a99-0933-48f9-aee6-f1e507bcd777),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap328a2b4f-54')
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.556 186548 DEBUG nova.virt.libvirt.vif [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-332737681',display_name='tempest-TestGettingAddress-server-332737681',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-332737681',id=147,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM4eWB6S8gF3+vAQNfWODRrXJ2TSEmW43skqR+J/UySpPtPQ+ovw0XjJfGr33wuxAxwi/2V7+yN1aEcDFfs9GT9vSaMpY282CNYDuBhDuhcnpLdM1GTSDhOlEpnjVOI3fA==',key_name='tempest-TestGettingAddress-444639012',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:17:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-fgzl09bb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:17:08Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=042b3132-f4d0-455f-a591-712ce4fd2839,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.556 186548 DEBUG nova.network.os_vif_util [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.557 186548 DEBUG nova.network.os_vif_util [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:20:0a,bridge_name='br-int',has_traffic_filtering=True,id=46205a33-4ef2-438e-9ea8-e4918a0d6e9b,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46205a33-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.557 186548 DEBUG os_vif [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:20:0a,bridge_name='br-int',has_traffic_filtering=True,id=46205a33-4ef2-438e-9ea8-e4918a0d6e9b,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46205a33-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.558 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.559 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap46205a33-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.561 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.568 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.571 186548 INFO os_vif [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:20:0a,bridge_name='br-int',has_traffic_filtering=True,id=46205a33-4ef2-438e-9ea8-e4918a0d6e9b,network=Network(206a04da-ce2f-48ff-99c7-e70706547580),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46205a33-4e')
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.571 186548 INFO nova.virt.libvirt.driver [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Deleting instance files /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839_del
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.571 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[19c15be6-9709-42d7-bfab-fa98b747a75a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.572 186548 INFO nova.virt.libvirt.driver [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Deletion of /var/lib/nova/instances/042b3132-f4d0-455f-a591-712ce4fd2839_del complete
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.591 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[388ef76c-ea01-4bac-a0e3-f2a7d7d4b9b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.593 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e10caf27-550b-4c4a-8743-13d1b06b55e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.609 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[167768f6-6c01-4e2d-91dd-a9c3412d0a68]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616614, 'reachable_time': 31567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240908, 'error': None, 'target': 'ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 systemd[1]: run-netns-ovnmeta\x2d28285a99\x2d0933\x2d48f9\x2daee6\x2df1e507bcd777.mount: Deactivated successfully.
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.617 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-28285a99-0933-48f9-aee6-f1e507bcd777 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.619 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6917b8-1840-47e3-a030-85226ed969c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.620 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 46205a33-4ef2-438e-9ea8-e4918a0d6e9b in datapath 206a04da-ce2f-48ff-99c7-e70706547580 unbound from our chassis
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.621 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 206a04da-ce2f-48ff-99c7-e70706547580, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.622 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2a02f088-a765-4c11-8b48-417dc48c5e96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.623 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 namespace which is not needed anymore
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.664 186548 INFO nova.compute.manager [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Took 0.44 seconds to destroy the instance on the hypervisor.
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.664 186548 DEBUG oslo.service.loopingcall [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.665 186548 DEBUG nova.compute.manager [-] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.665 186548 DEBUG nova.network.neutron [-] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[240321]: [NOTICE]   (240325) : haproxy version is 2.8.14-c23fe91
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[240321]: [NOTICE]   (240325) : path to executable is /usr/sbin/haproxy
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[240321]: [WARNING]  (240325) : Exiting Master process...
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[240321]: [ALERT]    (240325) : Current worker (240327) exited with code 143 (Terminated)
Nov 22 08:18:42 compute-0 neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580[240321]: [WARNING]  (240325) : All workers exited. Exiting... (0)
Nov 22 08:18:42 compute-0 systemd[1]: libpod-cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9.scope: Deactivated successfully.
Nov 22 08:18:42 compute-0 podman[240926]: 2025-11-22 08:18:42.750210544 +0000 UTC m=+0.044911860 container died cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 08:18:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-cc5c1ccef575014a85809801db318ddb91049dbba867fb51a31d06b8ddb05e8c-merged.mount: Deactivated successfully.
Nov 22 08:18:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9-userdata-shm.mount: Deactivated successfully.
Nov 22 08:18:42 compute-0 podman[240926]: 2025-11-22 08:18:42.791984076 +0000 UTC m=+0.086685362 container cleanup cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 08:18:42 compute-0 systemd[1]: libpod-conmon-cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9.scope: Deactivated successfully.
Nov 22 08:18:42 compute-0 podman[240956]: 2025-11-22 08:18:42.884299866 +0000 UTC m=+0.072744078 container remove cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.889 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fccbce3d-691e-4c98-b447-d258e810a2e5]: (4, ('Sat Nov 22 08:18:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 (cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9)\ncb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9\nSat Nov 22 08:18:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 (cb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9)\ncb696134294cdd0bfb9be591c08645895587ad6de5925a7f9714d136e43d54b9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.891 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[216f5696-7b86-4629-8de6-a685ef17bd3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.892 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap206a04da-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:18:42 compute-0 kernel: tap206a04da-c0: left promiscuous mode
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.894 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.898 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e223d22f-2840-4129-9934-e67b1419580c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 nova_compute[186544]: 2025-11-22 08:18:42.910 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.918 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d4824707-803b-461b-b2e1-a8c19fdb46d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.919 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e6cea081-d60c-4828-bbbd-e4a19de3df78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.934 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b790dac6-6e99-42f4-9e3a-03f3e7cf6351]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 616891, 'reachable_time': 35521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240971, 'error': None, 'target': 'ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.936 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-206a04da-ce2f-48ff-99c7-e70706547580 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.937 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[303226aa-2768-4e94-b03d-e9cf8e3259c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.938 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 46205a33-4ef2-438e-9ea8-e4918a0d6e9b in datapath 206a04da-ce2f-48ff-99c7-e70706547580 unbound from our chassis
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.939 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 206a04da-ce2f-48ff-99c7-e70706547580, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.940 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[73a917cb-9a89-4f7c-82c9-6fb1786d69b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.941 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 46205a33-4ef2-438e-9ea8-e4918a0d6e9b in datapath 206a04da-ce2f-48ff-99c7-e70706547580 unbound from our chassis
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.942 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 206a04da-ce2f-48ff-99c7-e70706547580, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:18:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:42.943 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d2746a89-fb5f-488c-86e5-d901fafee822]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.424 186548 DEBUG nova.compute.manager [req-33894204-aa62-404c-a347-4d84c315af55 req-01e636db-2e00-462b-8b9b-54798475b397 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-changed-328a2b4f-54d4-42e2-b943-54d2e4ec2bae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.424 186548 DEBUG nova.compute.manager [req-33894204-aa62-404c-a347-4d84c315af55 req-01e636db-2e00-462b-8b9b-54798475b397 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Refreshing instance network info cache due to event network-changed-328a2b4f-54d4-42e2-b943-54d2e4ec2bae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.424 186548 DEBUG oslo_concurrency.lockutils [req-33894204-aa62-404c-a347-4d84c315af55 req-01e636db-2e00-462b-8b9b-54798475b397 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.424 186548 DEBUG oslo_concurrency.lockutils [req-33894204-aa62-404c-a347-4d84c315af55 req-01e636db-2e00-462b-8b9b-54798475b397 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.425 186548 DEBUG nova.network.neutron [req-33894204-aa62-404c-a347-4d84c315af55 req-01e636db-2e00-462b-8b9b-54798475b397 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Refreshing network info cache for port 328a2b4f-54d4-42e2-b943-54d2e4ec2bae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:18:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d206a04da\x2dce2f\x2d48ff\x2d99c7\x2de70706547580.mount: Deactivated successfully.
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.612 186548 INFO nova.network.neutron [req-33894204-aa62-404c-a347-4d84c315af55 req-01e636db-2e00-462b-8b9b-54798475b397 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Port 328a2b4f-54d4-42e2-b943-54d2e4ec2bae from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.612 186548 DEBUG nova.network.neutron [req-33894204-aa62-404c-a347-4d84c315af55 req-01e636db-2e00-462b-8b9b-54798475b397 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updating instance_info_cache with network_info: [{"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.671 186548 DEBUG nova.compute.manager [req-e8660c2f-579a-4c3f-8204-bfb403efcff6 req-aa2b86ae-0c1b-4069-a727-ece6f4387739 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-deleted-328a2b4f-54d4-42e2-b943-54d2e4ec2bae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.671 186548 INFO nova.compute.manager [req-e8660c2f-579a-4c3f-8204-bfb403efcff6 req-aa2b86ae-0c1b-4069-a727-ece6f4387739 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Neutron deleted interface 328a2b4f-54d4-42e2-b943-54d2e4ec2bae; detaching it from the instance and deleting it from the info cache
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.672 186548 DEBUG nova.network.neutron [req-e8660c2f-579a-4c3f-8204-bfb403efcff6 req-aa2b86ae-0c1b-4069-a727-ece6f4387739 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updating instance_info_cache with network_info: [{"id": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "address": "fa:16:3e:d5:20:0a", "network": {"id": "206a04da-ce2f-48ff-99c7-e70706547580", "bridge": "br-int", "label": "tempest-network-smoke--1887597380", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fed5:200a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46205a33-4e", "ovs_interfaceid": "46205a33-4ef2-438e-9ea8-e4918a0d6e9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.817 186548 DEBUG oslo_concurrency.lockutils [req-33894204-aa62-404c-a347-4d84c315af55 req-01e636db-2e00-462b-8b9b-54798475b397 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-042b3132-f4d0-455f-a591-712ce4fd2839" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:18:43 compute-0 nova_compute[186544]: 2025-11-22 08:18:43.820 186548 DEBUG nova.compute.manager [req-e8660c2f-579a-4c3f-8204-bfb403efcff6 req-aa2b86ae-0c1b-4069-a727-ece6f4387739 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Detach interface failed, port_id=328a2b4f-54d4-42e2-b943-54d2e4ec2bae, reason: Instance 042b3132-f4d0-455f-a591-712ce4fd2839 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.651 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.652 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.653 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.653 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.654 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.654 186548 WARNING nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received unexpected event network-vif-plugged-328a2b4f-54d4-42e2-b943-54d2e4ec2bae for instance with vm_state active and task_state deleting.
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.655 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-unplugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.655 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.656 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.656 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.657 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-unplugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.657 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-unplugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.657 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.657 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.658 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.658 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.658 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.658 186548 WARNING nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received unexpected event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b for instance with vm_state active and task_state deleting.
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.659 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.659 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.659 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.659 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.660 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.660 186548 WARNING nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received unexpected event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b for instance with vm_state active and task_state deleting.
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.660 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.660 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.661 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.661 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.661 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.661 186548 WARNING nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received unexpected event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b for instance with vm_state active and task_state deleting.
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.661 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-unplugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.662 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.662 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.662 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.662 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-unplugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.662 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-unplugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.663 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.663 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.663 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.663 186548 DEBUG oslo_concurrency.lockutils [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.663 186548 DEBUG nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] No waiting events found dispatching network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:18:44 compute-0 nova_compute[186544]: 2025-11-22 08:18:44.664 186548 WARNING nova.compute.manager [req-1dbf4405-339b-45f8-8e38-bd3e0240f180 req-58ab45af-074e-4d1a-a407-d9076138c2f3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received unexpected event network-vif-plugged-46205a33-4ef2-438e-9ea8-e4918a0d6e9b for instance with vm_state active and task_state deleting.
Nov 22 08:18:45 compute-0 nova_compute[186544]: 2025-11-22 08:18:45.336 186548 DEBUG nova.network.neutron [-] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:18:45 compute-0 nova_compute[186544]: 2025-11-22 08:18:45.451 186548 INFO nova.compute.manager [-] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Took 2.79 seconds to deallocate network for instance.
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.006 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.007 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.106 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.252 186548 DEBUG nova.compute.manager [req-a76ecce4-884a-436f-ae57-8b04e519a37c req-2352bd5d-6720-48fa-8f68-1fa835038ee3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Received event network-vif-deleted-46205a33-4ef2-438e-9ea8-e4918a0d6e9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.325 186548 DEBUG nova.compute.provider_tree [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.340 186548 DEBUG nova.scheduler.client.report [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.370 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.363s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.436 186548 INFO nova.scheduler.client.report [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 042b3132-f4d0-455f-a591-712ce4fd2839
Nov 22 08:18:46 compute-0 nova_compute[186544]: 2025-11-22 08:18:46.519 186548 DEBUG oslo_concurrency.lockutils [None req-3ac41d86-25fb-49fa-bbab-acde20a9a0cf 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "042b3132-f4d0-455f-a591-712ce4fd2839" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:18:47 compute-0 nova_compute[186544]: 2025-11-22 08:18:47.560 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:51 compute-0 nova_compute[186544]: 2025-11-22 08:18:51.108 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:52 compute-0 nova_compute[186544]: 2025-11-22 08:18:52.562 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:54 compute-0 podman[240972]: 2025-11-22 08:18:54.417330647 +0000 UTC m=+0.065467098 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:18:54 compute-0 podman[240973]: 2025-11-22 08:18:54.426161535 +0000 UTC m=+0.063017917 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:18:54 compute-0 podman[240974]: 2025-11-22 08:18:54.439131256 +0000 UTC m=+0.081651168 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:18:54 compute-0 podman[240975]: 2025-11-22 08:18:54.448036266 +0000 UTC m=+0.087940463 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Nov 22 08:18:55 compute-0 nova_compute[186544]: 2025-11-22 08:18:55.069 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:55 compute-0 nova_compute[186544]: 2025-11-22 08:18:55.190 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:56 compute-0 nova_compute[186544]: 2025-11-22 08:18:56.111 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:57 compute-0 nova_compute[186544]: 2025-11-22 08:18:57.524 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799522.5214152, 042b3132-f4d0-455f-a591-712ce4fd2839 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:18:57 compute-0 nova_compute[186544]: 2025-11-22 08:18:57.524 186548 INFO nova.compute.manager [-] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] VM Stopped (Lifecycle Event)
Nov 22 08:18:57 compute-0 nova_compute[186544]: 2025-11-22 08:18:57.541 186548 DEBUG nova.compute.manager [None req-5e1a4e9c-7c2b-444b-aad1-57e2e1c1438b - - - - - -] [instance: 042b3132-f4d0-455f-a591-712ce4fd2839] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:18:57 compute-0 nova_compute[186544]: 2025-11-22 08:18:57.563 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:59 compute-0 nova_compute[186544]: 2025-11-22 08:18:59.497 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:18:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:59.498 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:18:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:59.499 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:18:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:18:59.499 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:19:01 compute-0 nova_compute[186544]: 2025-11-22 08:19:01.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:02 compute-0 nova_compute[186544]: 2025-11-22 08:19:02.565 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.201 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.201 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.201 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.202 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.363 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.364 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5711MB free_disk=73.13673782348633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.365 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.365 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.474 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.474 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.497 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.510 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.530 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:19:03 compute-0 nova_compute[186544]: 2025-11-22 08:19:03.531 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:19:05 compute-0 podman[241055]: 2025-11-22 08:19:05.413049866 +0000 UTC m=+0.060713870 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:19:05 compute-0 nova_compute[186544]: 2025-11-22 08:19:05.526 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:05 compute-0 nova_compute[186544]: 2025-11-22 08:19:05.527 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:05 compute-0 nova_compute[186544]: 2025-11-22 08:19:05.527 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:19:05 compute-0 nova_compute[186544]: 2025-11-22 08:19:05.527 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:19:05 compute-0 nova_compute[186544]: 2025-11-22 08:19:05.555 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:19:05 compute-0 nova_compute[186544]: 2025-11-22 08:19:05.555 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:05 compute-0 nova_compute[186544]: 2025-11-22 08:19:05.556 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:05 compute-0 nova_compute[186544]: 2025-11-22 08:19:05.557 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:19:06 compute-0 nova_compute[186544]: 2025-11-22 08:19:06.115 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:07.549 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8::f816:3eff:fe2c:5b22'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ac76a812-5ead-4b51-8c63-4eaca1b65820) old=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:19:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:07.551 103805 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ac76a812-5ead-4b51-8c63-4eaca1b65820 in datapath cfb1249f-37ac-4df7-b559-e7968406997d updated
Nov 22 08:19:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:07.552 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfb1249f-37ac-4df7-b559-e7968406997d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:19:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:07.553 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e19d37f1-0698-4617-9e23-44c7f341373e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:19:07 compute-0 nova_compute[186544]: 2025-11-22 08:19:07.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:08 compute-0 podman[241075]: 2025-11-22 08:19:08.398172034 +0000 UTC m=+0.050911218 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:19:08 compute-0 podman[241076]: 2025-11-22 08:19:08.405415754 +0000 UTC m=+0.055847800 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Nov 22 08:19:09 compute-0 nova_compute[186544]: 2025-11-22 08:19:09.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:11 compute-0 nova_compute[186544]: 2025-11-22 08:19:11.117 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:12 compute-0 nova_compute[186544]: 2025-11-22 08:19:12.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:12 compute-0 nova_compute[186544]: 2025-11-22 08:19:12.568 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:15 compute-0 nova_compute[186544]: 2025-11-22 08:19:15.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:16.047 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8:0:1:f816:3eff:fe2c:5b22 2001:db8::f816:3eff:fe2c:5b22'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe2c:5b22/64 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e93905-1b64-4ecc-b682-ceea307bebcf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ac76a812-5ead-4b51-8c63-4eaca1b65820) old=Port_Binding(mac=['fa:16:3e:2c:5b:22 10.100.0.2 2001:db8::f816:3eff:fe2c:5b22'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2c:5b22/64', 'neutron:device_id': 'ovnmeta-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfb1249f-37ac-4df7-b559-e7968406997d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:19:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:16.049 103805 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ac76a812-5ead-4b51-8c63-4eaca1b65820 in datapath cfb1249f-37ac-4df7-b559-e7968406997d updated
Nov 22 08:19:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:16.050 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfb1249f-37ac-4df7-b559-e7968406997d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:19:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:16.051 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[52713620-884e-4139-91eb-6f9ee29bfb09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:19:16 compute-0 nova_compute[186544]: 2025-11-22 08:19:16.119 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:16 compute-0 nova_compute[186544]: 2025-11-22 08:19:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:17 compute-0 nova_compute[186544]: 2025-11-22 08:19:17.569 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:21 compute-0 nova_compute[186544]: 2025-11-22 08:19:21.120 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:22 compute-0 nova_compute[186544]: 2025-11-22 08:19:22.572 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:25 compute-0 podman[241119]: 2025-11-22 08:19:25.424327859 +0000 UTC m=+0.068933345 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:19:25 compute-0 podman[241121]: 2025-11-22 08:19:25.440844336 +0000 UTC m=+0.079701949 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:19:25 compute-0 podman[241120]: 2025-11-22 08:19:25.441438921 +0000 UTC m=+0.072812249 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:19:25 compute-0 podman[241122]: 2025-11-22 08:19:25.469477313 +0000 UTC m=+0.100987855 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller)
Nov 22 08:19:26 compute-0 nova_compute[186544]: 2025-11-22 08:19:26.121 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:27 compute-0 nova_compute[186544]: 2025-11-22 08:19:27.573 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:31 compute-0 nova_compute[186544]: 2025-11-22 08:19:31.123 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:32 compute-0 nova_compute[186544]: 2025-11-22 08:19:32.575 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:33 compute-0 nova_compute[186544]: 2025-11-22 08:19:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:33 compute-0 nova_compute[186544]: 2025-11-22 08:19:33.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:19:33 compute-0 nova_compute[186544]: 2025-11-22 08:19:33.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:19:34 compute-0 nova_compute[186544]: 2025-11-22 08:19:34.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:34 compute-0 nova_compute[186544]: 2025-11-22 08:19:34.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:19:36 compute-0 nova_compute[186544]: 2025-11-22 08:19:36.125 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:36 compute-0 podman[241201]: 2025-11-22 08:19:36.411789673 +0000 UTC m=+0.058023163 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:19:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:37.345 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:19:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:37.345 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:19:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:19:37.345 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:19:37 compute-0 nova_compute[186544]: 2025-11-22 08:19:37.576 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:39 compute-0 nova_compute[186544]: 2025-11-22 08:19:39.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:19:39 compute-0 podman[241221]: 2025-11-22 08:19:39.403386112 +0000 UTC m=+0.050215961 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:19:39 compute-0 podman[241222]: 2025-11-22 08:19:39.412664481 +0000 UTC m=+0.056922237 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=)
Nov 22 08:19:41 compute-0 nova_compute[186544]: 2025-11-22 08:19:41.126 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:42 compute-0 nova_compute[186544]: 2025-11-22 08:19:42.578 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:46 compute-0 nova_compute[186544]: 2025-11-22 08:19:46.128 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:47 compute-0 nova_compute[186544]: 2025-11-22 08:19:47.580 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:51 compute-0 nova_compute[186544]: 2025-11-22 08:19:51.130 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:52 compute-0 nova_compute[186544]: 2025-11-22 08:19:52.581 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:56 compute-0 nova_compute[186544]: 2025-11-22 08:19:56.132 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:56 compute-0 podman[241264]: 2025-11-22 08:19:56.40703118 +0000 UTC m=+0.056612549 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:19:56 compute-0 podman[241265]: 2025-11-22 08:19:56.41230814 +0000 UTC m=+0.054452146 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:19:56 compute-0 podman[241266]: 2025-11-22 08:19:56.430667074 +0000 UTC m=+0.073556039 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 08:19:56 compute-0 podman[241263]: 2025-11-22 08:19:56.431135115 +0000 UTC m=+0.079940606 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 08:19:57 compute-0 nova_compute[186544]: 2025-11-22 08:19:57.582 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.705 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.706 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.720 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.805 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.806 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.820 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.821 186548 INFO nova.compute.claims [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.967 186548 DEBUG nova.compute.provider_tree [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.979 186548 DEBUG nova.scheduler.client.report [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.998 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:19:58 compute-0 nova_compute[186544]: 2025-11-22 08:19:58.998 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.045 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.046 186548 DEBUG nova.network.neutron [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.060 186548 INFO nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.077 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.169 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.170 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.170 186548 INFO nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Creating image(s)
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.171 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.171 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.172 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.183 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.243 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.245 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.245 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.256 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.316 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.317 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.394 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk 1073741824" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.395 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.395 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.455 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.457 186548 DEBUG nova.virt.disk.api [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.457 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.516 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.517 186548 DEBUG nova.virt.disk.api [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.517 186548 DEBUG nova.objects.instance [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.542 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.543 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Ensure instance console log exists: /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.543 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.544 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:19:59 compute-0 nova_compute[186544]: 2025-11-22 08:19:59.544 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:00 compute-0 nova_compute[186544]: 2025-11-22 08:20:00.055 186548 DEBUG nova.policy [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:20:01 compute-0 nova_compute[186544]: 2025-11-22 08:20:01.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:02 compute-0 nova_compute[186544]: 2025-11-22 08:20:02.130 186548 DEBUG nova.network.neutron [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Successfully created port: eda1ac92-e156-463f-9f90-8fdd14f55dc0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:20:02 compute-0 nova_compute[186544]: 2025-11-22 08:20:02.584 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:04.030 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.030 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:04 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:04.031 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.176 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.201 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.201 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.202 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.202 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.335 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.336 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5704MB free_disk=73.1364974975586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.337 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.337 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.405 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 074d9b5a-057a-46af-aea1-0f43e0ac7418 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.406 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.406 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.457 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.470 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.492 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.492 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.692 186548 DEBUG nova.network.neutron [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Successfully updated port: eda1ac92-e156-463f-9f90-8fdd14f55dc0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.719 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.720 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.720 186548 DEBUG nova.network.neutron [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.875 186548 DEBUG nova.compute.manager [req-b204f5f3-1414-47d6-be8e-fc156d9ca939 req-9377fe3c-3b91-493e-926d-d9d9e7b4b3b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.875 186548 DEBUG nova.compute.manager [req-b204f5f3-1414-47d6-be8e-fc156d9ca939 req-9377fe3c-3b91-493e-926d-d9d9e7b4b3b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing instance network info cache due to event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.876 186548 DEBUG oslo_concurrency.lockutils [req-b204f5f3-1414-47d6-be8e-fc156d9ca939 req-9377fe3c-3b91-493e-926d-d9d9e7b4b3b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:20:04 compute-0 nova_compute[186544]: 2025-11-22 08:20:04.950 186548 DEBUG nova.network.neutron [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:20:05 compute-0 nova_compute[186544]: 2025-11-22 08:20:05.479 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:05 compute-0 nova_compute[186544]: 2025-11-22 08:20:05.479 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.136 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.185 186548 DEBUG nova.network.neutron [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.189 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.189 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.202 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.203 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance network_info: |[{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.204 186548 DEBUG oslo_concurrency.lockutils [req-b204f5f3-1414-47d6-be8e-fc156d9ca939 req-9377fe3c-3b91-493e-926d-d9d9e7b4b3b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.205 186548 DEBUG nova.network.neutron [req-b204f5f3-1414-47d6-be8e-fc156d9ca939 req-9377fe3c-3b91-493e-926d-d9d9e7b4b3b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.211 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Start _get_guest_xml network_info=[{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.217 186548 WARNING nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.222 186548 DEBUG nova.virt.libvirt.host [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.222 186548 DEBUG nova.virt.libvirt.host [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.228 186548 DEBUG nova.virt.libvirt.host [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.228 186548 DEBUG nova.virt.libvirt.host [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.229 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.229 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.230 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.230 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.230 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.230 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.230 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.231 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.231 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.231 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.231 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.232 186548 DEBUG nova.virt.hardware [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.235 186548 DEBUG nova.virt.libvirt.vif [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:19:59Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.235 186548 DEBUG nova.network.os_vif_util [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.236 186548 DEBUG nova.network.os_vif_util [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.237 186548 DEBUG nova.objects.instance [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.252 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <uuid>074d9b5a-057a-46af-aea1-0f43e0ac7418</uuid>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <name>instance-00000098</name>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-550951011</nova:name>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:20:06</nova:creationTime>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:20:06 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:20:06 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:20:06 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:20:06 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:20:06 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:20:06 compute-0 nova_compute[186544]:         <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 08:20:06 compute-0 nova_compute[186544]:         <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:20:06 compute-0 nova_compute[186544]:         <nova:port uuid="eda1ac92-e156-463f-9f90-8fdd14f55dc0">
Nov 22 08:20:06 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <system>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <entry name="serial">074d9b5a-057a-46af-aea1-0f43e0ac7418</entry>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <entry name="uuid">074d9b5a-057a-46af-aea1-0f43e0ac7418</entry>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </system>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <os>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   </os>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <features>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   </features>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:b3:0e:98"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <target dev="tapeda1ac92-e1"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/console.log" append="off"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <video>
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </video>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:20:06 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:20:06 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:20:06 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:20:06 compute-0 nova_compute[186544]: </domain>
Nov 22 08:20:06 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.254 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Preparing to wait for external event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.255 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.255 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.255 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.256 186548 DEBUG nova.virt.libvirt.vif [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:19:59Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.256 186548 DEBUG nova.network.os_vif_util [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.256 186548 DEBUG nova.network.os_vif_util [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.257 186548 DEBUG os_vif [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.257 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.257 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.258 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.260 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.260 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeda1ac92-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.260 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeda1ac92-e1, col_values=(('external_ids', {'iface-id': 'eda1ac92-e156-463f-9f90-8fdd14f55dc0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:0e:98', 'vm-uuid': '074d9b5a-057a-46af-aea1-0f43e0ac7418'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.262 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.263 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:06 compute-0 NetworkManager[55036]: <info>  [1763799606.2640] manager: (tapeda1ac92-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.264 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.268 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.268 186548 INFO os_vif [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1')
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.348 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.349 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.349 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:b3:0e:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:20:06 compute-0 nova_compute[186544]: 2025-11-22 08:20:06.350 186548 INFO nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Using config drive
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.386 186548 INFO nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Creating config drive at /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.391 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprcimmktt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:07 compute-0 podman[241369]: 2025-11-22 08:20:07.437316552 +0000 UTC m=+0.084079387 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118)
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.519 186548 DEBUG oslo_concurrency.processutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprcimmktt" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:07 compute-0 kernel: tapeda1ac92-e1: entered promiscuous mode
Nov 22 08:20:07 compute-0 NetworkManager[55036]: <info>  [1763799607.6004] manager: (tapeda1ac92-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/315)
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.601 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:07 compute-0 ovn_controller[94843]: 2025-11-22T08:20:07Z|00690|binding|INFO|Claiming lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 for this chassis.
Nov 22 08:20:07 compute-0 ovn_controller[94843]: 2025-11-22T08:20:07Z|00691|binding|INFO|eda1ac92-e156-463f-9f90-8fdd14f55dc0: Claiming fa:16:3e:b3:0e:98 10.100.0.5
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.604 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.608 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:07 compute-0 systemd-udevd[241407]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:20:07 compute-0 systemd-machined[152872]: New machine qemu-80-instance-00000098.
Nov 22 08:20:07 compute-0 NetworkManager[55036]: <info>  [1763799607.6492] device (tapeda1ac92-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:20:07 compute-0 NetworkManager[55036]: <info>  [1763799607.6502] device (tapeda1ac92-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:20:07 compute-0 systemd[1]: Started Virtual Machine qemu-80-instance-00000098.
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.662 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:07 compute-0 ovn_controller[94843]: 2025-11-22T08:20:07Z|00692|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 ovn-installed in OVS
Nov 22 08:20:07 compute-0 nova_compute[186544]: 2025-11-22 08:20:07.666 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:07 compute-0 ovn_controller[94843]: 2025-11-22T08:20:07Z|00693|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 up in Southbound
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.886 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:0e:98 10.100.0.5'], port_security=['fa:16:3e:b3:0e:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a792325d-0f21-49cc-9e79-20df696791a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46980f3d-5c4c-4eaa-bdbc-e9dfef13e740, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=eda1ac92-e156-463f-9f90-8fdd14f55dc0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.887 103805 INFO neutron.agent.ovn.metadata.agent [-] Port eda1ac92-e156-463f-9f90-8fdd14f55dc0 in datapath 52d2fbd4-6713-49c3-93b1-794bccb91cb5 bound to our chassis
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.888 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 52d2fbd4-6713-49c3-93b1-794bccb91cb5
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.898 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[30759e05-bd89-42b2-b775-e354209a0420]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.899 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap52d2fbd4-61 in ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.901 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap52d2fbd4-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.901 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5eaf2d-375b-41ac-915d-8c844f685f32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.902 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f869b845-5588-4baa-b309-d5a43cc71200]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.921 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[038653b3-e87e-4396-9894-87d0add31af5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.940 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4d113f35-2977-48af-adf4-d3ec143ca04a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.969 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d55a3efc-a7a9-41c9-9b2b-371a602a0766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:07.980 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[07f4347f-2a99-42a0-a2a7-225fbdb8fb26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:07 compute-0 NetworkManager[55036]: <info>  [1763799607.9817] manager: (tap52d2fbd4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/316)
Nov 22 08:20:07 compute-0 systemd-udevd[241409]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.016 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5b271bc2-90ec-45c8-a2e0-c8692436524d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.020 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4869b477-5190-439d-bfd1-d5df8096bb1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 NetworkManager[55036]: <info>  [1763799608.0428] device (tap52d2fbd4-60): carrier: link connected
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.048 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[012f29a4-7ead-4af5-8f3a-51d690f8b74c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.066 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5546f65a-fb50-4888-9985-5aff093ed024]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52d2fbd4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:e8:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634868, 'reachable_time': 32374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241448, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.084 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d5fb71-ae74-45e3-8e29-b4bf65db6e75]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:e832'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 634868, 'tstamp': 634868}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241450, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.097 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799608.0968232, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.097 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Started (Lifecycle Event)
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.104 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[11793394-c3d0-4b00-b2f3-052b98682f34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52d2fbd4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:e8:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634868, 'reachable_time': 32374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241451, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.121 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.125 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799608.0977342, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.126 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Paused (Lifecycle Event)
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.141 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[accce681-9fa7-4778-bffe-cfbdb3881acc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.150 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.154 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.186 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.202 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2c156a78-eafa-46c2-963e-cbbe914b44c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.204 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52d2fbd4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.205 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.205 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52d2fbd4-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.207 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:08 compute-0 NetworkManager[55036]: <info>  [1763799608.2082] manager: (tap52d2fbd4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Nov 22 08:20:08 compute-0 kernel: tap52d2fbd4-60: entered promiscuous mode
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.210 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.211 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap52d2fbd4-60, col_values=(('external_ids', {'iface-id': 'a25db17f-5074-4e3e-9504-ee30cd3f6d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:08 compute-0 ovn_controller[94843]: 2025-11-22T08:20:08Z|00694|binding|INFO|Releasing lport a25db17f-5074-4e3e-9504-ee30cd3f6d4c from this chassis (sb_readonly=0)
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.213 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.214 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.215 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[09b24f9d-1480-4475-990a-14f72835fd3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.216 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-52d2fbd4-6713-49c3-93b1-794bccb91cb5
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 52d2fbd4-6713-49c3-93b1-794bccb91cb5
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:20:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:08.217 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'env', 'PROCESS_TAG=haproxy-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/52d2fbd4-6713-49c3-93b1-794bccb91cb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.225 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.600 186548 DEBUG nova.compute.manager [req-2c69cabd-97b3-4949-9e33-2b1691778966 req-7c65893d-21a9-484e-af8c-270832cfdb78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.601 186548 DEBUG oslo_concurrency.lockutils [req-2c69cabd-97b3-4949-9e33-2b1691778966 req-7c65893d-21a9-484e-af8c-270832cfdb78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.601 186548 DEBUG oslo_concurrency.lockutils [req-2c69cabd-97b3-4949-9e33-2b1691778966 req-7c65893d-21a9-484e-af8c-270832cfdb78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.601 186548 DEBUG oslo_concurrency.lockutils [req-2c69cabd-97b3-4949-9e33-2b1691778966 req-7c65893d-21a9-484e-af8c-270832cfdb78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.602 186548 DEBUG nova.compute.manager [req-2c69cabd-97b3-4949-9e33-2b1691778966 req-7c65893d-21a9-484e-af8c-270832cfdb78 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Processing event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.602 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.606 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799608.6064155, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.606 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Resumed (Lifecycle Event)
Nov 22 08:20:08 compute-0 podman[241483]: 2025-11-22 08:20:08.60772149 +0000 UTC m=+0.076847529 container create e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.608 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.612 186548 INFO nova.virt.libvirt.driver [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance spawned successfully.
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.613 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.642 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.643 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.644 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.644 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.644 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.645 186548 DEBUG nova.virt.libvirt.driver [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:20:08 compute-0 podman[241483]: 2025-11-22 08:20:08.554521236 +0000 UTC m=+0.023647295 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.651 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.655 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:20:08 compute-0 systemd[1]: Started libpod-conmon-e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1.scope.
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.675 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:20:08 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:20:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01e62d23a7f3aa8cf3b1013695e56d174d8ec5e3c71da273d3f0ed8784d3b81d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:20:08 compute-0 podman[241483]: 2025-11-22 08:20:08.719976782 +0000 UTC m=+0.189102841 container init e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:20:08 compute-0 podman[241483]: 2025-11-22 08:20:08.725668883 +0000 UTC m=+0.194794922 container start e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.737 186548 INFO nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Took 9.57 seconds to spawn the instance on the hypervisor.
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.739 186548 DEBUG nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:20:08 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241498]: [NOTICE]   (241502) : New worker (241504) forked
Nov 22 08:20:08 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241498]: [NOTICE]   (241502) : Loading success.
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.834 186548 INFO nova.compute.manager [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Took 10.07 seconds to build instance.
Nov 22 08:20:08 compute-0 nova_compute[186544]: 2025-11-22 08:20:08.861 186548 DEBUG oslo_concurrency.lockutils [None req-c8450014-0186-4a24-8543-bb0178a67e49 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:09 compute-0 nova_compute[186544]: 2025-11-22 08:20:09.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.235 186548 DEBUG nova.network.neutron [req-b204f5f3-1414-47d6-be8e-fc156d9ca939 req-9377fe3c-3b91-493e-926d-d9d9e7b4b3b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updated VIF entry in instance network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.236 186548 DEBUG nova.network.neutron [req-b204f5f3-1414-47d6-be8e-fc156d9ca939 req-9377fe3c-3b91-493e-926d-d9d9e7b4b3b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.256 186548 DEBUG oslo_concurrency.lockutils [req-b204f5f3-1414-47d6-be8e-fc156d9ca939 req-9377fe3c-3b91-493e-926d-d9d9e7b4b3b5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:20:10 compute-0 podman[241514]: 2025-11-22 08:20:10.443651915 +0000 UTC m=+0.087640975 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 08:20:10 compute-0 podman[241513]: 2025-11-22 08:20:10.443782098 +0000 UTC m=+0.088220080 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.693 186548 DEBUG nova.compute.manager [req-0f4e3a22-a9cb-41e2-b455-3bffdda67f56 req-a5f26bf0-2066-4572-ba4f-12f0e8bc94cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.693 186548 DEBUG oslo_concurrency.lockutils [req-0f4e3a22-a9cb-41e2-b455-3bffdda67f56 req-a5f26bf0-2066-4572-ba4f-12f0e8bc94cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.693 186548 DEBUG oslo_concurrency.lockutils [req-0f4e3a22-a9cb-41e2-b455-3bffdda67f56 req-a5f26bf0-2066-4572-ba4f-12f0e8bc94cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.693 186548 DEBUG oslo_concurrency.lockutils [req-0f4e3a22-a9cb-41e2-b455-3bffdda67f56 req-a5f26bf0-2066-4572-ba4f-12f0e8bc94cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.694 186548 DEBUG nova.compute.manager [req-0f4e3a22-a9cb-41e2-b455-3bffdda67f56 req-a5f26bf0-2066-4572-ba4f-12f0e8bc94cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:20:10 compute-0 nova_compute[186544]: 2025-11-22 08:20:10.694 186548 WARNING nova.compute.manager [req-0f4e3a22-a9cb-41e2-b455-3bffdda67f56 req-a5f26bf0-2066-4572-ba4f-12f0e8bc94cf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state active and task_state None.
Nov 22 08:20:11 compute-0 nova_compute[186544]: 2025-11-22 08:20:11.138 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:11 compute-0 nova_compute[186544]: 2025-11-22 08:20:11.262 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:13.033 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:13 compute-0 nova_compute[186544]: 2025-11-22 08:20:13.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:15 compute-0 nova_compute[186544]: 2025-11-22 08:20:15.323 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:15 compute-0 NetworkManager[55036]: <info>  [1763799615.3242] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/318)
Nov 22 08:20:15 compute-0 NetworkManager[55036]: <info>  [1763799615.3253] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Nov 22 08:20:15 compute-0 nova_compute[186544]: 2025-11-22 08:20:15.386 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:15 compute-0 ovn_controller[94843]: 2025-11-22T08:20:15Z|00695|binding|INFO|Releasing lport a25db17f-5074-4e3e-9504-ee30cd3f6d4c from this chassis (sb_readonly=0)
Nov 22 08:20:15 compute-0 nova_compute[186544]: 2025-11-22 08:20:15.396 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:15 compute-0 nova_compute[186544]: 2025-11-22 08:20:15.684 186548 DEBUG nova.compute.manager [req-05db1714-bf81-45eb-9b54-6c26c7a669d4 req-873dc9c0-a173-49e7-9680-02ab2b3930ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:15 compute-0 nova_compute[186544]: 2025-11-22 08:20:15.685 186548 DEBUG nova.compute.manager [req-05db1714-bf81-45eb-9b54-6c26c7a669d4 req-873dc9c0-a173-49e7-9680-02ab2b3930ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing instance network info cache due to event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:20:15 compute-0 nova_compute[186544]: 2025-11-22 08:20:15.686 186548 DEBUG oslo_concurrency.lockutils [req-05db1714-bf81-45eb-9b54-6c26c7a669d4 req-873dc9c0-a173-49e7-9680-02ab2b3930ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:20:15 compute-0 nova_compute[186544]: 2025-11-22 08:20:15.687 186548 DEBUG oslo_concurrency.lockutils [req-05db1714-bf81-45eb-9b54-6c26c7a669d4 req-873dc9c0-a173-49e7-9680-02ab2b3930ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:20:15 compute-0 nova_compute[186544]: 2025-11-22 08:20:15.687 186548 DEBUG nova.network.neutron [req-05db1714-bf81-45eb-9b54-6c26c7a669d4 req-873dc9c0-a173-49e7-9680-02ab2b3930ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:20:16 compute-0 nova_compute[186544]: 2025-11-22 08:20:16.139 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:16 compute-0 nova_compute[186544]: 2025-11-22 08:20:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:16 compute-0 nova_compute[186544]: 2025-11-22 08:20:16.264 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:17 compute-0 nova_compute[186544]: 2025-11-22 08:20:17.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:19 compute-0 nova_compute[186544]: 2025-11-22 08:20:19.058 186548 DEBUG nova.network.neutron [req-05db1714-bf81-45eb-9b54-6c26c7a669d4 req-873dc9c0-a173-49e7-9680-02ab2b3930ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updated VIF entry in instance network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:20:19 compute-0 nova_compute[186544]: 2025-11-22 08:20:19.060 186548 DEBUG nova.network.neutron [req-05db1714-bf81-45eb-9b54-6c26c7a669d4 req-873dc9c0-a173-49e7-9680-02ab2b3930ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:20:19 compute-0 nova_compute[186544]: 2025-11-22 08:20:19.081 186548 DEBUG oslo_concurrency.lockutils [req-05db1714-bf81-45eb-9b54-6c26c7a669d4 req-873dc9c0-a173-49e7-9680-02ab2b3930ab 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:20:21 compute-0 nova_compute[186544]: 2025-11-22 08:20:21.141 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:21 compute-0 nova_compute[186544]: 2025-11-22 08:20:21.266 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:22 compute-0 ovn_controller[94843]: 2025-11-22T08:20:22Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b3:0e:98 10.100.0.5
Nov 22 08:20:22 compute-0 ovn_controller[94843]: 2025-11-22T08:20:22Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:0e:98 10.100.0.5
Nov 22 08:20:26 compute-0 nova_compute[186544]: 2025-11-22 08:20:26.143 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:26 compute-0 nova_compute[186544]: 2025-11-22 08:20:26.269 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:27 compute-0 nova_compute[186544]: 2025-11-22 08:20:27.389 186548 INFO nova.compute.manager [None req-03f5b2e9-0dca-4257-b4b4-caa387e87640 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Get console output
Nov 22 08:20:27 compute-0 podman[241577]: 2025-11-22 08:20:27.418312108 +0000 UTC m=+0.057082321 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:20:27 compute-0 podman[241578]: 2025-11-22 08:20:27.418442801 +0000 UTC m=+0.054421005 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:20:27 compute-0 podman[241579]: 2025-11-22 08:20:27.455975998 +0000 UTC m=+0.088641011 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:20:27 compute-0 podman[241576]: 2025-11-22 08:20:27.45605288 +0000 UTC m=+0.098046323 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 22 08:20:27 compute-0 nova_compute[186544]: 2025-11-22 08:20:27.497 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:20:29 compute-0 nova_compute[186544]: 2025-11-22 08:20:29.149 186548 INFO nova.compute.manager [None req-fefd4774-fbcb-4f49-93a8-09370ca8ec3b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Get console output
Nov 22 08:20:29 compute-0 nova_compute[186544]: 2025-11-22 08:20:29.155 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:20:31 compute-0 nova_compute[186544]: 2025-11-22 08:20:31.145 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:31 compute-0 nova_compute[186544]: 2025-11-22 08:20:31.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:20:31 compute-0 nova_compute[186544]: 2025-11-22 08:20:31.271 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:33 compute-0 nova_compute[186544]: 2025-11-22 08:20:33.497 186548 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:20:33 compute-0 nova_compute[186544]: 2025-11-22 08:20:33.497 186548 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:20:33 compute-0 nova_compute[186544]: 2025-11-22 08:20:33.498 186548 DEBUG nova.network.neutron [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:20:35 compute-0 nova_compute[186544]: 2025-11-22 08:20:35.792 186548 DEBUG nova.network.neutron [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:20:35 compute-0 nova_compute[186544]: 2025-11-22 08:20:35.806 186548 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.146 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.273 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.405 186548 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.405 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Creating file /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/f516ea9e43d440df91e1608f0d4988d1.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.406 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/f516ea9e43d440df91e1608f0d4988d1.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.600 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000098', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '042f6d127720471aaedb8a1fb7535416', 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'hostId': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.632 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.read.bytes volume: 29628928 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.633 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5679f30b-14a7-46f1-9b4a-72ee72fa5f2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29628928, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.601639', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f78316a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': '48487be6f670d8f5df7252ce048285ec321d5048fc1f993afc191b15d3156303'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.601639', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f783f98-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': '5ec9a8c5a45a4a257f2f2df13827000c82a78ef4c5c672a636e0588ba394f80b'}]}, 'timestamp': '2025-11-22 08:20:36.633364', '_unique_id': '1f2299ccd2c7410992d6beb150c70c86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.635 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.638 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 074d9b5a-057a-46af-aea1-0f43e0ac7418 / tapeda1ac92-e1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.639 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.outgoing.packets volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e17e26b9-1788-45ec-9c94-b49caf68c78a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 41, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.635427', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f79473a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': 'a55061c6ac72c9e62ad1dcc08738d363ce9099ad326189d7adb3faca75732629'}]}, 'timestamp': '2025-11-22 08:20:36.640238', '_unique_id': '5e272e4de18749fda868ce5fa4a4776d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.642 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.642 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.write.bytes volume: 72921088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.643 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4194db2-db10-42d5-8ff9-2d6ef98515dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72921088, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.642934', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f79c3b8-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': 'a125a3c4e9e4e5cb5d76d410523decfac50080ece97497f1cab04fdff91fd147'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.642934', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f79d01a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': '4d0bcc07b327f7bf8dea86bacb7df4e6f309d6a68661ccb67752292dde1ead58'}]}, 'timestamp': '2025-11-22 08:20:36.643538', '_unique_id': '794a08852c61483081682e18b07f0121'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.644 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.incoming.bytes volume: 7090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a329075-d347-48b7-8ed7-2386a7406872', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7090, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.644772', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f7a0b70-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': 'd9a5fce126761431f13a40ee0f047f08db653049900d1ea97a348854f133e189'}]}, 'timestamp': '2025-11-22 08:20:36.645064', '_unique_id': 'fe0fbce7f0ec4d2b89b2af019e437df6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.645 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.646 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.667 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/cpu volume: 13150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '134dd15c-4801-4ccf-ac99-5cc80408f01a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13150000000, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'timestamp': '2025-11-22T08:20:36.646146', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1f7d8c64-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.359087318, 'message_signature': '70c762d6fb3b110b9b9d8baa4bd7ef6843cc52183bf3b4a9f9154c2a23d1ad0a'}]}, 'timestamp': '2025-11-22 08:20:36.668117', '_unique_id': '5b3fda7a0489477f92eed4bef7a42aee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.682 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.682 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46a74e89-b67e-4fe9-936e-8707c2ade197', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.670070', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f7fd00a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.361864966, 'message_signature': 'ac4d7a6f8afe64215dc87f9f1b7acea6043e44133e11706f024ec10c65d79ea9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 
'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.670070', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f7fdc3a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.361864966, 'message_signature': '0fdf065ca5c868161c6ca9ea25adb305effe7240e9ea6d17dd3dba1929e30793'}]}, 'timestamp': '2025-11-22 08:20:36.683212', '_unique_id': '778545e919ca49f182a801e5bd006bc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.685 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.685 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-550951011>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-550951011>]
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.685 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.read.latency volume: 739449095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.685 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.read.latency volume: 45118577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de0aaf42-5131-4268-8ce1-2e60c5049df0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 739449095, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.685439', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f803fa4-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': 'ee094f614233aaa1b3eec7816e3e4af2fab9dd57ff3d456b2b51424acd74f25f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45118577, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 
'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.685439', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f804b20-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': '0e46498fd057c793dc11eb47d321b29a9a54ce1511e6335c1a5ac286bd788267'}]}, 'timestamp': '2025-11-22 08:20:36.686051', '_unique_id': 'e104876db9cc4091bc9c575081552246'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.687 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'abc896e1-c759-47e6-9692-0c65bd18e3bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.687599', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f809418-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': '733ea20029a0a6f345b486e0aac391e02fbf99fb2b50b998812d26820c2c6f81'}]}, 'timestamp': '2025-11-22 08:20:36.687931', '_unique_id': '49cd9e5fb2134f238dbbccef6f708c9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.689 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.outgoing.bytes volume: 5314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5c3e14a-c308-48a3-8b29-3cda0c8cf8cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5314, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.689875', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f80efbc-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': 'b74453db0e765a0a9b34613288f64a213175a4a27c5bf17ea829a9841eed8d58'}]}, 'timestamp': '2025-11-22 08:20:36.690335', '_unique_id': '7d823c5701d24012897aec104694e021'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.692 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.692 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95560309-075c-46d2-b87c-b9580159470d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.692110', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f8144bc-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.361864966, 'message_signature': '552b7e134bb3394222e52e7779c132fb3cc75e0e30e18f719518601ad8e5c139'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.692110', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f815042-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.361864966, 'message_signature': '7492fa1053900477b1b36e4d576702369d8c536ed9b24a1f59f5f34b52d1e12b'}]}, 'timestamp': '2025-11-22 08:20:36.692730', '_unique_id': '4986fe6b4d93464ab32bbe496ef4555d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.694 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73a19fc5-13ac-4515-abce-4cb0756e41a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.694205', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f81975a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': 'e1ca9a573f87a9c3c99a10a1c4b2d2932e1866a43580a71d3b36b784cca68f3d'}]}, 'timestamp': '2025-11-22 08:20:36.694566', '_unique_id': '3d381a8480ae496aaa280d545e30bc84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.695 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22025bc9-9897-493d-b855-3ef2bd60fe80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.696031', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f81dd00-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': '8a6d06961c587db9d58afcdf777610ff04877b818eb09a6cc5210a611ead3098'}]}, 'timestamp': '2025-11-22 08:20:36.696369', '_unique_id': '7e44f98c4ed941fa8d758c0f29fa0a4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.697 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2922cd7b-a2d1-436a-b8f3-1279e315e6cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.697806', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f82227e-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': '679af4d3862d76d3e85fe78d1d5bfeafbb026dadbd4bd4b752243aa6d4941f38'}]}, 'timestamp': '2025-11-22 08:20:36.698135', '_unique_id': '5004dae4989449ddac9a1b2d913228fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.699 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.699 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.incoming.packets volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6619425-ae63-4f88-b1bb-915086ea9109', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 41, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.699616', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f826914-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': 'f9b968a720a7f64b71d3d454f26a34e71c48e3b57608db3b8fa3692e995cc2b4'}]}, 'timestamp': '2025-11-22 08:20:36.699936', '_unique_id': '2b68a5656fe24c4aba8625671273fa5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.701 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86fe81c5-6d31-4eda-9f39-1dc1191cfdda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.701621', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f82b748-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': '71b57c90400f77886a94d2b981f9879a49abd177f916ad0f5064068864becb40'}]}, 'timestamp': '2025-11-22 08:20:36.701936', '_unique_id': '74c844a7e978417e910dac14e9322601'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.703 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.703 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.703 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-550951011>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-550951011>]
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.703 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.704 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '287806e6-3c5c-4e06-b86a-69e61cd413db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.703857', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f830ec8-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': 'd5bacabbadc095317ffd03fb5e66b8115c0ebb8ef36cadfb75e6a0cc2bf0e9f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 
'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.703857', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f831aa8-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': 'ff6fdef2155dadf87059708748c22357ca715b9611dad315ee71f60c4941ffcd'}]}, 'timestamp': '2025-11-22 08:20:36.704465', '_unique_id': '8cd9269f2cf5477e953ecfed29cf5676'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.705 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.706 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '771729df-03d0-47ba-ac03-dc67312e2bc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.705924', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f835f54-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.361864966, 'message_signature': '96c932d37e364ec1175257943049fbbb8cffa568a02581338680cca99526d642'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 
'074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.705924', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f836af8-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.361864966, 'message_signature': 'cc957a8fc73287085af3de2a5666ed6dc5205c0a995bf3e4ff7dd1801cf094d8'}]}, 'timestamp': '2025-11-22 08:20:36.706520', '_unique_id': '3120e3804c3e4c48a7bc56e02afa8cd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.707 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11b9e6a5-1cc1-4ea4-b28d-1f5f04fa586a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-00000098-074d9b5a-057a-46af-aea1-0f43e0ac7418-tapeda1ac92-e1', 'timestamp': '2025-11-22T08:20:36.707994', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'tapeda1ac92-e1', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:0e:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapeda1ac92-e1'}, 'message_id': '1f83b08a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.327241852, 'message_signature': '57d95f0eac3ad3e0901ab8bda7795bd215d6f9d507ddfa491e6a199b474dd17b'}]}, 'timestamp': '2025-11-22 08:20:36.708339', '_unique_id': '89a2a6368a7241f6ae7d89fe054004b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.709 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.709 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.709 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-550951011>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-550951011>]
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.710 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.710 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/memory.usage volume: 42.8046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c778b0c8-1934-4cf0-b9e2-58b112cb6b7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.8046875, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'timestamp': '2025-11-22T08:20:36.710193', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '1f840760-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.359087318, 'message_signature': 'd2b5ab7c4674f012815d6578c8e04eadef3a33a839b9ea5aafb7cf9b740f9122'}]}, 'timestamp': '2025-11-22 08:20:36.710528', '_unique_id': 'aaca61affa21469cab04c977f3db7ff6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.712 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.read.requests volume: 1068 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.712 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bfacef3-cf23-464c-995c-594f28458705', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1068, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.712003', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f844cca-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': '17f6396c6b06d7704b38d111260d865dd60716ff0dc23646ae80d8b436dd42c0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 
'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.712003', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f8458a0-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': '1df44d95f1151ff286ed1184cd3a45e528975dadee9c1c674d34e3de7c53218d'}]}, 'timestamp': '2025-11-22 08:20:36.712606', '_unique_id': '29903eb01ddb452ab0e30602bac5c09a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.713 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.714 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.714 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-550951011>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-550951011>]
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.714 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.write.latency volume: 4174604779 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.714 12 DEBUG ceilometer.compute.pollsters [-] 074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64b82d78-04c7-48d9-bf1f-67f722f7a964', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4174604779, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-vda', 'timestamp': '2025-11-22T08:20:36.714664', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1f84b502-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': '0354edf4e243379dbaeccb7d07d73afb22c5c3e16360b2d9f130673bf1513f67'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': 
None, 'resource_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418-sda', 'timestamp': '2025-11-22T08:20:36.714664', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-550951011', 'name': 'instance-00000098', 'instance_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1f84c06a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6377.293427257, 'message_signature': '9147add5dd441496868f8de7e10c4732fb72615c615aa9839cfa913b3a163aa4'}]}, 'timestamp': '2025-11-22 08:20:36.715262', '_unique_id': 'd26fdde875e44998a8b534880bedcbee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:20:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:20:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.888 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/f516ea9e43d440df91e1608f0d4988d1.tmp" returned: 1 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.889 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/f516ea9e43d440df91e1608f0d4988d1.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.889 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Creating directory /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 22 08:20:36 compute-0 nova_compute[186544]: 2025-11-22 08:20:36.890 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:37 compute-0 nova_compute[186544]: 2025-11-22 08:20:37.099 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:37 compute-0 nova_compute[186544]: 2025-11-22 08:20:37.103 186548 DEBUG nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:20:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:37.346 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:37.347 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:37.347 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:38 compute-0 podman[241662]: 2025-11-22 08:20:38.403791644 +0000 UTC m=+0.053293967 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:20:39 compute-0 kernel: tapeda1ac92-e1 (unregistering): left promiscuous mode
Nov 22 08:20:39 compute-0 NetworkManager[55036]: <info>  [1763799639.4082] device (tapeda1ac92-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:20:39 compute-0 nova_compute[186544]: 2025-11-22 08:20:39.415 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:39 compute-0 ovn_controller[94843]: 2025-11-22T08:20:39Z|00696|binding|INFO|Releasing lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 from this chassis (sb_readonly=0)
Nov 22 08:20:39 compute-0 ovn_controller[94843]: 2025-11-22T08:20:39Z|00697|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 down in Southbound
Nov 22 08:20:39 compute-0 ovn_controller[94843]: 2025-11-22T08:20:39Z|00698|binding|INFO|Removing iface tapeda1ac92-e1 ovn-installed in OVS
Nov 22 08:20:39 compute-0 nova_compute[186544]: 2025-11-22 08:20:39.432 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:39 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000098.scope: Deactivated successfully.
Nov 22 08:20:39 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000098.scope: Consumed 15.236s CPU time.
Nov 22 08:20:39 compute-0 systemd-machined[152872]: Machine qemu-80-instance-00000098 terminated.
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.561 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:0e:98 10.100.0.5'], port_security=['fa:16:3e:b3:0e:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a792325d-0f21-49cc-9e79-20df696791a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46980f3d-5c4c-4eaa-bdbc-e9dfef13e740, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=eda1ac92-e156-463f-9f90-8fdd14f55dc0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.563 103805 INFO neutron.agent.ovn.metadata.agent [-] Port eda1ac92-e156-463f-9f90-8fdd14f55dc0 in datapath 52d2fbd4-6713-49c3-93b1-794bccb91cb5 unbound from our chassis
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.564 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 52d2fbd4-6713-49c3-93b1-794bccb91cb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.565 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f806d76a-f446-48d4-b9fc-fb522b41001f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.566 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 namespace which is not needed anymore
Nov 22 08:20:39 compute-0 nova_compute[186544]: 2025-11-22 08:20:39.637 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:39 compute-0 nova_compute[186544]: 2025-11-22 08:20:39.641 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:39 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241498]: [NOTICE]   (241502) : haproxy version is 2.8.14-c23fe91
Nov 22 08:20:39 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241498]: [NOTICE]   (241502) : path to executable is /usr/sbin/haproxy
Nov 22 08:20:39 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241498]: [WARNING]  (241502) : Exiting Master process...
Nov 22 08:20:39 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241498]: [ALERT]    (241502) : Current worker (241504) exited with code 143 (Terminated)
Nov 22 08:20:39 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241498]: [WARNING]  (241502) : All workers exited. Exiting... (0)
Nov 22 08:20:39 compute-0 systemd[1]: libpod-e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1.scope: Deactivated successfully.
Nov 22 08:20:39 compute-0 podman[241709]: 2025-11-22 08:20:39.700188095 +0000 UTC m=+0.051147775 container died e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:20:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1-userdata-shm.mount: Deactivated successfully.
Nov 22 08:20:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-01e62d23a7f3aa8cf3b1013695e56d174d8ec5e3c71da273d3f0ed8784d3b81d-merged.mount: Deactivated successfully.
Nov 22 08:20:39 compute-0 podman[241709]: 2025-11-22 08:20:39.775569696 +0000 UTC m=+0.126529376 container cleanup e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 08:20:39 compute-0 systemd[1]: libpod-conmon-e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1.scope: Deactivated successfully.
Nov 22 08:20:39 compute-0 podman[241752]: 2025-11-22 08:20:39.845760919 +0000 UTC m=+0.049241047 container remove e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.851 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e5aa1cd2-48e8-481f-9be3-1122555258b3]: (4, ('Sat Nov 22 08:20:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 (e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1)\ne2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1\nSat Nov 22 08:20:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 (e2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1)\ne2157db018632e27f5de29f54f6e36e8c8fbc379bafb77e1adde5e4a4e0235c1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.853 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[87f80216-5fe3-4a36-98e1-0c140040d400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.854 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52d2fbd4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:39 compute-0 nova_compute[186544]: 2025-11-22 08:20:39.856 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:39 compute-0 kernel: tap52d2fbd4-60: left promiscuous mode
Nov 22 08:20:39 compute-0 nova_compute[186544]: 2025-11-22 08:20:39.874 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.877 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd5cb79-4c63-4e81-8e52-d1fc238f40f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.893 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0f68eabd-aa2d-4b59-a2ed-4af4bbdd9c7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.895 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[44a9fc76-d525-4087-844b-bb2e1a9958c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.910 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1d903c47-77f3-417c-b3ff-0e6aad3cccf2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634860, 'reachable_time': 28750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241772, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.913 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:20:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:39.913 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[2f3220e2-153b-4153-91c3-33568dbc9ae9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d52d2fbd4\x2d6713\x2d49c3\x2d93b1\x2d794bccb91cb5.mount: Deactivated successfully.
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.122 186548 INFO nova.virt.libvirt.driver [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance shutdown successfully after 3 seconds.
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.128 186548 INFO nova.virt.libvirt.driver [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance destroyed successfully.
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.129 186548 DEBUG nova.virt.libvirt.vif [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:20:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:20:32Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1623348371", "vif_mac": "fa:16:3e:b3:0e:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.130 186548 DEBUG nova.network.os_vif_util [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1623348371", "vif_mac": "fa:16:3e:b3:0e:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.131 186548 DEBUG nova.network.os_vif_util [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.131 186548 DEBUG os_vif [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.133 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.134 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeda1ac92-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.136 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.138 186548 INFO os_vif [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1')
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.143 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.243 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.244 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.299 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.301 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Copying file /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk to 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.301 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.389 186548 DEBUG nova.compute.manager [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.390 186548 DEBUG oslo_concurrency.lockutils [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.390 186548 DEBUG oslo_concurrency.lockutils [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.390 186548 DEBUG oslo_concurrency.lockutils [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.390 186548 DEBUG nova.compute.manager [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:20:40 compute-0 nova_compute[186544]: 2025-11-22 08:20:40.391 186548 WARNING nova.compute.manager [req-9ef239b7-1f80-4e4b-84ef-d3ec602bcec3 req-921b685d-63ce-4e09-bfb1-702ef57488d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state active and task_state resize_migrating.
Nov 22 08:20:41 compute-0 nova_compute[186544]: 2025-11-22 08:20:41.148 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:41 compute-0 nova_compute[186544]: 2025-11-22 08:20:41.235 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "scp -r /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk" returned: 0 in 0.934s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:41 compute-0 nova_compute[186544]: 2025-11-22 08:20:41.236 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Copying file /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 08:20:41 compute-0 nova_compute[186544]: 2025-11-22 08:20:41.237 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk.config 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:41 compute-0 podman[241783]: 2025-11-22 08:20:41.402064438 +0000 UTC m=+0.049042622 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:20:41 compute-0 podman[241784]: 2025-11-22 08:20:41.415136001 +0000 UTC m=+0.057024829 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, version=9.6)
Nov 22 08:20:41 compute-0 nova_compute[186544]: 2025-11-22 08:20:41.489 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "scp -C -r /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk.config 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:41 compute-0 nova_compute[186544]: 2025-11-22 08:20:41.490 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Copying file /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 08:20:41 compute-0 nova_compute[186544]: 2025-11-22 08:20:41.490 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk.info 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:41 compute-0 nova_compute[186544]: 2025-11-22 08:20:41.718 186548 DEBUG oslo_concurrency.processutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "scp -C -r /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_resize/disk.info 192.168.122.102:/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" returned: 0 in 0.229s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.100 186548 DEBUG neutronclient.v2_0.client [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port eda1ac92-e156-463f-9f90-8fdd14f55dc0 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.512 186548 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.512 186548 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.512 186548 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.513 186548 DEBUG oslo_concurrency.lockutils [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.513 186548 DEBUG nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.513 186548 WARNING nova.compute.manager [req-5d5e447b-cf5c-478f-b2ea-1df5721cd8b6 req-357ca4d4-acd8-4aba-872b-8e390feb4c16 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state active and task_state resize_migrated.
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.569 186548 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.569 186548 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:42 compute-0 nova_compute[186544]: 2025-11-22 08:20:42.569 186548 DEBUG oslo_concurrency.lockutils [None req-f9aa93ff-e01d-48a9-ab2f-ece83194b9ce 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:45 compute-0 nova_compute[186544]: 2025-11-22 08:20:45.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:45 compute-0 nova_compute[186544]: 2025-11-22 08:20:45.147 186548 DEBUG nova.compute.manager [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:45 compute-0 nova_compute[186544]: 2025-11-22 08:20:45.147 186548 DEBUG nova.compute.manager [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing instance network info cache due to event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:20:45 compute-0 nova_compute[186544]: 2025-11-22 08:20:45.148 186548 DEBUG oslo_concurrency.lockutils [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:20:45 compute-0 nova_compute[186544]: 2025-11-22 08:20:45.148 186548 DEBUG oslo_concurrency.lockutils [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:20:45 compute-0 nova_compute[186544]: 2025-11-22 08:20:45.148 186548 DEBUG nova.network.neutron [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:20:46 compute-0 nova_compute[186544]: 2025-11-22 08:20:46.150 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:46 compute-0 nova_compute[186544]: 2025-11-22 08:20:46.576 186548 DEBUG nova.compute.manager [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:46 compute-0 nova_compute[186544]: 2025-11-22 08:20:46.577 186548 DEBUG oslo_concurrency.lockutils [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:46 compute-0 nova_compute[186544]: 2025-11-22 08:20:46.577 186548 DEBUG oslo_concurrency.lockutils [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:46 compute-0 nova_compute[186544]: 2025-11-22 08:20:46.577 186548 DEBUG oslo_concurrency.lockutils [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:46 compute-0 nova_compute[186544]: 2025-11-22 08:20:46.578 186548 DEBUG nova.compute.manager [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:20:46 compute-0 nova_compute[186544]: 2025-11-22 08:20:46.578 186548 WARNING nova.compute.manager [req-c6d0dffd-592b-4b87-bee7-cf82aef35a1a req-fc6ab0ca-0b00-40b0-8f33-2a070950a47d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state active and task_state resize_finish.
Nov 22 08:20:47 compute-0 nova_compute[186544]: 2025-11-22 08:20:47.547 186548 DEBUG nova.network.neutron [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updated VIF entry in instance network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:20:47 compute-0 nova_compute[186544]: 2025-11-22 08:20:47.548 186548 DEBUG nova.network.neutron [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:20:47 compute-0 nova_compute[186544]: 2025-11-22 08:20:47.566 186548 DEBUG oslo_concurrency.lockutils [req-7e254bee-6a37-46bc-816a-f0ea3db991e8 req-42ae5085-aac8-4564-a6bb-0818e948cb0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:20:48 compute-0 nova_compute[186544]: 2025-11-22 08:20:48.653 186548 DEBUG nova.compute.manager [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:48 compute-0 nova_compute[186544]: 2025-11-22 08:20:48.654 186548 DEBUG oslo_concurrency.lockutils [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:48 compute-0 nova_compute[186544]: 2025-11-22 08:20:48.654 186548 DEBUG oslo_concurrency.lockutils [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:48 compute-0 nova_compute[186544]: 2025-11-22 08:20:48.655 186548 DEBUG oslo_concurrency.lockutils [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:48 compute-0 nova_compute[186544]: 2025-11-22 08:20:48.655 186548 DEBUG nova.compute.manager [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:20:48 compute-0 nova_compute[186544]: 2025-11-22 08:20:48.656 186548 WARNING nova.compute.manager [req-5b244a7f-c51c-45bd-b354-6e8695bb19ac req-e7834ce9-2085-4f98-8fcc-c2cc3e503dc7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state resized and task_state resize_reverting.
Nov 22 08:20:50 compute-0 nova_compute[186544]: 2025-11-22 08:20:50.136 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:51 compute-0 nova_compute[186544]: 2025-11-22 08:20:51.152 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:51 compute-0 nova_compute[186544]: 2025-11-22 08:20:51.965 186548 DEBUG nova.compute.manager [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:51 compute-0 nova_compute[186544]: 2025-11-22 08:20:51.966 186548 DEBUG oslo_concurrency.lockutils [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:51 compute-0 nova_compute[186544]: 2025-11-22 08:20:51.966 186548 DEBUG oslo_concurrency.lockutils [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:51 compute-0 nova_compute[186544]: 2025-11-22 08:20:51.966 186548 DEBUG oslo_concurrency.lockutils [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:51 compute-0 nova_compute[186544]: 2025-11-22 08:20:51.967 186548 DEBUG nova.compute.manager [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:20:51 compute-0 nova_compute[186544]: 2025-11-22 08:20:51.967 186548 WARNING nova.compute.manager [req-5a50a5bc-e546-4020-ab7e-211682292838 req-af80a0dd-2df1-4af3-8d8a-044c6f824dff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state resized and task_state resize_reverting.
Nov 22 08:20:52 compute-0 nova_compute[186544]: 2025-11-22 08:20:52.420 186548 INFO nova.compute.manager [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Swapping old allocation on dict_keys(['0a011418-630a-4be8-ab23-41ec1c11a5ea']) held by migration e34e02d3-9f5b-48e1-82d2-ca0d3bbe1c70 for instance
Nov 22 08:20:52 compute-0 nova_compute[186544]: 2025-11-22 08:20:52.498 186548 DEBUG nova.scheduler.client.report [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Overwriting current allocation {'allocations': {'1afd6948-7df7-46e7-8718-35e2b3007a5d': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, 'generation': 81}}, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'consumer_generation': 1} on consumer 074d9b5a-057a-46af-aea1-0f43e0ac7418 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Nov 22 08:20:52 compute-0 nova_compute[186544]: 2025-11-22 08:20:52.743 186548 INFO nova.network.neutron [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating port eda1ac92-e156-463f-9f90-8fdd14f55dc0 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 22 08:20:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:53.542 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:20:53 compute-0 nova_compute[186544]: 2025-11-22 08:20:53.542 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:53.544 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:20:53 compute-0 nova_compute[186544]: 2025-11-22 08:20:53.936 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:20:53 compute-0 nova_compute[186544]: 2025-11-22 08:20:53.937 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:20:53 compute-0 nova_compute[186544]: 2025-11-22 08:20:53.938 186548 DEBUG nova.network.neutron [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.061 186548 DEBUG nova.compute.manager [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.062 186548 DEBUG nova.compute.manager [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing instance network info cache due to event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.062 186548 DEBUG oslo_concurrency.lockutils [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.070 186548 DEBUG nova.compute.manager [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.070 186548 DEBUG oslo_concurrency.lockutils [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.071 186548 DEBUG oslo_concurrency.lockutils [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.071 186548 DEBUG oslo_concurrency.lockutils [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.072 186548 DEBUG nova.compute.manager [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.072 186548 WARNING nova.compute.manager [req-ab463557-e0cd-45c8-94a6-1e8f43ef512f req-c4ce8f10-9b00-4c26-a2db-b2707071445a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state resized and task_state resize_reverting.
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.681 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799639.6800845, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.682 186548 INFO nova.compute.manager [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Stopped (Lifecycle Event)
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.699 186548 DEBUG nova.compute.manager [None req-9ab23616-84f7-4ad3-b3c3-0adffb6eaefe - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.703 186548 DEBUG nova.compute.manager [None req-9ab23616-84f7-4ad3-b3c3-0adffb6eaefe - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:20:54 compute-0 nova_compute[186544]: 2025-11-22 08:20:54.722 186548 INFO nova.compute.manager [None req-9ab23616-84f7-4ad3-b3c3-0adffb6eaefe - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.138 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.816 186548 DEBUG nova.network.neutron [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.829 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.830 186548 DEBUG nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.832 186548 DEBUG oslo_concurrency.lockutils [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.833 186548 DEBUG nova.network.neutron [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.839 186548 DEBUG nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Start _get_guest_xml network_info=[{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.848 186548 WARNING nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.854 186548 DEBUG nova.virt.libvirt.host [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.855 186548 DEBUG nova.virt.libvirt.host [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.858 186548 DEBUG nova.virt.libvirt.host [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.858 186548 DEBUG nova.virt.libvirt.host [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.859 186548 DEBUG nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.859 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.860 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.860 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.860 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.861 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.861 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.861 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.861 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.862 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.862 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.862 186548 DEBUG nova.virt.hardware [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.862 186548 DEBUG nova.objects.instance [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.880 186548 DEBUG oslo_concurrency.processutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.977 186548 DEBUG oslo_concurrency.processutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.978 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.978 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.979 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.980 186548 DEBUG nova.virt.libvirt.vif [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:20:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:20:52Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.980 186548 DEBUG nova.network.os_vif_util [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.981 186548 DEBUG nova.network.os_vif_util [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.984 186548 DEBUG nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <uuid>074d9b5a-057a-46af-aea1-0f43e0ac7418</uuid>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <name>instance-00000098</name>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-550951011</nova:name>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:20:55</nova:creationTime>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:20:55 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:20:55 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:20:55 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:20:55 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:20:55 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:20:55 compute-0 nova_compute[186544]:         <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 08:20:55 compute-0 nova_compute[186544]:         <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:20:55 compute-0 nova_compute[186544]:         <nova:port uuid="eda1ac92-e156-463f-9f90-8fdd14f55dc0">
Nov 22 08:20:55 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <system>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <entry name="serial">074d9b5a-057a-46af-aea1-0f43e0ac7418</entry>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <entry name="uuid">074d9b5a-057a-46af-aea1-0f43e0ac7418</entry>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </system>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <os>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   </os>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <features>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   </features>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk.config"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:b3:0e:98"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <target dev="tapeda1ac92-e1"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/console.log" append="off"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <video>
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </video>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <input type="keyboard" bus="usb"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:20:55 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:20:55 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:20:55 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:20:55 compute-0 nova_compute[186544]: </domain>
Nov 22 08:20:55 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.985 186548 DEBUG nova.compute.manager [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Preparing to wait for external event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.985 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.985 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.985 186548 DEBUG oslo_concurrency.lockutils [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.986 186548 DEBUG nova.virt.libvirt.vif [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:20:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:20:52Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.986 186548 DEBUG nova.network.os_vif_util [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.987 186548 DEBUG nova.network.os_vif_util [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.987 186548 DEBUG os_vif [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.988 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.988 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.988 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.995 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeda1ac92-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.995 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeda1ac92-e1, col_values=(('external_ids', {'iface-id': 'eda1ac92-e156-463f-9f90-8fdd14f55dc0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b3:0e:98', 'vm-uuid': '074d9b5a-057a-46af-aea1-0f43e0ac7418'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:55 compute-0 NetworkManager[55036]: <info>  [1763799655.9987] manager: (tapeda1ac92-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Nov 22 08:20:55 compute-0 nova_compute[186544]: 2025-11-22 08:20:55.999 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.006 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.007 186548 INFO os_vif [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1')
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 kernel: tapeda1ac92-e1: entered promiscuous mode
Nov 22 08:20:56 compute-0 NetworkManager[55036]: <info>  [1763799656.1724] manager: (tapeda1ac92-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/321)
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.173 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 ovn_controller[94843]: 2025-11-22T08:20:56Z|00699|binding|INFO|Claiming lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 for this chassis.
Nov 22 08:20:56 compute-0 ovn_controller[94843]: 2025-11-22T08:20:56Z|00700|binding|INFO|eda1ac92-e156-463f-9f90-8fdd14f55dc0: Claiming fa:16:3e:b3:0e:98 10.100.0.5
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.183 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:0e:98 10.100.0.5'], port_security=['fa:16:3e:b3:0e:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a792325d-0f21-49cc-9e79-20df696791a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46980f3d-5c4c-4eaa-bdbc-e9dfef13e740, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=eda1ac92-e156-463f-9f90-8fdd14f55dc0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.184 103805 INFO neutron.agent.ovn.metadata.agent [-] Port eda1ac92-e156-463f-9f90-8fdd14f55dc0 in datapath 52d2fbd4-6713-49c3-93b1-794bccb91cb5 bound to our chassis
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.185 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 52d2fbd4-6713-49c3-93b1-794bccb91cb5
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.188 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 ovn_controller[94843]: 2025-11-22T08:20:56Z|00701|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 ovn-installed in OVS
Nov 22 08:20:56 compute-0 ovn_controller[94843]: 2025-11-22T08:20:56Z|00702|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 up in Southbound
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.195 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.198 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[920c1fe8-fcd3-4388-a253-23ad4729ded9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.199 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap52d2fbd4-61 in ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.200 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap52d2fbd4-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.201 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7616e031-81f8-4ae0-9868-a19e0f5ad4a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.202 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4b4058-7ea8-474f-aa11-55f19349277f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.212 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d7c889-5a00-4941-a306-29ab8225c109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 systemd-machined[152872]: New machine qemu-81-instance-00000098.
Nov 22 08:20:56 compute-0 systemd[1]: Started Virtual Machine qemu-81-instance-00000098.
Nov 22 08:20:56 compute-0 systemd-udevd[241854]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.235 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[99f5c4ea-5482-4c40-a7d2-ad552cf5bae8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 NetworkManager[55036]: <info>  [1763799656.2492] device (tapeda1ac92-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:20:56 compute-0 NetworkManager[55036]: <info>  [1763799656.2499] device (tapeda1ac92-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.265 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[14fcea8f-0a61-4e72-8763-8823dcc7aec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.271 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b9dfae-8ff7-4327-8c97-1d6b2268da64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 NetworkManager[55036]: <info>  [1763799656.2727] manager: (tap52d2fbd4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/322)
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.302 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b44dcd5b-3507-4f73-b766-84566bf32fba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.306 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[59e71215-d35e-49b2-9c4c-65677434e10c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 NetworkManager[55036]: <info>  [1763799656.3292] device (tap52d2fbd4-60): carrier: link connected
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.335 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[93283fec-577c-40c6-a4b9-4850d4005a25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.358 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[911fe273-6073-414f-825b-d4682bdcf91d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52d2fbd4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:e8:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639696, 'reachable_time': 27363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241884, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.374 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a44254-6e16-4e99-8c07-45ab8a169238]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:e832'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639696, 'tstamp': 639696}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241885, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.390 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a74d0796-7cfd-4844-b479-b9935d01af20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52d2fbd4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:e8:32'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639696, 'reachable_time': 27363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241886, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.436 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[914dbe3e-7cdd-40da-a384-94c798fc8b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.494 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[56eca037-41de-4040-9bb5-4d83a1cb8b97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.496 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52d2fbd4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.496 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.497 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52d2fbd4-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.499 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 NetworkManager[55036]: <info>  [1763799656.5005] manager: (tap52d2fbd4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Nov 22 08:20:56 compute-0 kernel: tap52d2fbd4-60: entered promiscuous mode
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.505 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap52d2fbd4-60, col_values=(('external_ids', {'iface-id': 'a25db17f-5074-4e3e-9504-ee30cd3f6d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:20:56 compute-0 ovn_controller[94843]: 2025-11-22T08:20:56Z|00703|binding|INFO|Releasing lport a25db17f-5074-4e3e-9504-ee30cd3f6d4c from this chassis (sb_readonly=0)
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.507 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.509 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.510 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.511 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ecbb09-4460-4994-a105-783fcf4ed5e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.512 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-52d2fbd4-6713-49c3-93b1-794bccb91cb5
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/52d2fbd4-6713-49c3-93b1-794bccb91cb5.pid.haproxy
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 52d2fbd4-6713-49c3-93b1-794bccb91cb5
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:20:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:20:56.513 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'env', 'PROCESS_TAG=haproxy-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/52d2fbd4-6713-49c3-93b1-794bccb91cb5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.531 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 ovn_controller[94843]: 2025-11-22T08:20:56Z|00704|binding|INFO|Releasing lport a25db17f-5074-4e3e-9504-ee30cd3f6d4c from this chassis (sb_readonly=0)
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.740 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799656.7401638, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.741 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Started (Lifecycle Event)
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.761 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.770 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.774 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799656.7403772, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.775 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Paused (Lifecycle Event)
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.797 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.801 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.820 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.938 186548 DEBUG nova.compute.manager [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.939 186548 DEBUG oslo_concurrency.lockutils [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.940 186548 DEBUG oslo_concurrency.lockutils [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.941 186548 DEBUG oslo_concurrency.lockutils [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.941 186548 DEBUG nova.compute.manager [req-cbade4a8-df1b-4982-b491-80160efeb6b6 req-b5379d88-a065-413c-a539-81243a83340c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Processing event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.943 186548 DEBUG nova.compute.manager [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.946 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799656.9464755, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.947 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Resumed (Lifecycle Event)
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.951 186548 INFO nova.virt.libvirt.driver [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance running successfully.
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.951 186548 DEBUG nova.virt.libvirt.driver [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.983 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:20:56 compute-0 nova_compute[186544]: 2025-11-22 08:20:56.993 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:20:57 compute-0 podman[241925]: 2025-11-22 08:20:56.945930231 +0000 UTC m=+0.022416004 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:20:57 compute-0 nova_compute[186544]: 2025-11-22 08:20:57.048 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Nov 22 08:20:57 compute-0 nova_compute[186544]: 2025-11-22 08:20:57.079 186548 INFO nova.compute.manager [None req-46c9dc3d-530b-457d-8647-de8e4c6bbc5b d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance to original state: 'active'
Nov 22 08:20:57 compute-0 podman[241925]: 2025-11-22 08:20:57.28031976 +0000 UTC m=+0.356805533 container create c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:20:57 compute-0 nova_compute[186544]: 2025-11-22 08:20:57.351 186548 DEBUG nova.network.neutron [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updated VIF entry in instance network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:20:57 compute-0 nova_compute[186544]: 2025-11-22 08:20:57.352 186548 DEBUG nova.network.neutron [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:20:57 compute-0 nova_compute[186544]: 2025-11-22 08:20:57.367 186548 DEBUG oslo_concurrency.lockutils [req-aaec3607-0057-4472-9625-1866ac9dfbad req-7d2bb156-e1b7-43a0-9fb2-441e715bbfd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:20:57 compute-0 systemd[1]: Started libpod-conmon-c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9.scope.
Nov 22 08:20:57 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:20:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a94ea10fa7b7e3ab5ee1514a82b2c981548f339376e90ac6271c8e591c78578/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:20:57 compute-0 podman[241925]: 2025-11-22 08:20:57.483995711 +0000 UTC m=+0.560481594 container init c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:20:57 compute-0 podman[241925]: 2025-11-22 08:20:57.4969429 +0000 UTC m=+0.573428703 container start c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 08:20:57 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241940]: [NOTICE]   (241945) : New worker (241947) forked
Nov 22 08:20:57 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241940]: [NOTICE]   (241945) : Loading success.
Nov 22 08:20:58 compute-0 podman[241956]: 2025-11-22 08:20:58.407846118 +0000 UTC m=+0.056406963 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 08:20:58 compute-0 podman[241957]: 2025-11-22 08:20:58.41965146 +0000 UTC m=+0.059586962 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:20:58 compute-0 podman[241958]: 2025-11-22 08:20:58.425244208 +0000 UTC m=+0.062992207 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:20:58 compute-0 podman[241959]: 2025-11-22 08:20:58.453842465 +0000 UTC m=+0.089852421 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:20:59 compute-0 nova_compute[186544]: 2025-11-22 08:20:59.030 186548 DEBUG nova.compute.manager [req-01b6b92e-1178-48b3-ad11-8340c3e2a26c req-a327073b-2d7c-4608-89e7-7091856762af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:20:59 compute-0 nova_compute[186544]: 2025-11-22 08:20:59.031 186548 DEBUG oslo_concurrency.lockutils [req-01b6b92e-1178-48b3-ad11-8340c3e2a26c req-a327073b-2d7c-4608-89e7-7091856762af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:20:59 compute-0 nova_compute[186544]: 2025-11-22 08:20:59.031 186548 DEBUG oslo_concurrency.lockutils [req-01b6b92e-1178-48b3-ad11-8340c3e2a26c req-a327073b-2d7c-4608-89e7-7091856762af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:20:59 compute-0 nova_compute[186544]: 2025-11-22 08:20:59.031 186548 DEBUG oslo_concurrency.lockutils [req-01b6b92e-1178-48b3-ad11-8340c3e2a26c req-a327073b-2d7c-4608-89e7-7091856762af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:20:59 compute-0 nova_compute[186544]: 2025-11-22 08:20:59.031 186548 DEBUG nova.compute.manager [req-01b6b92e-1178-48b3-ad11-8340c3e2a26c req-a327073b-2d7c-4608-89e7-7091856762af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:20:59 compute-0 nova_compute[186544]: 2025-11-22 08:20:59.032 186548 WARNING nova.compute.manager [req-01b6b92e-1178-48b3-ad11-8340c3e2a26c req-a327073b-2d7c-4608-89e7-7091856762af 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state active and task_state None.
Nov 22 08:21:00 compute-0 nova_compute[186544]: 2025-11-22 08:21:00.998 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:01 compute-0 nova_compute[186544]: 2025-11-22 08:21:01.156 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:01.547 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.167 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.198 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.199 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.199 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.199 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.264 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.321 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.322 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.378 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.415 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.558 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.560 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5518MB free_disk=73.10772705078125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.560 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.560 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.656 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 074d9b5a-057a-46af-aea1-0f43e0ac7418 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.657 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.657 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.678 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.699 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.700 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.713 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.734 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.774 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.787 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.805 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:21:04 compute-0 nova_compute[186544]: 2025-11-22 08:21:04.805 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:06 compute-0 nova_compute[186544]: 2025-11-22 08:21:06.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:06 compute-0 nova_compute[186544]: 2025-11-22 08:21:06.159 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:07 compute-0 nova_compute[186544]: 2025-11-22 08:21:07.395 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:07 compute-0 nova_compute[186544]: 2025-11-22 08:21:07.804 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:07 compute-0 nova_compute[186544]: 2025-11-22 08:21:07.804 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:21:07 compute-0 nova_compute[186544]: 2025-11-22 08:21:07.804 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:21:08 compute-0 nova_compute[186544]: 2025-11-22 08:21:08.062 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:21:08 compute-0 nova_compute[186544]: 2025-11-22 08:21:08.063 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:21:08 compute-0 nova_compute[186544]: 2025-11-22 08:21:08.064 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:21:08 compute-0 nova_compute[186544]: 2025-11-22 08:21:08.065 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:21:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:09.194 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8::f816:3eff:febd:96a1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=397a3db2-78b9-4182-b3e5-f29d5ae58cda) old=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:21:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:09.195 103805 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 397a3db2-78b9-4182-b3e5-f29d5ae58cda in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 updated
Nov 22 08:21:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:09.196 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b35c418-bf90-4666-a674-9b7153e90ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:21:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:09.198 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1693b31b-9e24-48a8-87e3-14dde1d9eb96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:09 compute-0 podman[242048]: 2025-11-22 08:21:09.443949096 +0000 UTC m=+0.084714964 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 08:21:09 compute-0 nova_compute[186544]: 2025-11-22 08:21:09.596 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:21:09 compute-0 nova_compute[186544]: 2025-11-22 08:21:09.614 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:21:09 compute-0 nova_compute[186544]: 2025-11-22 08:21:09.614 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:21:09 compute-0 nova_compute[186544]: 2025-11-22 08:21:09.615 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:09 compute-0 nova_compute[186544]: 2025-11-22 08:21:09.615 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:09 compute-0 nova_compute[186544]: 2025-11-22 08:21:09.615 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:21:10 compute-0 nova_compute[186544]: 2025-11-22 08:21:10.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:10 compute-0 nova_compute[186544]: 2025-11-22 08:21:10.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:11 compute-0 nova_compute[186544]: 2025-11-22 08:21:11.006 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:11 compute-0 ovn_controller[94843]: 2025-11-22T08:21:11Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b3:0e:98 10.100.0.5
Nov 22 08:21:11 compute-0 nova_compute[186544]: 2025-11-22 08:21:11.161 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:12 compute-0 podman[242076]: 2025-11-22 08:21:12.449805546 +0000 UTC m=+0.086929678 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:21:12 compute-0 podman[242077]: 2025-11-22 08:21:12.464200972 +0000 UTC m=+0.095591702 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-type=git, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Nov 22 08:21:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:12.685 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8:0:1:f816:3eff:febd:96a1 2001:db8::f816:3eff:febd:96a1'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:febd:96a1/64 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=397a3db2-78b9-4182-b3e5-f29d5ae58cda) old=Port_Binding(mac=['fa:16:3e:bd:96:a1 10.100.0.2 2001:db8::f816:3eff:febd:96a1'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:febd:96a1/64', 'neutron:device_id': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:21:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:12.687 103805 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 397a3db2-78b9-4182-b3e5-f29d5ae58cda in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 updated
Nov 22 08:21:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:12.690 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b35c418-bf90-4666-a674-9b7153e90ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:21:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:12.691 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe05ab7-3a12-4fe6-804f-22de56ee9016]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:15 compute-0 nova_compute[186544]: 2025-11-22 08:21:15.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:16 compute-0 nova_compute[186544]: 2025-11-22 08:21:16.008 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:16 compute-0 nova_compute[186544]: 2025-11-22 08:21:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:16 compute-0 nova_compute[186544]: 2025-11-22 08:21:16.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:17 compute-0 nova_compute[186544]: 2025-11-22 08:21:17.384 186548 INFO nova.compute.manager [None req-d5e418f2-563e-4bb5-aa0a-db59e62556bf d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Get console output
Nov 22 08:21:17 compute-0 nova_compute[186544]: 2025-11-22 08:21:17.389 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.555 186548 DEBUG nova.compute.manager [req-fb3c4088-e747-4132-b2ef-5280fb1835aa req-6780d2f9-87cf-4b58-b720-0f1d402e39de 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.555 186548 DEBUG nova.compute.manager [req-fb3c4088-e747-4132-b2ef-5280fb1835aa req-6780d2f9-87cf-4b58-b720-0f1d402e39de 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing instance network info cache due to event network-changed-eda1ac92-e156-463f-9f90-8fdd14f55dc0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.556 186548 DEBUG oslo_concurrency.lockutils [req-fb3c4088-e747-4132-b2ef-5280fb1835aa req-6780d2f9-87cf-4b58-b720-0f1d402e39de 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.556 186548 DEBUG oslo_concurrency.lockutils [req-fb3c4088-e747-4132-b2ef-5280fb1835aa req-6780d2f9-87cf-4b58-b720-0f1d402e39de 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.556 186548 DEBUG nova.network.neutron [req-fb3c4088-e747-4132-b2ef-5280fb1835aa req-6780d2f9-87cf-4b58-b720-0f1d402e39de 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Refreshing network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.621 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.621 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.622 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.622 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.622 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.629 186548 INFO nova.compute.manager [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Terminating instance
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.635 186548 DEBUG nova.compute.manager [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:21:18 compute-0 kernel: tapeda1ac92-e1 (unregistering): left promiscuous mode
Nov 22 08:21:18 compute-0 NetworkManager[55036]: <info>  [1763799678.6656] device (tapeda1ac92-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:18 compute-0 ovn_controller[94843]: 2025-11-22T08:21:18Z|00705|binding|INFO|Releasing lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 from this chassis (sb_readonly=0)
Nov 22 08:21:18 compute-0 ovn_controller[94843]: 2025-11-22T08:21:18Z|00706|binding|INFO|Setting lport eda1ac92-e156-463f-9f90-8fdd14f55dc0 down in Southbound
Nov 22 08:21:18 compute-0 ovn_controller[94843]: 2025-11-22T08:21:18Z|00707|binding|INFO|Removing iface tapeda1ac92-e1 ovn-installed in OVS
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.696 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:18.720 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:0e:98 10.100.0.5'], port_security=['fa:16:3e:b3:0e:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '074d9b5a-057a-46af-aea1-0f43e0ac7418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'a792325d-0f21-49cc-9e79-20df696791a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46980f3d-5c4c-4eaa-bdbc-e9dfef13e740, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=eda1ac92-e156-463f-9f90-8fdd14f55dc0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:21:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:18.721 103805 INFO neutron.agent.ovn.metadata.agent [-] Port eda1ac92-e156-463f-9f90-8fdd14f55dc0 in datapath 52d2fbd4-6713-49c3-93b1-794bccb91cb5 unbound from our chassis
Nov 22 08:21:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:18.722 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 52d2fbd4-6713-49c3-93b1-794bccb91cb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:21:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:18.723 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7d73c2-559f-4b11-961b-6c1e68568530]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:18.723 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 namespace which is not needed anymore
Nov 22 08:21:18 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000098.scope: Deactivated successfully.
Nov 22 08:21:18 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000098.scope: Consumed 15.948s CPU time.
Nov 22 08:21:18 compute-0 systemd-machined[152872]: Machine qemu-81-instance-00000098 terminated.
Nov 22 08:21:18 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241940]: [NOTICE]   (241945) : haproxy version is 2.8.14-c23fe91
Nov 22 08:21:18 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241940]: [NOTICE]   (241945) : path to executable is /usr/sbin/haproxy
Nov 22 08:21:18 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241940]: [WARNING]  (241945) : Exiting Master process...
Nov 22 08:21:18 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241940]: [WARNING]  (241945) : Exiting Master process...
Nov 22 08:21:18 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241940]: [ALERT]    (241945) : Current worker (241947) exited with code 143 (Terminated)
Nov 22 08:21:18 compute-0 neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5[241940]: [WARNING]  (241945) : All workers exited. Exiting... (0)
Nov 22 08:21:18 compute-0 systemd[1]: libpod-c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9.scope: Deactivated successfully.
Nov 22 08:21:18 compute-0 podman[242146]: 2025-11-22 08:21:18.886643248 +0000 UTC m=+0.054202200 container died c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.912 186548 INFO nova.virt.libvirt.driver [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance destroyed successfully.
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.913 186548 DEBUG nova.objects.instance [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 074d9b5a-057a-46af-aea1-0f43e0ac7418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:21:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9-userdata-shm.mount: Deactivated successfully.
Nov 22 08:21:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-2a94ea10fa7b7e3ab5ee1514a82b2c981548f339376e90ac6271c8e591c78578-merged.mount: Deactivated successfully.
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.923 186548 DEBUG nova.virt.libvirt.vif [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-550951011',display_name='tempest-TestNetworkAdvancedServerOps-server-550951011',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-550951011',id=152,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChgjIQbTG83YupjtVjqz+L3+8SX3/AyjC8fqpXlZMUq0Yc6UvLnNy2SkzagnhhQjk5r+5IpiMQj6wR0xNs5cYnWEn7ZMM5fmHS1ZM+0SVA7KQ3TBAqX6QTRX0NRVvymVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1187204626',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:20:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-pxtss0dk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:20:57Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=074d9b5a-057a-46af-aea1-0f43e0ac7418,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.923 186548 DEBUG nova.network.os_vif_util [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.924 186548 DEBUG nova.network.os_vif_util [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.924 186548 DEBUG os_vif [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.926 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.926 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeda1ac92-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:18 compute-0 podman[242146]: 2025-11-22 08:21:18.927709222 +0000 UTC m=+0.095268154 container cleanup c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.930 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.932 186548 INFO os_vif [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:0e:98,bridge_name='br-int',has_traffic_filtering=True,id=eda1ac92-e156-463f-9f90-8fdd14f55dc0,network=Network(52d2fbd4-6713-49c3-93b1-794bccb91cb5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeda1ac92-e1')
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.933 186548 INFO nova.virt.libvirt.driver [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Deleting instance files /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_del
Nov 22 08:21:18 compute-0 systemd[1]: libpod-conmon-c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9.scope: Deactivated successfully.
Nov 22 08:21:18 compute-0 nova_compute[186544]: 2025-11-22 08:21:18.939 186548 INFO nova.virt.libvirt.driver [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Deletion of /var/lib/nova/instances/074d9b5a-057a-46af-aea1-0f43e0ac7418_del complete
Nov 22 08:21:18 compute-0 podman[242192]: 2025-11-22 08:21:18.99564043 +0000 UTC m=+0.045624118 container remove c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.000 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[454c8023-ef03-41c2-a045-349ee9f6d492]: (4, ('Sat Nov 22 08:21:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 (c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9)\nc8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9\nSat Nov 22 08:21:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 (c8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9)\nc8c6d71b10f0cf887523c714f7eff0ae0757003b6b1e67d2f7a364a94f1de2c9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.002 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[660280d2-9502-47f3-a1f1-ba468c6b8bda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.003 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52d2fbd4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:19 compute-0 kernel: tap52d2fbd4-60: left promiscuous mode
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.011 186548 INFO nova.compute.manager [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.012 186548 DEBUG oslo.service.loopingcall [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.013 186548 DEBUG nova.compute.manager [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.013 186548 DEBUG nova.network.neutron [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.018 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.023 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ea27328f-7a22-4efc-b200-ae5474d6bfbf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.034 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[24cdc21c-33ac-4822-a8ab-7e5bfd4f50ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.036 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[46697cc0-5fbb-4cea-8fd2-dae276b47b76]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.050 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1f890796-52da-49a1-ae5a-55bf6cf7cc7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639689, 'reachable_time': 18399, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242207, 'error': None, 'target': 'ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.055 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-52d2fbd4-6713-49c3-93b1-794bccb91cb5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:21:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:19.055 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[0750adea-294a-4094-bb4c-6545555224a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d52d2fbd4\x2d6713\x2d49c3\x2d93b1\x2d794bccb91cb5.mount: Deactivated successfully.
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.407 186548 DEBUG nova.compute.manager [req-4380069e-f5d6-40dd-8f25-a89f8e1359b5 req-5791a64c-29cf-4880-91e3-f470bb801317 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.408 186548 DEBUG oslo_concurrency.lockutils [req-4380069e-f5d6-40dd-8f25-a89f8e1359b5 req-5791a64c-29cf-4880-91e3-f470bb801317 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.408 186548 DEBUG oslo_concurrency.lockutils [req-4380069e-f5d6-40dd-8f25-a89f8e1359b5 req-5791a64c-29cf-4880-91e3-f470bb801317 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.408 186548 DEBUG oslo_concurrency.lockutils [req-4380069e-f5d6-40dd-8f25-a89f8e1359b5 req-5791a64c-29cf-4880-91e3-f470bb801317 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.409 186548 DEBUG nova.compute.manager [req-4380069e-f5d6-40dd-8f25-a89f8e1359b5 req-5791a64c-29cf-4880-91e3-f470bb801317 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:21:19 compute-0 nova_compute[186544]: 2025-11-22 08:21:19.409 186548 DEBUG nova.compute.manager [req-4380069e-f5d6-40dd-8f25-a89f8e1359b5 req-5791a64c-29cf-4880-91e3-f470bb801317 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-unplugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.096 186548 DEBUG nova.network.neutron [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.119 186548 INFO nova.compute.manager [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Took 1.11 seconds to deallocate network for instance.
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.206 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.207 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.275 186548 DEBUG nova.compute.provider_tree [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.288 186548 DEBUG nova.scheduler.client.report [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.305 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.317 186548 DEBUG nova.network.neutron [req-fb3c4088-e747-4132-b2ef-5280fb1835aa req-6780d2f9-87cf-4b58-b720-0f1d402e39de 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updated VIF entry in instance network info cache for port eda1ac92-e156-463f-9f90-8fdd14f55dc0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.318 186548 DEBUG nova.network.neutron [req-fb3c4088-e747-4132-b2ef-5280fb1835aa req-6780d2f9-87cf-4b58-b720-0f1d402e39de 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Updating instance_info_cache with network_info: [{"id": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "address": "fa:16:3e:b3:0e:98", "network": {"id": "52d2fbd4-6713-49c3-93b1-794bccb91cb5", "bridge": "br-int", "label": "tempest-network-smoke--1623348371", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeda1ac92-e1", "ovs_interfaceid": "eda1ac92-e156-463f-9f90-8fdd14f55dc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.332 186548 INFO nova.scheduler.client.report [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance 074d9b5a-057a-46af-aea1-0f43e0ac7418
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.335 186548 DEBUG oslo_concurrency.lockutils [req-fb3c4088-e747-4132-b2ef-5280fb1835aa req-6780d2f9-87cf-4b58-b720-0f1d402e39de 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-074d9b5a-057a-46af-aea1-0f43e0ac7418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:21:20 compute-0 nova_compute[186544]: 2025-11-22 08:21:20.394 186548 DEBUG oslo_concurrency.lockutils [None req-d6436a0d-598b-4ee1-9532-a56e9a95d8c7 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.166 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.525 186548 DEBUG nova.compute.manager [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.525 186548 DEBUG oslo_concurrency.lockutils [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.526 186548 DEBUG oslo_concurrency.lockutils [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.526 186548 DEBUG oslo_concurrency.lockutils [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "074d9b5a-057a-46af-aea1-0f43e0ac7418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.526 186548 DEBUG nova.compute.manager [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] No waiting events found dispatching network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.526 186548 WARNING nova.compute.manager [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received unexpected event network-vif-plugged-eda1ac92-e156-463f-9f90-8fdd14f55dc0 for instance with vm_state deleted and task_state None.
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.527 186548 DEBUG nova.compute.manager [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Received event network-vif-deleted-eda1ac92-e156-463f-9f90-8fdd14f55dc0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.527 186548 INFO nova.compute.manager [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Neutron deleted interface eda1ac92-e156-463f-9f90-8fdd14f55dc0; detaching it from the instance and deleting it from the info cache
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.527 186548 DEBUG nova.network.neutron [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 22 08:21:21 compute-0 nova_compute[186544]: 2025-11-22 08:21:21.530 186548 DEBUG nova.compute.manager [req-3389bf4b-5204-4ebb-9d73-43a98ecb018b req-f23fc8ea-cf7c-4ed2-b233-5ae3e1d0ba4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Detach interface failed, port_id=eda1ac92-e156-463f-9f90-8fdd14f55dc0, reason: Instance 074d9b5a-057a-46af-aea1-0f43e0ac7418 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 08:21:23 compute-0 nova_compute[186544]: 2025-11-22 08:21:23.141 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:23 compute-0 nova_compute[186544]: 2025-11-22 08:21:23.259 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:23 compute-0 nova_compute[186544]: 2025-11-22 08:21:23.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:26 compute-0 nova_compute[186544]: 2025-11-22 08:21:26.168 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:28 compute-0 nova_compute[186544]: 2025-11-22 08:21:28.932 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:29 compute-0 podman[242210]: 2025-11-22 08:21:29.41604495 +0000 UTC m=+0.064088124 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 22 08:21:29 compute-0 podman[242212]: 2025-11-22 08:21:29.416489732 +0000 UTC m=+0.054284452 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:21:29 compute-0 podman[242218]: 2025-11-22 08:21:29.442175196 +0000 UTC m=+0.076669155 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:21:29 compute-0 podman[242211]: 2025-11-22 08:21:29.452127941 +0000 UTC m=+0.091143122 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 08:21:31 compute-0 nova_compute[186544]: 2025-11-22 08:21:31.171 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:33 compute-0 nova_compute[186544]: 2025-11-22 08:21:33.911 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799678.9101233, 074d9b5a-057a-46af-aea1-0f43e0ac7418 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:21:33 compute-0 nova_compute[186544]: 2025-11-22 08:21:33.911 186548 INFO nova.compute.manager [-] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] VM Stopped (Lifecycle Event)
Nov 22 08:21:33 compute-0 nova_compute[186544]: 2025-11-22 08:21:33.935 186548 DEBUG nova.compute.manager [None req-4d8a413c-a85f-4428-a18a-dc8cd5b02090 - - - - - -] [instance: 074d9b5a-057a-46af-aea1-0f43e0ac7418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:21:33 compute-0 nova_compute[186544]: 2025-11-22 08:21:33.935 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:36 compute-0 nova_compute[186544]: 2025-11-22 08:21:36.172 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:37.347 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:37.348 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:37.348 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:38 compute-0 nova_compute[186544]: 2025-11-22 08:21:38.938 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:40 compute-0 podman[242293]: 2025-11-22 08:21:40.432510891 +0000 UTC m=+0.082326403 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 08:21:41 compute-0 nova_compute[186544]: 2025-11-22 08:21:41.174 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:43 compute-0 podman[242313]: 2025-11-22 08:21:43.429434532 +0000 UTC m=+0.066015692 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:21:43 compute-0 podman[242314]: 2025-11-22 08:21:43.431315998 +0000 UTC m=+0.068930303 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 08:21:43 compute-0 nova_compute[186544]: 2025-11-22 08:21:43.942 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:46 compute-0 nova_compute[186544]: 2025-11-22 08:21:46.176 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.516 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.517 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.534 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.652 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.653 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.659 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.659 186548 INFO nova.compute.claims [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.780 186548 DEBUG nova.compute.provider_tree [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.795 186548 DEBUG nova.scheduler.client.report [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.852 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.853 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.915 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.915 186548 DEBUG nova.network.neutron [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.935 186548 INFO nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:21:47 compute-0 nova_compute[186544]: 2025-11-22 08:21:47.955 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.047 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.048 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.049 186548 INFO nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Creating image(s)
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.050 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.050 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.051 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.063 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.119 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.120 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.120 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.132 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.150 186548 DEBUG nova.policy [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.187 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.188 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.244 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.245 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.245 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.301 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.303 186548 DEBUG nova.virt.disk.api [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.304 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.364 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.365 186548 DEBUG nova.virt.disk.api [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.366 186548 DEBUG nova.objects.instance [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 97c19ea5-e0e4-4dac-a78d-b974add3bcfe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.377 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.377 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Ensure instance console log exists: /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.377 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.378 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.378 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:48 compute-0 nova_compute[186544]: 2025-11-22 08:21:48.945 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:51 compute-0 nova_compute[186544]: 2025-11-22 08:21:51.179 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:51 compute-0 nova_compute[186544]: 2025-11-22 08:21:51.260 186548 DEBUG nova.network.neutron [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Successfully created port: d36527e0-7005-4c01-b586-38daed464240 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:21:53 compute-0 nova_compute[186544]: 2025-11-22 08:21:53.948 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:54 compute-0 nova_compute[186544]: 2025-11-22 08:21:54.127 186548 DEBUG nova.network.neutron [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Successfully updated port: d36527e0-7005-4c01-b586-38daed464240 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:21:54 compute-0 nova_compute[186544]: 2025-11-22 08:21:54.142 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:21:54 compute-0 nova_compute[186544]: 2025-11-22 08:21:54.143 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:21:54 compute-0 nova_compute[186544]: 2025-11-22 08:21:54.143 186548 DEBUG nova.network.neutron [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:21:54 compute-0 nova_compute[186544]: 2025-11-22 08:21:54.226 186548 DEBUG nova.compute.manager [req-3347eea3-41f3-4a96-82f9-249a296efe80 req-02173c41-74ac-47e9-9054-1e2723aae920 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received event network-changed-d36527e0-7005-4c01-b586-38daed464240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:21:54 compute-0 nova_compute[186544]: 2025-11-22 08:21:54.227 186548 DEBUG nova.compute.manager [req-3347eea3-41f3-4a96-82f9-249a296efe80 req-02173c41-74ac-47e9-9054-1e2723aae920 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Refreshing instance network info cache due to event network-changed-d36527e0-7005-4c01-b586-38daed464240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:21:54 compute-0 nova_compute[186544]: 2025-11-22 08:21:54.227 186548 DEBUG oslo_concurrency.lockutils [req-3347eea3-41f3-4a96-82f9-249a296efe80 req-02173c41-74ac-47e9-9054-1e2723aae920 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:21:54 compute-0 nova_compute[186544]: 2025-11-22 08:21:54.362 186548 DEBUG nova.network.neutron [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.181 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.310 186548 DEBUG nova.network.neutron [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updating instance_info_cache with network_info: [{"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.340 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.340 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Instance network_info: |[{"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.341 186548 DEBUG oslo_concurrency.lockutils [req-3347eea3-41f3-4a96-82f9-249a296efe80 req-02173c41-74ac-47e9-9054-1e2723aae920 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.341 186548 DEBUG nova.network.neutron [req-3347eea3-41f3-4a96-82f9-249a296efe80 req-02173c41-74ac-47e9-9054-1e2723aae920 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Refreshing network info cache for port d36527e0-7005-4c01-b586-38daed464240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.344 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Start _get_guest_xml network_info=[{"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.350 186548 WARNING nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.356 186548 DEBUG nova.virt.libvirt.host [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.357 186548 DEBUG nova.virt.libvirt.host [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.360 186548 DEBUG nova.virt.libvirt.host [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.361 186548 DEBUG nova.virt.libvirt.host [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.362 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.362 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.363 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.363 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.364 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.364 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.364 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.364 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.365 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.365 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.365 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.366 186548 DEBUG nova.virt.hardware [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.370 186548 DEBUG nova.virt.libvirt.vif [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:21:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1532076933',display_name='tempest-TestGettingAddress-server-1532076933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1532076933',id=156,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNw6MJSPGNLL+kyz8EDCo5RBiInn7CToNjad/C5G7xXekytky+w7dqC4DTuvRP5lSmyynbA95/tXuxj5C1AvwD2ls8Ttqlo5U7TSLLaNP5qNVVinR5ySHcvnqVNdDgyGbg==',key_name='tempest-TestGettingAddress-227311633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-6djb1gma',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:21:47Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=97c19ea5-e0e4-4dac-a78d-b974add3bcfe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.370 186548 DEBUG nova.network.os_vif_util [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.371 186548 DEBUG nova.network.os_vif_util [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=d36527e0-7005-4c01-b586-38daed464240,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd36527e0-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.372 186548 DEBUG nova.objects.instance [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 97c19ea5-e0e4-4dac-a78d-b974add3bcfe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.392 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <uuid>97c19ea5-e0e4-4dac-a78d-b974add3bcfe</uuid>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <name>instance-0000009c</name>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <nova:name>tempest-TestGettingAddress-server-1532076933</nova:name>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:21:56</nova:creationTime>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:21:56 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:21:56 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:21:56 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:21:56 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:21:56 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:21:56 compute-0 nova_compute[186544]:         <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 08:21:56 compute-0 nova_compute[186544]:         <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:21:56 compute-0 nova_compute[186544]:         <nova:port uuid="d36527e0-7005-4c01-b586-38daed464240">
Nov 22 08:21:56 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe4e:e200" ipVersion="6"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe4e:e200" ipVersion="6"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <system>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <entry name="serial">97c19ea5-e0e4-4dac-a78d-b974add3bcfe</entry>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <entry name="uuid">97c19ea5-e0e4-4dac-a78d-b974add3bcfe</entry>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </system>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <os>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   </os>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <features>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   </features>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk.config"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:4e:e2:00"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <target dev="tapd36527e0-70"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/console.log" append="off"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <video>
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </video>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:21:56 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:21:56 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:21:56 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:21:56 compute-0 nova_compute[186544]: </domain>
Nov 22 08:21:56 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.394 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Preparing to wait for external event network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.394 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.394 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.395 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.395 186548 DEBUG nova.virt.libvirt.vif [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:21:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1532076933',display_name='tempest-TestGettingAddress-server-1532076933',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1532076933',id=156,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNw6MJSPGNLL+kyz8EDCo5RBiInn7CToNjad/C5G7xXekytky+w7dqC4DTuvRP5lSmyynbA95/tXuxj5C1AvwD2ls8Ttqlo5U7TSLLaNP5qNVVinR5ySHcvnqVNdDgyGbg==',key_name='tempest-TestGettingAddress-227311633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-6djb1gma',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:21:47Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=97c19ea5-e0e4-4dac-a78d-b974add3bcfe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.396 186548 DEBUG nova.network.os_vif_util [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.397 186548 DEBUG nova.network.os_vif_util [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=d36527e0-7005-4c01-b586-38daed464240,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd36527e0-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.397 186548 DEBUG os_vif [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=d36527e0-7005-4c01-b586-38daed464240,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd36527e0-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.398 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.398 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.398 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.401 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.401 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd36527e0-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.402 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd36527e0-70, col_values=(('external_ids', {'iface-id': 'd36527e0-7005-4c01-b586-38daed464240', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:e2:00', 'vm-uuid': '97c19ea5-e0e4-4dac-a78d-b974add3bcfe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:56 compute-0 NetworkManager[55036]: <info>  [1763799716.4049] manager: (tapd36527e0-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.410 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.411 186548 INFO os_vif [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=d36527e0-7005-4c01-b586-38daed464240,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd36527e0-70')
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.478 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.479 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.479 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:4e:e2:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:21:56 compute-0 nova_compute[186544]: 2025-11-22 08:21:56.479 186548 INFO nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Using config drive
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.328 186548 INFO nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Creating config drive at /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk.config
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.333 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyuem_4b4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.459 186548 DEBUG oslo_concurrency.processutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyuem_4b4" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:21:57 compute-0 kernel: tapd36527e0-70: entered promiscuous mode
Nov 22 08:21:57 compute-0 NetworkManager[55036]: <info>  [1763799717.5239] manager: (tapd36527e0-70): new Tun device (/org/freedesktop/NetworkManager/Devices/325)
Nov 22 08:21:57 compute-0 ovn_controller[94843]: 2025-11-22T08:21:57Z|00708|binding|INFO|Claiming lport d36527e0-7005-4c01-b586-38daed464240 for this chassis.
Nov 22 08:21:57 compute-0 ovn_controller[94843]: 2025-11-22T08:21:57Z|00709|binding|INFO|d36527e0-7005-4c01-b586-38daed464240: Claiming fa:16:3e:4e:e2:00 10.100.0.4 2001:db8:0:1:f816:3eff:fe4e:e200 2001:db8::f816:3eff:fe4e:e200
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.527 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.533 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.538 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.541 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 NetworkManager[55036]: <info>  [1763799717.5422] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Nov 22 08:21:57 compute-0 NetworkManager[55036]: <info>  [1763799717.5429] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/327)
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.547 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:e2:00 10.100.0.4 2001:db8:0:1:f816:3eff:fe4e:e200 2001:db8::f816:3eff:fe4e:e200'], port_security=['fa:16:3e:4e:e2:00 10.100.0.4 2001:db8:0:1:f816:3eff:fe4e:e200 2001:db8::f816:3eff:fe4e:e200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8:0:1:f816:3eff:fe4e:e200/64 2001:db8::f816:3eff:fe4e:e200/64', 'neutron:device_id': '97c19ea5-e0e4-4dac-a78d-b974add3bcfe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77dc402d-bf06-4a39-8313-1435ce0160f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d36527e0-7005-4c01-b586-38daed464240) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.548 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d36527e0-7005-4c01-b586-38daed464240 in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 bound to our chassis
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.550 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b35c418-bf90-4666-a674-9b7153e90ab7
Nov 22 08:21:57 compute-0 systemd-udevd[242391]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.561 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b30a2ea8-4add-471a-9897-c5b3c92315ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.562 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b35c418-b1 in ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:21:57 compute-0 NetworkManager[55036]: <info>  [1763799717.5639] device (tapd36527e0-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.564 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b35c418-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:21:57 compute-0 NetworkManager[55036]: <info>  [1763799717.5652] device (tapd36527e0-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.564 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d6066d01-e1ab-4adb-935c-37a3d3c6b501]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.565 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3e36ad28-622f-49af-a741-fbcd66e8a4a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 systemd-machined[152872]: New machine qemu-82-instance-0000009c.
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.574 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[8b359b8a-7a69-469b-9e4d-b27160ee55e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.602 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[57e9fff2-8887-4767-991c-df90248931fd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.632 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9648f7-1398-4e1e-858f-556ea6f68089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 systemd[1]: Started Virtual Machine qemu-82-instance-0000009c.
Nov 22 08:21:57 compute-0 NetworkManager[55036]: <info>  [1763799717.6479] manager: (tap6b35c418-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/328)
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.650 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.647 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8eaaedfe-aa69-42ef-8ff5-0f0c7a87d124]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.666 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 ovn_controller[94843]: 2025-11-22T08:21:57Z|00710|binding|INFO|Setting lport d36527e0-7005-4c01-b586-38daed464240 ovn-installed in OVS
Nov 22 08:21:57 compute-0 ovn_controller[94843]: 2025-11-22T08:21:57Z|00711|binding|INFO|Setting lport d36527e0-7005-4c01-b586-38daed464240 up in Southbound
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.681 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b7bd83-80a1-41d8-bb96-5ef4e2c44c6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.684 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6da48a2f-55cd-4bfa-8647-c1d1e80c2b68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 NetworkManager[55036]: <info>  [1763799717.7055] device (tap6b35c418-b0): carrier: link connected
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.710 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6268f9f3-c1b8-40c1-96c1-18d9bd856b8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.725 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fddd0c3d-4fb5-460a-9a53-a2c06f00e267]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b35c418-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:96:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645834, 'reachable_time': 30502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242424, 'error': None, 'target': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.740 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbb5f3e-1149-4f93-a6e4-cbb4657b3456]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:96a1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 645834, 'tstamp': 645834}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242426, 'error': None, 'target': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.756 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[722a0fc9-2cc3-4711-8e14-41faee928636]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b35c418-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:96:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645834, 'reachable_time': 30502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242427, 'error': None, 'target': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.783 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[be566c71-7906-4036-9fb9-4d9a4313a263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.837 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6fc92c54-bcb7-415e-99e0-dc0a1d7d7833]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.839 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b35c418-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.839 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.839 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b35c418-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.841 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 kernel: tap6b35c418-b0: entered promiscuous mode
Nov 22 08:21:57 compute-0 NetworkManager[55036]: <info>  [1763799717.8428] manager: (tap6b35c418-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.844 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.845 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b35c418-b0, col_values=(('external_ids', {'iface-id': '397a3db2-78b9-4182-b3e5-f29d5ae58cda'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.846 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 ovn_controller[94843]: 2025-11-22T08:21:57Z|00712|binding|INFO|Releasing lport 397a3db2-78b9-4182-b3e5-f29d5ae58cda from this chassis (sb_readonly=0)
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.848 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.849 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b35c418-bf90-4666-a674-9b7153e90ab7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b35c418-bf90-4666-a674-9b7153e90ab7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.849 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[17d59cba-8067-4578-8420-2d4a5f0ae10a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.850 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-6b35c418-bf90-4666-a674-9b7153e90ab7
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/6b35c418-bf90-4666-a674-9b7153e90ab7.pid.haproxy
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 6b35c418-bf90-4666-a674-9b7153e90ab7
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:21:57 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:57.851 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'env', 'PROCESS_TAG=haproxy-6b35c418-bf90-4666-a674-9b7153e90ab7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b35c418-bf90-4666-a674-9b7153e90ab7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:21:57 compute-0 nova_compute[186544]: 2025-11-22 08:21:57.858 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.165 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799718.1647596, 97c19ea5-e0e4-4dac-a78d-b974add3bcfe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.165 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] VM Started (Lifecycle Event)
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.185 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.189 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799718.164897, 97c19ea5-e0e4-4dac-a78d-b974add3bcfe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.190 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] VM Paused (Lifecycle Event)
Nov 22 08:21:58 compute-0 podman[242466]: 2025-11-22 08:21:58.199427341 +0000 UTC m=+0.060612838 container create 52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.206 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.209 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.226 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:21:58 compute-0 systemd[1]: Started libpod-conmon-52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2.scope.
Nov 22 08:21:58 compute-0 podman[242466]: 2025-11-22 08:21:58.160869069 +0000 UTC m=+0.022054606 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:21:58 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:21:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db826de3fcc176c3eecfcddbe33cd8c0c8ed1f27f3e4a1b69126ea26e4ff693f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:21:58 compute-0 podman[242466]: 2025-11-22 08:21:58.289498496 +0000 UTC m=+0.150684033 container init 52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 08:21:58 compute-0 podman[242466]: 2025-11-22 08:21:58.295223647 +0000 UTC m=+0.156409154 container start 52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.307 186548 DEBUG nova.compute.manager [req-5e471cb4-0691-4430-8525-282f679f0f95 req-fcfc9c65-2add-4770-a61b-69fcacb6f089 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received event network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.308 186548 DEBUG oslo_concurrency.lockutils [req-5e471cb4-0691-4430-8525-282f679f0f95 req-fcfc9c65-2add-4770-a61b-69fcacb6f089 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.308 186548 DEBUG oslo_concurrency.lockutils [req-5e471cb4-0691-4430-8525-282f679f0f95 req-fcfc9c65-2add-4770-a61b-69fcacb6f089 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.308 186548 DEBUG oslo_concurrency.lockutils [req-5e471cb4-0691-4430-8525-282f679f0f95 req-fcfc9c65-2add-4770-a61b-69fcacb6f089 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.309 186548 DEBUG nova.compute.manager [req-5e471cb4-0691-4430-8525-282f679f0f95 req-fcfc9c65-2add-4770-a61b-69fcacb6f089 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Processing event network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.309 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.319 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799718.3191845, 97c19ea5-e0e4-4dac-a78d-b974add3bcfe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.320 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] VM Resumed (Lifecycle Event)
Nov 22 08:21:58 compute-0 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[242482]: [NOTICE]   (242486) : New worker (242488) forked
Nov 22 08:21:58 compute-0 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[242482]: [NOTICE]   (242486) : Loading success.
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.322 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.326 186548 INFO nova.virt.libvirt.driver [-] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Instance spawned successfully.
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.326 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.350 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.359 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.360 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.361 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.361 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.362 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.362 186548 DEBUG nova.virt.libvirt.driver [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.365 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.442 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:21:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:58.535 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.535 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:21:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:58.537 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:21:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:21:58.537 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.727 186548 INFO nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Took 10.68 seconds to spawn the instance on the hypervisor.
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.729 186548 DEBUG nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.813 186548 INFO nova.compute.manager [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Took 11.22 seconds to build instance.
Nov 22 08:21:58 compute-0 nova_compute[186544]: 2025-11-22 08:21:58.837 186548 DEBUG oslo_concurrency.lockutils [None req-bde7179b-675d-473e-bb72-60896fd0d084 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:21:59 compute-0 nova_compute[186544]: 2025-11-22 08:21:59.616 186548 DEBUG nova.network.neutron [req-3347eea3-41f3-4a96-82f9-249a296efe80 req-02173c41-74ac-47e9-9054-1e2723aae920 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updated VIF entry in instance network info cache for port d36527e0-7005-4c01-b586-38daed464240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:21:59 compute-0 nova_compute[186544]: 2025-11-22 08:21:59.616 186548 DEBUG nova.network.neutron [req-3347eea3-41f3-4a96-82f9-249a296efe80 req-02173c41-74ac-47e9-9054-1e2723aae920 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updating instance_info_cache with network_info: [{"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:21:59 compute-0 nova_compute[186544]: 2025-11-22 08:21:59.634 186548 DEBUG oslo_concurrency.lockutils [req-3347eea3-41f3-4a96-82f9-249a296efe80 req-02173c41-74ac-47e9-9054-1e2723aae920 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:22:00 compute-0 podman[242497]: 2025-11-22 08:22:00.41568413 +0000 UTC m=+0.062839443 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 08:22:00 compute-0 podman[242498]: 2025-11-22 08:22:00.416216783 +0000 UTC m=+0.061310416 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:22:00 compute-0 podman[242499]: 2025-11-22 08:22:00.418543871 +0000 UTC m=+0.059720057 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:22:00 compute-0 podman[242500]: 2025-11-22 08:22:00.452026788 +0000 UTC m=+0.087782440 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:22:00 compute-0 nova_compute[186544]: 2025-11-22 08:22:00.461 186548 DEBUG nova.compute.manager [req-081b95af-282d-4ec0-95fb-1b54fecc7e3c req-8cdff7f0-8780-449a-baa2-e5d9933a4f8e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received event network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:00 compute-0 nova_compute[186544]: 2025-11-22 08:22:00.463 186548 DEBUG oslo_concurrency.lockutils [req-081b95af-282d-4ec0-95fb-1b54fecc7e3c req-8cdff7f0-8780-449a-baa2-e5d9933a4f8e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:00 compute-0 nova_compute[186544]: 2025-11-22 08:22:00.463 186548 DEBUG oslo_concurrency.lockutils [req-081b95af-282d-4ec0-95fb-1b54fecc7e3c req-8cdff7f0-8780-449a-baa2-e5d9933a4f8e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:00 compute-0 nova_compute[186544]: 2025-11-22 08:22:00.464 186548 DEBUG oslo_concurrency.lockutils [req-081b95af-282d-4ec0-95fb-1b54fecc7e3c req-8cdff7f0-8780-449a-baa2-e5d9933a4f8e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:00 compute-0 nova_compute[186544]: 2025-11-22 08:22:00.465 186548 DEBUG nova.compute.manager [req-081b95af-282d-4ec0-95fb-1b54fecc7e3c req-8cdff7f0-8780-449a-baa2-e5d9933a4f8e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] No waiting events found dispatching network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:22:00 compute-0 nova_compute[186544]: 2025-11-22 08:22:00.466 186548 WARNING nova.compute.manager [req-081b95af-282d-4ec0-95fb-1b54fecc7e3c req-8cdff7f0-8780-449a-baa2-e5d9933a4f8e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received unexpected event network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 for instance with vm_state active and task_state None.
Nov 22 08:22:01 compute-0 nova_compute[186544]: 2025-11-22 08:22:01.183 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:01 compute-0 nova_compute[186544]: 2025-11-22 08:22:01.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:03 compute-0 nova_compute[186544]: 2025-11-22 08:22:03.197 186548 DEBUG nova.compute.manager [req-d5fe4518-6f4e-4522-b7d7-0827367a68f6 req-aa8a3cfa-6aeb-48af-ad1c-9c0d1e2e8404 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received event network-changed-d36527e0-7005-4c01-b586-38daed464240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:03 compute-0 nova_compute[186544]: 2025-11-22 08:22:03.198 186548 DEBUG nova.compute.manager [req-d5fe4518-6f4e-4522-b7d7-0827367a68f6 req-aa8a3cfa-6aeb-48af-ad1c-9c0d1e2e8404 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Refreshing instance network info cache due to event network-changed-d36527e0-7005-4c01-b586-38daed464240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:22:03 compute-0 nova_compute[186544]: 2025-11-22 08:22:03.198 186548 DEBUG oslo_concurrency.lockutils [req-d5fe4518-6f4e-4522-b7d7-0827367a68f6 req-aa8a3cfa-6aeb-48af-ad1c-9c0d1e2e8404 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:22:03 compute-0 nova_compute[186544]: 2025-11-22 08:22:03.198 186548 DEBUG oslo_concurrency.lockutils [req-d5fe4518-6f4e-4522-b7d7-0827367a68f6 req-aa8a3cfa-6aeb-48af-ad1c-9c0d1e2e8404 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:22:03 compute-0 nova_compute[186544]: 2025-11-22 08:22:03.199 186548 DEBUG nova.network.neutron [req-d5fe4518-6f4e-4522-b7d7-0827367a68f6 req-aa8a3cfa-6aeb-48af-ad1c-9c0d1e2e8404 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Refreshing network info cache for port d36527e0-7005-4c01-b586-38daed464240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.184 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.185 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.251 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.308 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.309 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.362 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.516 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.517 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5575MB free_disk=73.13592910766602GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.517 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.518 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.610 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 97c19ea5-e0e4-4dac-a78d-b974add3bcfe actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.610 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.611 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.660 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.694 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.729 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:22:04 compute-0 nova_compute[186544]: 2025-11-22 08:22:04.729 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:05 compute-0 nova_compute[186544]: 2025-11-22 08:22:05.593 186548 DEBUG nova.network.neutron [req-d5fe4518-6f4e-4522-b7d7-0827367a68f6 req-aa8a3cfa-6aeb-48af-ad1c-9c0d1e2e8404 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updated VIF entry in instance network info cache for port d36527e0-7005-4c01-b586-38daed464240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:22:05 compute-0 nova_compute[186544]: 2025-11-22 08:22:05.594 186548 DEBUG nova.network.neutron [req-d5fe4518-6f4e-4522-b7d7-0827367a68f6 req-aa8a3cfa-6aeb-48af-ad1c-9c0d1e2e8404 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updating instance_info_cache with network_info: [{"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:05 compute-0 nova_compute[186544]: 2025-11-22 08:22:05.649 186548 DEBUG oslo_concurrency.lockutils [req-d5fe4518-6f4e-4522-b7d7-0827367a68f6 req-aa8a3cfa-6aeb-48af-ad1c-9c0d1e2e8404 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:22:06 compute-0 nova_compute[186544]: 2025-11-22 08:22:06.184 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:06 compute-0 nova_compute[186544]: 2025-11-22 08:22:06.404 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:07 compute-0 nova_compute[186544]: 2025-11-22 08:22:07.729 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:07 compute-0 nova_compute[186544]: 2025-11-22 08:22:07.730 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:22:07 compute-0 nova_compute[186544]: 2025-11-22 08:22:07.730 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:22:08 compute-0 nova_compute[186544]: 2025-11-22 08:22:08.049 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:22:08 compute-0 nova_compute[186544]: 2025-11-22 08:22:08.050 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:22:08 compute-0 nova_compute[186544]: 2025-11-22 08:22:08.050 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:22:08 compute-0 nova_compute[186544]: 2025-11-22 08:22:08.050 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 97c19ea5-e0e4-4dac-a78d-b974add3bcfe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:22:09 compute-0 nova_compute[186544]: 2025-11-22 08:22:09.586 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updating instance_info_cache with network_info: [{"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:09 compute-0 nova_compute[186544]: 2025-11-22 08:22:09.603 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:22:09 compute-0 nova_compute[186544]: 2025-11-22 08:22:09.604 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:22:09 compute-0 nova_compute[186544]: 2025-11-22 08:22:09.606 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:09 compute-0 nova_compute[186544]: 2025-11-22 08:22:09.607 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:09 compute-0 nova_compute[186544]: 2025-11-22 08:22:09.607 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:22:10 compute-0 nova_compute[186544]: 2025-11-22 08:22:10.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:10 compute-0 nova_compute[186544]: 2025-11-22 08:22:10.166 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:11 compute-0 nova_compute[186544]: 2025-11-22 08:22:11.186 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:11 compute-0 nova_compute[186544]: 2025-11-22 08:22:11.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:11 compute-0 podman[242608]: 2025-11-22 08:22:11.411597416 +0000 UTC m=+0.062113795 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 08:22:13 compute-0 ovn_controller[94843]: 2025-11-22T08:22:13Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:e2:00 10.100.0.4
Nov 22 08:22:13 compute-0 ovn_controller[94843]: 2025-11-22T08:22:13Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:e2:00 10.100.0.4
Nov 22 08:22:14 compute-0 podman[242628]: 2025-11-22 08:22:14.401240556 +0000 UTC m=+0.047621017 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:22:14 compute-0 podman[242629]: 2025-11-22 08:22:14.41514696 +0000 UTC m=+0.054168068 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 08:22:16 compute-0 nova_compute[186544]: 2025-11-22 08:22:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:16 compute-0 nova_compute[186544]: 2025-11-22 08:22:16.189 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:16 compute-0 nova_compute[186544]: 2025-11-22 08:22:16.407 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:17 compute-0 nova_compute[186544]: 2025-11-22 08:22:17.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:17 compute-0 nova_compute[186544]: 2025-11-22 08:22:17.533 186548 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Creating tmpfile /var/lib/nova/instances/tmpdu6nfow8 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 22 08:22:17 compute-0 nova_compute[186544]: 2025-11-22 08:22:17.667 186548 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdu6nfow8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 22 08:22:19 compute-0 nova_compute[186544]: 2025-11-22 08:22:19.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:20 compute-0 nova_compute[186544]: 2025-11-22 08:22:20.567 186548 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdu6nfow8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1ef73533-7ed2-422b-a432-c1f12dbc7323',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 22 08:22:20 compute-0 nova_compute[186544]: 2025-11-22 08:22:20.592 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:22:20 compute-0 nova_compute[186544]: 2025-11-22 08:22:20.592 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquired lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:22:20 compute-0 nova_compute[186544]: 2025-11-22 08:22:20.593 186548 DEBUG nova.network.neutron [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.191 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.409 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.544 186548 DEBUG nova.compute.manager [req-98622f63-1f9d-4628-8545-1b3cf6785259 req-973c40b3-81bc-4452-8d4a-4c58680ba066 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received event network-changed-d36527e0-7005-4c01-b586-38daed464240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.544 186548 DEBUG nova.compute.manager [req-98622f63-1f9d-4628-8545-1b3cf6785259 req-973c40b3-81bc-4452-8d4a-4c58680ba066 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Refreshing instance network info cache due to event network-changed-d36527e0-7005-4c01-b586-38daed464240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.544 186548 DEBUG oslo_concurrency.lockutils [req-98622f63-1f9d-4628-8545-1b3cf6785259 req-973c40b3-81bc-4452-8d4a-4c58680ba066 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.545 186548 DEBUG oslo_concurrency.lockutils [req-98622f63-1f9d-4628-8545-1b3cf6785259 req-973c40b3-81bc-4452-8d4a-4c58680ba066 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.545 186548 DEBUG nova.network.neutron [req-98622f63-1f9d-4628-8545-1b3cf6785259 req-973c40b3-81bc-4452-8d4a-4c58680ba066 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Refreshing network info cache for port d36527e0-7005-4c01-b586-38daed464240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.757 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.758 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.758 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.758 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.759 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.766 186548 INFO nova.compute.manager [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Terminating instance
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.776 186548 DEBUG nova.compute.manager [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:22:21 compute-0 kernel: tapd36527e0-70 (unregistering): left promiscuous mode
Nov 22 08:22:21 compute-0 NetworkManager[55036]: <info>  [1763799741.8043] device (tapd36527e0-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.814 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:21 compute-0 ovn_controller[94843]: 2025-11-22T08:22:21Z|00713|binding|INFO|Releasing lport d36527e0-7005-4c01-b586-38daed464240 from this chassis (sb_readonly=0)
Nov 22 08:22:21 compute-0 ovn_controller[94843]: 2025-11-22T08:22:21Z|00714|binding|INFO|Setting lport d36527e0-7005-4c01-b586-38daed464240 down in Southbound
Nov 22 08:22:21 compute-0 ovn_controller[94843]: 2025-11-22T08:22:21Z|00715|binding|INFO|Removing iface tapd36527e0-70 ovn-installed in OVS
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.816 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:21 compute-0 nova_compute[186544]: 2025-11-22 08:22:21.834 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:21.832 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:e2:00 10.100.0.4 2001:db8:0:1:f816:3eff:fe4e:e200 2001:db8::f816:3eff:fe4e:e200'], port_security=['fa:16:3e:4e:e2:00 10.100.0.4 2001:db8:0:1:f816:3eff:fe4e:e200 2001:db8::f816:3eff:fe4e:e200'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8:0:1:f816:3eff:fe4e:e200/64 2001:db8::f816:3eff:fe4e:e200/64', 'neutron:device_id': '97c19ea5-e0e4-4dac-a78d-b974add3bcfe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b35c418-bf90-4666-a674-9b7153e90ab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': '77dc402d-bf06-4a39-8313-1435ce0160f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68a17d4f-a25c-4374-9e25-7ee5a2fc8b25, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d36527e0-7005-4c01-b586-38daed464240) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:22:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:21.834 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d36527e0-7005-4c01-b586-38daed464240 in datapath 6b35c418-bf90-4666-a674-9b7153e90ab7 unbound from our chassis
Nov 22 08:22:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:21.836 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b35c418-bf90-4666-a674-9b7153e90ab7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:22:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:21.837 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2f871499-050e-4714-97d3-c0ef30187dcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:21.837 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 namespace which is not needed anymore
Nov 22 08:22:21 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Nov 22 08:22:21 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d0000009c.scope: Consumed 14.349s CPU time.
Nov 22 08:22:21 compute-0 systemd-machined[152872]: Machine qemu-82-instance-0000009c terminated.
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.000 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.004 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.047 186548 INFO nova.virt.libvirt.driver [-] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Instance destroyed successfully.
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.048 186548 DEBUG nova.objects.instance [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 97c19ea5-e0e4-4dac-a78d-b974add3bcfe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:22:22 compute-0 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[242482]: [NOTICE]   (242486) : haproxy version is 2.8.14-c23fe91
Nov 22 08:22:22 compute-0 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[242482]: [NOTICE]   (242486) : path to executable is /usr/sbin/haproxy
Nov 22 08:22:22 compute-0 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[242482]: [WARNING]  (242486) : Exiting Master process...
Nov 22 08:22:22 compute-0 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[242482]: [ALERT]    (242486) : Current worker (242488) exited with code 143 (Terminated)
Nov 22 08:22:22 compute-0 neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7[242482]: [WARNING]  (242486) : All workers exited. Exiting... (0)
Nov 22 08:22:22 compute-0 systemd[1]: libpod-52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2.scope: Deactivated successfully.
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.060 186548 DEBUG nova.virt.libvirt.vif [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:21:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1532076933',display_name='tempest-TestGettingAddress-server-1532076933',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1532076933',id=156,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNw6MJSPGNLL+kyz8EDCo5RBiInn7CToNjad/C5G7xXekytky+w7dqC4DTuvRP5lSmyynbA95/tXuxj5C1AvwD2ls8Ttqlo5U7TSLLaNP5qNVVinR5ySHcvnqVNdDgyGbg==',key_name='tempest-TestGettingAddress-227311633',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:21:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-6djb1gma',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:21:58Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=97c19ea5-e0e4-4dac-a78d-b974add3bcfe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.061 186548 DEBUG nova.network.os_vif_util [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:22:22 compute-0 podman[242698]: 2025-11-22 08:22:22.062060029 +0000 UTC m=+0.146246213 container died 52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.062 186548 DEBUG nova.network.os_vif_util [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=d36527e0-7005-4c01-b586-38daed464240,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd36527e0-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.062 186548 DEBUG os_vif [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=d36527e0-7005-4c01-b586-38daed464240,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd36527e0-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.064 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.064 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd36527e0-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.066 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.068 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.070 186548 INFO os_vif [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:e2:00,bridge_name='br-int',has_traffic_filtering=True,id=d36527e0-7005-4c01-b586-38daed464240,network=Network(6b35c418-bf90-4666-a674-9b7153e90ab7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd36527e0-70')
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.071 186548 INFO nova.virt.libvirt.driver [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Deleting instance files /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe_del
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.072 186548 INFO nova.virt.libvirt.driver [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Deletion of /var/lib/nova/instances/97c19ea5-e0e4-4dac-a78d-b974add3bcfe_del complete
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.146 186548 DEBUG nova.network.neutron [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.164 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Releasing lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.172 186548 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdu6nfow8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1ef73533-7ed2-422b-a432-c1f12dbc7323',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.172 186548 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Creating instance directory: /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.173 186548 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Creating disk.info with the contents: {'/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk': 'qcow2', '/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.173 186548 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.173 186548 DEBUG nova.objects.instance [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1ef73533-7ed2-422b-a432-c1f12dbc7323 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.205 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.223 186548 INFO nova.compute.manager [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Took 0.45 seconds to destroy the instance on the hypervisor.
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.225 186548 DEBUG oslo.service.loopingcall [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.225 186548 DEBUG nova.compute.manager [-] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.226 186548 DEBUG nova.network.neutron [-] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.258 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.259 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.259 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.272 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.360 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.362 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-db826de3fcc176c3eecfcddbe33cd8c0c8ed1f27f3e4a1b69126ea26e4ff693f-merged.mount: Deactivated successfully.
Nov 22 08:22:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2-userdata-shm.mount: Deactivated successfully.
Nov 22 08:22:22 compute-0 podman[242698]: 2025-11-22 08:22:22.634558689 +0000 UTC m=+0.718744863 container cleanup 52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 08:22:22 compute-0 systemd[1]: libpod-conmon-52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2.scope: Deactivated successfully.
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.652 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk 1073741824" returned: 0 in 0.290s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.654 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.655 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.718 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.719 186548 DEBUG nova.virt.disk.api [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Checking if we can resize image /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.719 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.777 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.778 186548 DEBUG nova.virt.disk.api [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Cannot resize image /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.779 186548 DEBUG nova.objects.instance [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lazy-loading 'migration_context' on Instance uuid 1ef73533-7ed2-422b-a432-c1f12dbc7323 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.791 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.817 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.819 186548 DEBUG nova.virt.libvirt.volume.remotefs [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config to /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 22 08:22:22 compute-0 nova_compute[186544]: 2025-11-22 08:22:22.819 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:22:23 compute-0 podman[242753]: 2025-11-22 08:22:23.077412777 +0000 UTC m=+0.418048227 container remove 52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.084 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c43ff732-6ad0-4c28-a63f-3dce2e693d4c]: (4, ('Sat Nov 22 08:22:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 (52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2)\n52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2\nSat Nov 22 08:22:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 (52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2)\n52288acc9a813c4bb9b60f177132486b6aaf912b40a2e722b3233d48dd5e7cc2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.086 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[46fcee73-8bd3-468d-a6ec-b9c15dc3bcad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.087 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b35c418-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.089 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:23 compute-0 kernel: tap6b35c418-b0: left promiscuous mode
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.102 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.103 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.105 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[92b359f2-dba3-456c-a3bb-6e5bdefc3aca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.125 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c1254e8c-9c75-4df5-809f-1638c67ab9a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.127 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[258c640f-fa6a-43e1-9cc0-2492eadb6dbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.140 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[08362973-ad2e-4107-8780-ebe5541a4b61]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645826, 'reachable_time': 23006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242780, 'error': None, 'target': 'ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:23 compute-0 systemd[1]: run-netns-ovnmeta\x2d6b35c418\x2dbf90\x2d4666\x2da674\x2d9b7153e90ab7.mount: Deactivated successfully.
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.144 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b35c418-bf90-4666-a674-9b7153e90ab7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:22:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:23.145 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[92082961-ace3-42db-98fb-b9ddc72e507e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.295 186548 DEBUG oslo_concurrency.processutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.config /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.295 186548 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.296 186548 DEBUG nova.virt.libvirt.vif [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-896988644',display_name='tempest-TestNetworkAdvancedServerOps-server-896988644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-896988644',id=155,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGEdgv7ZU5mr1DCHTKMPGMEa2ECF9EUHEhtvPrAip3HJ7nfj7TmONl8h5osSWq7Dqr3V6Hj92ZlV3FEvFnmrY27FQG+lpmv/tm8jg4LcVo4ZQR1NoEXbdJ/E0azh0THluw==',key_name='tempest-TestNetworkAdvancedServerOps-205176852',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:21:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-9m0of0y0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:21:50Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1ef73533-7ed2-422b-a432-c1f12dbc7323,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.297 186548 DEBUG nova.network.os_vif_util [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converting VIF {"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.298 186548 DEBUG nova.network.os_vif_util [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.298 186548 DEBUG os_vif [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.299 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.299 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.299 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.301 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.301 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9887acef-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.302 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9887acef-e3, col_values=(('external_ids', {'iface-id': '9887acef-e389-49e2-87d8-70796da43759', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:b4:21', 'vm-uuid': '1ef73533-7ed2-422b-a432-c1f12dbc7323'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.303 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:23 compute-0 NetworkManager[55036]: <info>  [1763799743.3045] manager: (tap9887acef-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.306 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.308 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.310 186548 INFO os_vif [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3')
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.311 186548 DEBUG nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.311 186548 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdu6nfow8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1ef73533-7ed2-422b-a432-c1f12dbc7323',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.356 186548 DEBUG nova.network.neutron [-] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.386 186548 INFO nova.compute.manager [-] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Took 1.16 seconds to deallocate network for instance.
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.451 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.452 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.454 186548 DEBUG nova.compute.manager [req-ffabe089-9daa-459c-9568-5743555eaeef req-c36f8347-d9d1-4676-a3e6-7d7bb09ff572 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received event network-vif-deleted-d36527e0-7005-4c01-b586-38daed464240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.517 186548 DEBUG nova.compute.provider_tree [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.529 186548 DEBUG nova.scheduler.client.report [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.546 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.573 186548 INFO nova.scheduler.client.report [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 97c19ea5-e0e4-4dac-a78d-b974add3bcfe
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.606 186548 DEBUG nova.compute.manager [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received event network-vif-unplugged-d36527e0-7005-4c01-b586-38daed464240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.607 186548 DEBUG oslo_concurrency.lockutils [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.607 186548 DEBUG oslo_concurrency.lockutils [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.607 186548 DEBUG oslo_concurrency.lockutils [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.608 186548 DEBUG nova.compute.manager [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] No waiting events found dispatching network-vif-unplugged-d36527e0-7005-4c01-b586-38daed464240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.608 186548 WARNING nova.compute.manager [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received unexpected event network-vif-unplugged-d36527e0-7005-4c01-b586-38daed464240 for instance with vm_state deleted and task_state None.
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.609 186548 DEBUG nova.compute.manager [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received event network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.609 186548 DEBUG oslo_concurrency.lockutils [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.609 186548 DEBUG oslo_concurrency.lockutils [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.610 186548 DEBUG oslo_concurrency.lockutils [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.610 186548 DEBUG nova.compute.manager [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] No waiting events found dispatching network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.610 186548 WARNING nova.compute.manager [req-e91affda-8212-4d23-8294-259910edfd31 req-f5231543-08db-497e-ac29-4cbe7fe3cbff 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Received unexpected event network-vif-plugged-d36527e0-7005-4c01-b586-38daed464240 for instance with vm_state deleted and task_state None.
Nov 22 08:22:23 compute-0 nova_compute[186544]: 2025-11-22 08:22:23.628 186548 DEBUG oslo_concurrency.lockutils [None req-e663373e-66b9-4ecc-85d9-1db98f70ef0f 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "97c19ea5-e0e4-4dac-a78d-b974add3bcfe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:24 compute-0 nova_compute[186544]: 2025-11-22 08:22:24.619 186548 DEBUG nova.network.neutron [req-98622f63-1f9d-4628-8545-1b3cf6785259 req-973c40b3-81bc-4452-8d4a-4c58680ba066 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updated VIF entry in instance network info cache for port d36527e0-7005-4c01-b586-38daed464240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:22:24 compute-0 nova_compute[186544]: 2025-11-22 08:22:24.620 186548 DEBUG nova.network.neutron [req-98622f63-1f9d-4628-8545-1b3cf6785259 req-973c40b3-81bc-4452-8d4a-4c58680ba066 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Updating instance_info_cache with network_info: [{"id": "d36527e0-7005-4c01-b586-38daed464240", "address": "fa:16:3e:4e:e2:00", "network": {"id": "6b35c418-bf90-4666-a674-9b7153e90ab7", "bridge": "br-int", "label": "tempest-network-smoke--584123979", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe4e:e200", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd36527e0-70", "ovs_interfaceid": "d36527e0-7005-4c01-b586-38daed464240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:24 compute-0 nova_compute[186544]: 2025-11-22 08:22:24.639 186548 DEBUG oslo_concurrency.lockutils [req-98622f63-1f9d-4628-8545-1b3cf6785259 req-973c40b3-81bc-4452-8d4a-4c58680ba066 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-97c19ea5-e0e4-4dac-a78d-b974add3bcfe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:22:25 compute-0 nova_compute[186544]: 2025-11-22 08:22:25.329 186548 DEBUG nova.network.neutron [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Port 9887acef-e389-49e2-87d8-70796da43759 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 22 08:22:25 compute-0 nova_compute[186544]: 2025-11-22 08:22:25.339 186548 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdu6nfow8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1ef73533-7ed2-422b-a432-c1f12dbc7323',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 22 08:22:25 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 22 08:22:25 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 22 08:22:25 compute-0 kernel: tap9887acef-e3: entered promiscuous mode
Nov 22 08:22:25 compute-0 NetworkManager[55036]: <info>  [1763799745.6797] manager: (tap9887acef-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/331)
Nov 22 08:22:25 compute-0 ovn_controller[94843]: 2025-11-22T08:22:25Z|00716|binding|INFO|Claiming lport 9887acef-e389-49e2-87d8-70796da43759 for this additional chassis.
Nov 22 08:22:25 compute-0 ovn_controller[94843]: 2025-11-22T08:22:25Z|00717|binding|INFO|9887acef-e389-49e2-87d8-70796da43759: Claiming fa:16:3e:b4:b4:21 10.100.0.9
Nov 22 08:22:25 compute-0 nova_compute[186544]: 2025-11-22 08:22:25.680 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:25 compute-0 nova_compute[186544]: 2025-11-22 08:22:25.694 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:25 compute-0 ovn_controller[94843]: 2025-11-22T08:22:25Z|00718|binding|INFO|Setting lport 9887acef-e389-49e2-87d8-70796da43759 ovn-installed in OVS
Nov 22 08:22:25 compute-0 nova_compute[186544]: 2025-11-22 08:22:25.701 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:25 compute-0 systemd-udevd[242817]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:22:25 compute-0 systemd-machined[152872]: New machine qemu-83-instance-0000009b.
Nov 22 08:22:25 compute-0 NetworkManager[55036]: <info>  [1763799745.7325] device (tap9887acef-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:22:25 compute-0 NetworkManager[55036]: <info>  [1763799745.7335] device (tap9887acef-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:22:25 compute-0 systemd[1]: Started Virtual Machine qemu-83-instance-0000009b.
Nov 22 08:22:26 compute-0 nova_compute[186544]: 2025-11-22 08:22:26.193 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:27 compute-0 nova_compute[186544]: 2025-11-22 08:22:27.172 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799747.1723278, 1ef73533-7ed2-422b-a432-c1f12dbc7323 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:22:27 compute-0 nova_compute[186544]: 2025-11-22 08:22:27.173 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] VM Started (Lifecycle Event)
Nov 22 08:22:27 compute-0 nova_compute[186544]: 2025-11-22 08:22:27.191 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:22:28 compute-0 nova_compute[186544]: 2025-11-22 08:22:28.304 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:29 compute-0 nova_compute[186544]: 2025-11-22 08:22:29.418 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799749.4176016, 1ef73533-7ed2-422b-a432-c1f12dbc7323 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:22:29 compute-0 nova_compute[186544]: 2025-11-22 08:22:29.419 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] VM Resumed (Lifecycle Event)
Nov 22 08:22:29 compute-0 nova_compute[186544]: 2025-11-22 08:22:29.448 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:22:29 compute-0 nova_compute[186544]: 2025-11-22 08:22:29.452 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:22:29 compute-0 nova_compute[186544]: 2025-11-22 08:22:29.474 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 22 08:22:31 compute-0 nova_compute[186544]: 2025-11-22 08:22:31.194 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:31 compute-0 ovn_controller[94843]: 2025-11-22T08:22:31Z|00719|binding|INFO|Claiming lport 9887acef-e389-49e2-87d8-70796da43759 for this chassis.
Nov 22 08:22:31 compute-0 ovn_controller[94843]: 2025-11-22T08:22:31Z|00720|binding|INFO|9887acef-e389-49e2-87d8-70796da43759: Claiming fa:16:3e:b4:b4:21 10.100.0.9
Nov 22 08:22:31 compute-0 ovn_controller[94843]: 2025-11-22T08:22:31Z|00721|binding|INFO|Setting lport 9887acef-e389-49e2-87d8-70796da43759 up in Southbound
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.285 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b4:21 10.100.0.9'], port_security=['fa:16:3e:b4:b4:21 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee14e96-7070-410d-8934-e305861050e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '11', 'neutron:security_group_ids': '6e5fcd1f-0f97-4f29-8604-c8a08fc32894', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.178'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc22de09-f0b3-4482-a7fb-bd5256ece761, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=9887acef-e389-49e2-87d8-70796da43759) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.287 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 9887acef-e389-49e2-87d8-70796da43759 in datapath cee14e96-7070-410d-8934-e305861050e3 bound to our chassis
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.289 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cee14e96-7070-410d-8934-e305861050e3
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.300 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[45f13402-f946-4337-a1c6-139bab8b6653]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.301 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcee14e96-71 in ovnmeta-cee14e96-7070-410d-8934-e305861050e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.306 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcee14e96-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.306 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[31baff56-04b4-47e2-9c2e-e30c869bbeae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.308 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0878130d-7b19-420a-85d9-28c05e5cd022]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.324 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[4e873a7b-2376-4008-a5e1-3f18aa0a8b4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.342 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e45032ee-485c-4548-9c2b-a0460213a87b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.376 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d399c040-8ac5-43cb-9a9b-5794c0fb81a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.382 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e563042e-827f-4fd7-8c40-5dbed1c3a8e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 NetworkManager[55036]: <info>  [1763799751.3858] manager: (tapcee14e96-70): new Veth device (/org/freedesktop/NetworkManager/Devices/332)
Nov 22 08:22:31 compute-0 systemd-udevd[242911]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.422 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9066829a-8e2d-4960-bf1a-9b5086901263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.426 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ff4f1f97-0879-454b-a65e-4680bdb2294e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 podman[242855]: 2025-11-22 08:22:31.442101034 +0000 UTC m=+0.078436929 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:22:31 compute-0 podman[242851]: 2025-11-22 08:22:31.457094204 +0000 UTC m=+0.095084279 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:22:31 compute-0 podman[242853]: 2025-11-22 08:22:31.460805355 +0000 UTC m=+0.098939954 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:22:31 compute-0 NetworkManager[55036]: <info>  [1763799751.4628] device (tapcee14e96-70): carrier: link connected
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.471 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d93f2d42-5905-4cf3-aea4-2e99df3ebb68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 podman[242856]: 2025-11-22 08:22:31.486165512 +0000 UTC m=+0.121693356 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.490 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[caee5a42-eaa4-4971-a646-35181ca4e759]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcee14e96-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:02:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649210, 'reachable_time': 15560, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242958, 'error': None, 'target': 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.506 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0946dcd4-4468-42c3-a37e-57863b7cff35]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:219'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 649210, 'tstamp': 649210}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242959, 'error': None, 'target': 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.523 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[909d7756-1552-4495-a51c-d22740b52a80]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcee14e96-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:02:19'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649210, 'reachable_time': 15560, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242960, 'error': None, 'target': 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.555 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[01a64954-2da3-455d-9587-cdaf1cef802e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.624 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a976e5ab-41da-4e55-8384-ef3c88d78610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.626 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcee14e96-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.626 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.626 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcee14e96-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:31 compute-0 NetworkManager[55036]: <info>  [1763799751.6295] manager: (tapcee14e96-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/333)
Nov 22 08:22:31 compute-0 kernel: tapcee14e96-70: entered promiscuous mode
Nov 22 08:22:31 compute-0 nova_compute[186544]: 2025-11-22 08:22:31.629 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.632 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcee14e96-70, col_values=(('external_ids', {'iface-id': 'a71e340e-db50-4811-9d79-16735ac5733d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:31 compute-0 nova_compute[186544]: 2025-11-22 08:22:31.634 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:31 compute-0 ovn_controller[94843]: 2025-11-22T08:22:31Z|00722|binding|INFO|Releasing lport a71e340e-db50-4811-9d79-16735ac5733d from this chassis (sb_readonly=0)
Nov 22 08:22:31 compute-0 nova_compute[186544]: 2025-11-22 08:22:31.636 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.636 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cee14e96-7070-410d-8934-e305861050e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cee14e96-7070-410d-8934-e305861050e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.638 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c04c2973-6612-4d45-af29-a30807fd010b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.639 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-cee14e96-7070-410d-8934-e305861050e3
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/cee14e96-7070-410d-8934-e305861050e3.pid.haproxy
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID cee14e96-7070-410d-8934-e305861050e3
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:22:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:31.639 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'env', 'PROCESS_TAG=haproxy-cee14e96-7070-410d-8934-e305861050e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cee14e96-7070-410d-8934-e305861050e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:22:31 compute-0 nova_compute[186544]: 2025-11-22 08:22:31.648 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:31 compute-0 nova_compute[186544]: 2025-11-22 08:22:31.769 186548 INFO nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Post operation of migration started
Nov 22 08:22:32 compute-0 podman[242994]: 2025-11-22 08:22:31.972826392 +0000 UTC m=+0.021876211 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:22:32 compute-0 podman[242994]: 2025-11-22 08:22:32.106689018 +0000 UTC m=+0.155738807 container create fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 08:22:32 compute-0 nova_compute[186544]: 2025-11-22 08:22:32.144 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:22:32 compute-0 nova_compute[186544]: 2025-11-22 08:22:32.145 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquired lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:22:32 compute-0 nova_compute[186544]: 2025-11-22 08:22:32.145 186548 DEBUG nova.network.neutron [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:22:32 compute-0 systemd[1]: Started libpod-conmon-fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea.scope.
Nov 22 08:22:32 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:22:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b77c4f4fae75c3fa2e7990e72d48a948795d68187c1957f004efef546085220/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:22:32 compute-0 podman[242994]: 2025-11-22 08:22:32.201698625 +0000 UTC m=+0.250748434 container init fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:22:32 compute-0 podman[242994]: 2025-11-22 08:22:32.206952084 +0000 UTC m=+0.256001873 container start fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:22:32 compute-0 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[243009]: [NOTICE]   (243013) : New worker (243015) forked
Nov 22 08:22:32 compute-0 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[243009]: [NOTICE]   (243013) : Loading success.
Nov 22 08:22:33 compute-0 nova_compute[186544]: 2025-11-22 08:22:33.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:22:33 compute-0 nova_compute[186544]: 2025-11-22 08:22:33.306 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:34 compute-0 nova_compute[186544]: 2025-11-22 08:22:34.048 186548 DEBUG nova.network.neutron [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:34 compute-0 nova_compute[186544]: 2025-11-22 08:22:34.080 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Releasing lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:22:34 compute-0 nova_compute[186544]: 2025-11-22 08:22:34.104 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:34 compute-0 nova_compute[186544]: 2025-11-22 08:22:34.105 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:34 compute-0 nova_compute[186544]: 2025-11-22 08:22:34.105 186548 DEBUG oslo_concurrency.lockutils [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:34 compute-0 nova_compute[186544]: 2025-11-22 08:22:34.109 186548 INFO nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 22 08:22:34 compute-0 virtqemud[186092]: Domain id=83 name='instance-0000009b' uuid=1ef73533-7ed2-422b-a432-c1f12dbc7323 is tainted: custom-monitor
Nov 22 08:22:35 compute-0 nova_compute[186544]: 2025-11-22 08:22:35.118 186548 INFO nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 22 08:22:36 compute-0 nova_compute[186544]: 2025-11-22 08:22:36.123 186548 INFO nova.virt.libvirt.driver [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 22 08:22:36 compute-0 nova_compute[186544]: 2025-11-22 08:22:36.129 186548 DEBUG nova.compute.manager [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:22:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:36.147 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:22:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:36.148 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:22:36 compute-0 nova_compute[186544]: 2025-11-22 08:22:36.148 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:36 compute-0 nova_compute[186544]: 2025-11-22 08:22:36.161 186548 DEBUG nova.objects.instance [None req-e4405f3d-5d2f-44f7-8210-e90785074824 357aaba6e76c40878384261337fd1de1 b7b451582c1a41c4a0df9385f4b60566 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:22:36 compute-0 nova_compute[186544]: 2025-11-22 08:22:36.196 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.601 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '042f6d127720471aaedb8a1fb7535416', 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'hostId': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.604 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1ef73533-7ed2-422b-a432-c1f12dbc7323 / tap9887acef-e3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.604 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ebc5ac6-446a-41e5-aa81-c3c18599ae4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.601984', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '66fa7516-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': 'bff0dd9332923a1c49b7e3f54a43e86363f04bc218eba1c312511a3838dcdcb5'}]}, 'timestamp': '2025-11-22 08:22:36.604913', '_unique_id': '06c9a448d9f74eb7b989723824033764'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.605 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.606 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.606 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47833717-b1d9-4a64-b22d-babd53bf358b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.606814', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '66facaa2-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': 'ca06d6c38ce945e6f12c9e849b3d7115a5bfee64ca7a3e45987949a87f97fbf6'}]}, 'timestamp': '2025-11-22 08:22:36.607062', '_unique_id': '1d001fc90b1540bc81c1b13c83395f92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.607 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.incoming.bytes volume: 1568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f31929d1-8052-4c73-82fe-e14a653596c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1568, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.608171', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '66fb0062-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': 'aa786a1d739c607c48120b5ed927af679feea781c8b1abbc16e7fe11b5be1345'}]}, 'timestamp': '2025-11-22 08:22:36.608430', '_unique_id': '8a6cdb65b0cf4ef081520b2dcef16ff8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.608 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.634 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.634 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5f43dd3-4bfb-4dbd-a3e4-640b9e9a78e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.609987', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66ff01f8-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '57b91b33a3feab586f81758496e1d4c526386ba2c80a48573c04d719a3f19537'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.609987', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66ff1058-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '5f6e52e44aa4741d433436b8f9dcf722e85961821d60c989220022d292057f5e'}]}, 'timestamp': '2025-11-22 08:22:36.635093', '_unique_id': '8605d63814bd487ea6f537344410987a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.636 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.637 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.637 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98aa72ce-efa4-4b4b-a032-43ce26cac711', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.637339', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '66ff7516-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': '93322769a7bb58937fa9979f87f1dfc665d7d15aa73aa5ae449760ee0ce876c5'}]}, 'timestamp': '2025-11-22 08:22:36.637682', '_unique_id': '25a4b33a6ca54ece9da10fc78d8ca2f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.638 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.639 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.639 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.639 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e8ee7dd-3097-4eda-9415-421d1a428260', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.639349', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66ffc26e-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': 'b65f85885e9ac34206b90c56d994d7bdfc4fc58c4645e793f60330e63e925beb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.639349', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66ffcb7e-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '28d8b888a2c9e48f0dbcf57932f39d65499f3a99f0ad7a5fd3195e4ea5cb1839'}]}, 'timestamp': '2025-11-22 08:22:36.639871', '_unique_id': '5d447c1d553d43b7ad2d356bf46719c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.641 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.655 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/memory.usage volume: 42.4453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88644cd4-e7b7-4af7-8814-be0f36fbaf5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4453125, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'timestamp': '2025-11-22T08:22:36.641374', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '67023ba2-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.347100366, 'message_signature': '47eadb0eb93e82ced4183474d9c90c3a683f578a80b2f4b81e9af41a9411a5d4'}]}, 'timestamp': '2025-11-22 08:22:36.655869', '_unique_id': 'ee444697493a416b9f0532e9d950256f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.656 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.657 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.outgoing.packets volume: 98 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acf79f1f-edb0-4cac-b0ac-17f22052b00e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 98, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.657486', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '6702862a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': '750e0a8004b7efc1d6dab5a2122df3e85caf5c172e2fe4edf60bbc6178e1d709'}]}, 'timestamp': '2025-11-22 08:22:36.657730', '_unique_id': 'a1c1a636cd9b41409587568789b8ab0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.658 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.673 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.674 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7ad9da3-89d1-4b1c-b469-01ded6daeb65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.658840', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67050c74-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.350619352, 'message_signature': '9be23701947c6ceaf0b3bdab3742dd89be6166d0c70993655bf256b3460a3352'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.658840', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67051a3e-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.350619352, 'message_signature': '301570c5ff15d2a3c7208ce3e5fc7291b54344654f6130ee75ad2eabb7114c2e'}]}, 'timestamp': '2025-11-22 08:22:36.674639', '_unique_id': '65a7d6605b6a4cd6877363bfc9ea0358'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.676 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.676 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.write.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.676 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf6a3e30-910c-418c-addc-f9029613e42c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.676545', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67056ec6-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '2372572b4900dbaf68e202d279eee9d2370e59c13e17c4495d0291ebb346a6c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.676545', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67057844-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '621b190dddc828b9164e3fad0c8f381fe05cebfe37092a8c52816e7e9d52555e'}]}, 'timestamp': '2025-11-22 08:22:36.677034', '_unique_id': '29e42cb0ae7a4063bd445c418660c7be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.678 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5d31220-6ce3-403a-84dc-9bd5c5a202a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.678309', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '6705b3ea-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': 'a3aedb4379a1197ae767544d94107a5b5b0d0a2131ad81cf494b4bf24462ce9b'}]}, 'timestamp': '2025-11-22 08:22:36.678597', '_unique_id': 'f4e08a14f2e94d679fb1749c7cf0e889'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.679 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.outgoing.bytes volume: 6228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13c704ae-cc1d-4fc4-9661-76bee6c354db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6228, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.679913', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '6705f22e-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': 'e0cddcf6ba589ba8246e838e2b9fd647d9f6d2279a56a2a0622b86db857f8714'}]}, 'timestamp': '2025-11-22 08:22:36.680172', '_unique_id': '29beae1e82c24f5bb2c2dcbb38c6b129'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.680 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.681 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.681 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.681 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-896988644>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-896988644>]
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.681 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.681 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/cpu volume: 40000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd0f6113-9ce5-4a5a-846e-683a984508bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40000000, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'timestamp': '2025-11-22T08:22:36.681884', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '67063fb8-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.347100366, 'message_signature': 'c7ee5aaac7a62365909310178afd7dbf0c214895a641f5a6353b5ba3a0d6765e'}]}, 'timestamp': '2025-11-22 08:22:36.682181', '_unique_id': '4a1d7cb7086748b38e79fd98eec39733'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.683 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.683 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.write.bytes volume: 28672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.683 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '900c5ffd-2b8b-4180-8120-0d84c7bdfd1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28672, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.683486', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67068644-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '7a575deb2b30b605b54f919cdbed7436513aeef8cb49a60ba6611f3bed2ca12c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.683486', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67068f7c-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '63741e5619d1f06c4058c6f23ec02ef17faa19b7128229f75ef3e801af9437aa'}]}, 'timestamp': '2025-11-22 08:22:36.684184', '_unique_id': 'cdda3b3edcb340ce8222db83dae37bb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.685 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9fa9e2a-4ff9-48d8-ac9e-ce4c8ae1839c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.685768', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '6706d6ee-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': 'ecd9fa398e904d082cb8835e7ae511f73795ba9451060e4c21925ab21537715a'}]}, 'timestamp': '2025-11-22 08:22:36.686022', '_unique_id': '3dcfd5242a394ef3a9496c0e65090a2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.687 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.687 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f91f121-2d44-47fd-b98b-3b6f99de0686', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.687234', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '670710d2-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.350619352, 'message_signature': 'e6066895d79a23e0f8488a1340b500901784c181704a0cc8b553d8a71d2e92d3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.687234', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6707194c-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.350619352, 'message_signature': '79d9c6e85a249e44e33b77acb46ee084c58043764647cc6ab46d0250239bc8e5'}]}, 'timestamp': '2025-11-22 08:22:36.687736', '_unique_id': 'd947665da846434d9e509e36b69cddc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c67b6ff6-be73-48fd-810b-88ddc6f075e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.689009', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67075538-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '0186e6b2e6694ea50ce763802674b61d0aed8dcde31f55b8befa5853cc24b1b2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.689009', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67075ef2-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': 'e75e72b6a9a677b72d1d1d97201eeeddfb76a3f6856669d072651a6f529c8006'}]}, 'timestamp': '2025-11-22 08:22:36.689486', '_unique_id': '7ae8c990173c4a8c85e4cd370d2e5998'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.689 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.690 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0def3591-a7e0-4b98-8fa1-0a47a7b9a2ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.690719', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '67079868-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': 'd5a8938fdf9baf718c4b66ba62741c82536773b257ddb88a7d74888740b6186f'}]}, 'timestamp': '2025-11-22 08:22:36.690981', '_unique_id': 'd72f9bcf8c6140879fd4596b1bcf8ca0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.692 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.692 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-896988644>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-896988644>]
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.692 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.write.latency volume: 8332976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.692 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c50ec95-93bb-4cba-9004-62db6717c4de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8332976, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.692612', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6707e26e-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': '432a3ed0ddd00b6db0c2216b0dcf055c1e693e40d01227dd4abf53e75d5b87af'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.692612', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6707eb1a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.301773646, 'message_signature': 'bf0350e2a06d659dd48789aa6f898f48437c29351fce706cfa869fd76563f23b'}]}, 'timestamp': '2025-11-22 08:22:36.693090', '_unique_id': '56adbd0d8da54690a9b13c425bbbe4e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.693 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.694 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/network.incoming.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25b33e90-d894-4144-b8e3-08a284b89736', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-0000009b-1ef73533-7ed2-422b-a432-c1f12dbc7323-tap9887acef-e3', 'timestamp': '2025-11-22T08:22:36.694342', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'tap9887acef-e3', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b4:b4:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9887acef-e3'}, 'message_id': '670825c6-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.293759837, 'message_signature': '79443a4d77938f5c7ea98f340681bd524da080af970812ec76f81b1311357d54'}]}, 'timestamp': '2025-11-22 08:22:36.694596', '_unique_id': 'de8fd4a482e4401c9f22dac10ded664f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.695 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 DEBUG ceilometer.compute.pollsters [-] 1ef73533-7ed2-422b-a432-c1f12dbc7323/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e61fd222-6dd2-4db9-b66d-0bd66ae81135', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-vda', 'timestamp': '2025-11-22T08:22:36.695772', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67085d48-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.350619352, 'message_signature': '00e929674f2499a9e992156aa89ed41fc6a708c06ee5d007c70e83294e3383bf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323-sda', 'timestamp': '2025-11-22T08:22:36.695772', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-896988644', 'name': 'instance-0000009b', 'instance_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6708657c-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6497.350619352, 'message_signature': '0f17d7ba74441d0e47f5bf11764fe5f220057ba2b1d82c152a1083d8c4a0f839'}]}, 'timestamp': '2025-11-22 08:22:36.696231', '_unique_id': 'bf7202d4b498407da5d8d07e63034f6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.697 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.697 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-896988644>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-896988644>]
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.697 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:22:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:22:36.697 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-896988644>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-896988644>]
Nov 22 08:22:37 compute-0 nova_compute[186544]: 2025-11-22 08:22:37.046 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799742.0449452, 97c19ea5-e0e4-4dac-a78d-b974add3bcfe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:22:37 compute-0 nova_compute[186544]: 2025-11-22 08:22:37.046 186548 INFO nova.compute.manager [-] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] VM Stopped (Lifecycle Event)
Nov 22 08:22:37 compute-0 nova_compute[186544]: 2025-11-22 08:22:37.068 186548 DEBUG nova.compute.manager [None req-f9e44ab3-1397-4d38-886d-1b3af5fa53c3 - - - - - -] [instance: 97c19ea5-e0e4-4dac-a78d-b974add3bcfe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:22:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:37.348 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:37.348 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:37.349 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:38 compute-0 nova_compute[186544]: 2025-11-22 08:22:38.308 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:40 compute-0 nova_compute[186544]: 2025-11-22 08:22:40.449 186548 INFO nova.compute.manager [None req-91588035-6889-408f-9cb5-bc9c965cb065 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Get console output
Nov 22 08:22:40 compute-0 nova_compute[186544]: 2025-11-22 08:22:40.456 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:22:40 compute-0 ovn_controller[94843]: 2025-11-22T08:22:40Z|00723|binding|INFO|Releasing lport a71e340e-db50-4811-9d79-16735ac5733d from this chassis (sb_readonly=0)
Nov 22 08:22:40 compute-0 nova_compute[186544]: 2025-11-22 08:22:40.693 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:41.151 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:41 compute-0 nova_compute[186544]: 2025-11-22 08:22:41.198 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:42 compute-0 nova_compute[186544]: 2025-11-22 08:22:42.246 186548 DEBUG nova.compute.manager [req-70a0cdca-67ee-40b9-bb56-d99452ee28cb req-96098a5c-e8c1-4e44-8060-0304564f7a71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-changed-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:42 compute-0 nova_compute[186544]: 2025-11-22 08:22:42.247 186548 DEBUG nova.compute.manager [req-70a0cdca-67ee-40b9-bb56-d99452ee28cb req-96098a5c-e8c1-4e44-8060-0304564f7a71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Refreshing instance network info cache due to event network-changed-9887acef-e389-49e2-87d8-70796da43759. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:22:42 compute-0 nova_compute[186544]: 2025-11-22 08:22:42.247 186548 DEBUG oslo_concurrency.lockutils [req-70a0cdca-67ee-40b9-bb56-d99452ee28cb req-96098a5c-e8c1-4e44-8060-0304564f7a71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:22:42 compute-0 nova_compute[186544]: 2025-11-22 08:22:42.247 186548 DEBUG oslo_concurrency.lockutils [req-70a0cdca-67ee-40b9-bb56-d99452ee28cb req-96098a5c-e8c1-4e44-8060-0304564f7a71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:22:42 compute-0 nova_compute[186544]: 2025-11-22 08:22:42.247 186548 DEBUG nova.network.neutron [req-70a0cdca-67ee-40b9-bb56-d99452ee28cb req-96098a5c-e8c1-4e44-8060-0304564f7a71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Refreshing network info cache for port 9887acef-e389-49e2-87d8-70796da43759 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:22:42 compute-0 podman[243024]: 2025-11-22 08:22:42.421421427 +0000 UTC m=+0.069487328 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.034 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.035 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.036 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.036 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.036 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.047 186548 INFO nova.compute.manager [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Terminating instance
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.057 186548 DEBUG nova.compute.manager [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:22:43 compute-0 kernel: tap9887acef-e3 (unregistering): left promiscuous mode
Nov 22 08:22:43 compute-0 NetworkManager[55036]: <info>  [1763799763.0880] device (tap9887acef-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00724|binding|INFO|Releasing lport 9887acef-e389-49e2-87d8-70796da43759 from this chassis (sb_readonly=0)
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00725|binding|INFO|Setting lport 9887acef-e389-49e2-87d8-70796da43759 down in Southbound
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00726|binding|INFO|Removing iface tap9887acef-e3 ovn-installed in OVS
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.099 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.119 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Nov 22 08:22:43 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d0000009b.scope: Consumed 2.656s CPU time.
Nov 22 08:22:43 compute-0 systemd-machined[152872]: Machine qemu-83-instance-0000009b terminated.
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.273 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b4:21 10.100.0.9'], port_security=['fa:16:3e:b4:b4:21 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee14e96-7070-410d-8934-e305861050e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '13', 'neutron:security_group_ids': '6e5fcd1f-0f97-4f29-8604-c8a08fc32894', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc22de09-f0b3-4482-a7fb-bd5256ece761, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=9887acef-e389-49e2-87d8-70796da43759) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.275 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 9887acef-e389-49e2-87d8-70796da43759 in datapath cee14e96-7070-410d-8934-e305861050e3 unbound from our chassis
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.276 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cee14e96-7070-410d-8934-e305861050e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.278 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[14a4d5e0-ca65-44d6-a519-3c8afc7951de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 kernel: tap9887acef-e3: entered promiscuous mode
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.280 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cee14e96-7070-410d-8934-e305861050e3 namespace which is not needed anymore
Nov 22 08:22:43 compute-0 kernel: tap9887acef-e3 (unregistering): left promiscuous mode
Nov 22 08:22:43 compute-0 NetworkManager[55036]: <info>  [1763799763.2865] manager: (tap9887acef-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/334)
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00727|binding|INFO|Claiming lport 9887acef-e389-49e2-87d8-70796da43759 for this chassis.
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00728|binding|INFO|9887acef-e389-49e2-87d8-70796da43759: Claiming fa:16:3e:b4:b4:21 10.100.0.9
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.287 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.332 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00729|binding|INFO|Setting lport 9887acef-e389-49e2-87d8-70796da43759 ovn-installed in OVS
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00730|if_status|INFO|Dropped 2 log messages in last 241 seconds (most recently, 241 seconds ago) due to excessive rate
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00731|if_status|INFO|Not setting lport 9887acef-e389-49e2-87d8-70796da43759 down as sb is readonly
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.375 186548 INFO nova.virt.libvirt.driver [-] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Instance destroyed successfully.
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.376 186548 DEBUG nova.objects.instance [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 1ef73533-7ed2-422b-a432-c1f12dbc7323 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.402 186548 DEBUG nova.virt.libvirt.vif [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:21:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-896988644',display_name='tempest-TestNetworkAdvancedServerOps-server-896988644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-896988644',id=155,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGEdgv7ZU5mr1DCHTKMPGMEa2ECF9EUHEhtvPrAip3HJ7nfj7TmONl8h5osSWq7Dqr3V6Hj92ZlV3FEvFnmrY27FQG+lpmv/tm8jg4LcVo4ZQR1NoEXbdJ/E0azh0THluw==',key_name='tempest-TestNetworkAdvancedServerOps-205176852',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:21:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-9m0of0y0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:22:36Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1ef73533-7ed2-422b-a432-c1f12dbc7323,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.403 186548 DEBUG nova.network.os_vif_util [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.404 186548 DEBUG nova.network.os_vif_util [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.404 186548 DEBUG os_vif [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.406 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9887acef-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.410 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.414 186548 INFO os_vif [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b4:21,bridge_name='br-int',has_traffic_filtering=True,id=9887acef-e389-49e2-87d8-70796da43759,network=Network(cee14e96-7070-410d-8934-e305861050e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9887acef-e3')
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.415 186548 INFO nova.virt.libvirt.driver [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Deleting instance files /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323_del
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.416 186548 INFO nova.virt.libvirt.driver [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Deletion of /var/lib/nova/instances/1ef73533-7ed2-422b-a432-c1f12dbc7323_del complete
Nov 22 08:22:43 compute-0 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[243009]: [NOTICE]   (243013) : haproxy version is 2.8.14-c23fe91
Nov 22 08:22:43 compute-0 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[243009]: [NOTICE]   (243013) : path to executable is /usr/sbin/haproxy
Nov 22 08:22:43 compute-0 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[243009]: [WARNING]  (243013) : Exiting Master process...
Nov 22 08:22:43 compute-0 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[243009]: [ALERT]    (243013) : Current worker (243015) exited with code 143 (Terminated)
Nov 22 08:22:43 compute-0 neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3[243009]: [WARNING]  (243013) : All workers exited. Exiting... (0)
Nov 22 08:22:43 compute-0 systemd[1]: libpod-fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea.scope: Deactivated successfully.
Nov 22 08:22:43 compute-0 podman[243084]: 2025-11-22 08:22:43.458357667 +0000 UTC m=+0.049007581 container died fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:22:43 compute-0 ovn_controller[94843]: 2025-11-22T08:22:43Z|00732|binding|INFO|Releasing lport 9887acef-e389-49e2-87d8-70796da43759 from this chassis (sb_readonly=0)
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.467 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b4:21 10.100.0.9'], port_security=['fa:16:3e:b4:b4:21 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee14e96-7070-410d-8934-e305861050e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '13', 'neutron:security_group_ids': '6e5fcd1f-0f97-4f29-8604-c8a08fc32894', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc22de09-f0b3-4482-a7fb-bd5256ece761, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=9887acef-e389-49e2-87d8-70796da43759) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.478 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea-userdata-shm.mount: Deactivated successfully.
Nov 22 08:22:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-0b77c4f4fae75c3fa2e7990e72d48a948795d68187c1957f004efef546085220-merged.mount: Deactivated successfully.
Nov 22 08:22:43 compute-0 podman[243084]: 2025-11-22 08:22:43.511707095 +0000 UTC m=+0.102356989 container cleanup fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:22:43 compute-0 systemd[1]: libpod-conmon-fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea.scope: Deactivated successfully.
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.537 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b4:21 10.100.0.9'], port_security=['fa:16:3e:b4:b4:21 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1ef73533-7ed2-422b-a432-c1f12dbc7323', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cee14e96-7070-410d-8934-e305861050e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '13', 'neutron:security_group_ids': '6e5fcd1f-0f97-4f29-8604-c8a08fc32894', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc22de09-f0b3-4482-a7fb-bd5256ece761, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=9887acef-e389-49e2-87d8-70796da43759) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:22:43 compute-0 podman[243115]: 2025-11-22 08:22:43.587025095 +0000 UTC m=+0.054724002 container remove fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.594 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8a7fc50a-537c-47c8-a952-ae20a98eef6b]: (4, ('Sat Nov 22 08:22:43 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3 (fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea)\nfe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea\nSat Nov 22 08:22:43 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cee14e96-7070-410d-8934-e305861050e3 (fe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea)\nfe5617efbaeedbeac66ad12fd59f5a3620bb04e7c9cb75e79129e6307e3ffdea\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.595 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7ccd701d-2e8a-42c8-bcf3-b94cdbec8188]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.596 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcee14e96-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.598 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 kernel: tapcee14e96-70: left promiscuous mode
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.611 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.613 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[27f8c659-0a92-465a-bfbb-836c8dc2b3e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.635 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6204db88-26ee-4cfd-a04d-687980637c01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.636 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5b258ca3-6a3b-4099-917d-308af5718c02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.641 186548 INFO nova.compute.manager [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Took 0.58 seconds to destroy the instance on the hypervisor.
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.642 186548 DEBUG oslo.service.loopingcall [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.643 186548 DEBUG nova.compute.manager [-] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:22:43 compute-0 nova_compute[186544]: 2025-11-22 08:22:43.643 186548 DEBUG nova.network.neutron [-] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.655 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[14d66f01-2312-4f55-8049-2f4c96d32deb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 649200, 'reachable_time': 29940, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243128, 'error': None, 'target': 'ovnmeta-cee14e96-7070-410d-8934-e305861050e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.658 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cee14e96-7070-410d-8934-e305861050e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.658 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ab2a2253-5f8f-4907-9588-1f3b5c5ac28e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.659 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 9887acef-e389-49e2-87d8-70796da43759 in datapath cee14e96-7070-410d-8934-e305861050e3 unbound from our chassis
Nov 22 08:22:43 compute-0 systemd[1]: run-netns-ovnmeta\x2dcee14e96\x2d7070\x2d410d\x2d8934\x2de305861050e3.mount: Deactivated successfully.
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.660 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cee14e96-7070-410d-8934-e305861050e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.661 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7105206d-52be-48a3-be10-1c445285cf1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.661 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 9887acef-e389-49e2-87d8-70796da43759 in datapath cee14e96-7070-410d-8934-e305861050e3 unbound from our chassis
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.662 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cee14e96-7070-410d-8934-e305861050e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:22:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:22:43.662 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[77dcd4cc-581c-4816-a0b3-82419bef68ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.103 186548 DEBUG nova.network.neutron [req-70a0cdca-67ee-40b9-bb56-d99452ee28cb req-96098a5c-e8c1-4e44-8060-0304564f7a71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updated VIF entry in instance network info cache for port 9887acef-e389-49e2-87d8-70796da43759. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.104 186548 DEBUG nova.network.neutron [req-70a0cdca-67ee-40b9-bb56-d99452ee28cb req-96098a5c-e8c1-4e44-8060-0304564f7a71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [{"id": "9887acef-e389-49e2-87d8-70796da43759", "address": "fa:16:3e:b4:b4:21", "network": {"id": "cee14e96-7070-410d-8934-e305861050e3", "bridge": "br-int", "label": "tempest-network-smoke--303711675", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9887acef-e3", "ovs_interfaceid": "9887acef-e389-49e2-87d8-70796da43759", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.289 186548 DEBUG oslo_concurrency.lockutils [req-70a0cdca-67ee-40b9-bb56-d99452ee28cb req-96098a5c-e8c1-4e44-8060-0304564f7a71 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1ef73533-7ed2-422b-a432-c1f12dbc7323" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.295 186548 DEBUG nova.compute.manager [req-6d16fdd5-3691-4f4c-a78b-91d820fd8436 req-e37f0655-7a8f-42e4-a738-ba86c95d8ded 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.295 186548 DEBUG oslo_concurrency.lockutils [req-6d16fdd5-3691-4f4c-a78b-91d820fd8436 req-e37f0655-7a8f-42e4-a738-ba86c95d8ded 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.296 186548 DEBUG oslo_concurrency.lockutils [req-6d16fdd5-3691-4f4c-a78b-91d820fd8436 req-e37f0655-7a8f-42e4-a738-ba86c95d8ded 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.296 186548 DEBUG oslo_concurrency.lockutils [req-6d16fdd5-3691-4f4c-a78b-91d820fd8436 req-e37f0655-7a8f-42e4-a738-ba86c95d8ded 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.296 186548 DEBUG nova.compute.manager [req-6d16fdd5-3691-4f4c-a78b-91d820fd8436 req-e37f0655-7a8f-42e4-a738-ba86c95d8ded 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:22:44 compute-0 nova_compute[186544]: 2025-11-22 08:22:44.297 186548 DEBUG nova.compute.manager [req-6d16fdd5-3691-4f4c-a78b-91d820fd8436 req-e37f0655-7a8f-42e4-a738-ba86c95d8ded 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-unplugged-9887acef-e389-49e2-87d8-70796da43759 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.143 186548 DEBUG nova.network.neutron [-] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.212 186548 DEBUG nova.compute.manager [req-08216c26-e041-485a-a809-06696b5a7c39 req-7312f53b-ef17-41df-af58-7df8cedf6738 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-deleted-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.213 186548 INFO nova.compute.manager [req-08216c26-e041-485a-a809-06696b5a7c39 req-7312f53b-ef17-41df-af58-7df8cedf6738 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Neutron deleted interface 9887acef-e389-49e2-87d8-70796da43759; detaching it from the instance and deleting it from the info cache
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.214 186548 DEBUG nova.network.neutron [req-08216c26-e041-485a-a809-06696b5a7c39 req-7312f53b-ef17-41df-af58-7df8cedf6738 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.326 186548 INFO nova.compute.manager [-] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Took 1.68 seconds to deallocate network for instance.
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.341 186548 DEBUG nova.compute.manager [req-08216c26-e041-485a-a809-06696b5a7c39 req-7312f53b-ef17-41df-af58-7df8cedf6738 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Detach interface failed, port_id=9887acef-e389-49e2-87d8-70796da43759, reason: Instance 1ef73533-7ed2-422b-a432-c1f12dbc7323 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 08:22:45 compute-0 podman[243129]: 2025-11-22 08:22:45.420519931 +0000 UTC m=+0.064400772 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:22:45 compute-0 podman[243130]: 2025-11-22 08:22:45.432076386 +0000 UTC m=+0.072104602 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.627 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.627 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.634 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:45 compute-0 nova_compute[186544]: 2025-11-22 08:22:45.946 186548 INFO nova.scheduler.client.report [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance 1ef73533-7ed2-422b-a432-c1f12dbc7323
Nov 22 08:22:46 compute-0 nova_compute[186544]: 2025-11-22 08:22:46.183 186548 DEBUG oslo_concurrency.lockutils [None req-d6fb423f-f621-49e8-b26d-df207a0bcd11 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:46 compute-0 nova_compute[186544]: 2025-11-22 08:22:46.201 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:46 compute-0 nova_compute[186544]: 2025-11-22 08:22:46.388 186548 DEBUG nova.compute.manager [req-fd21faac-f2bf-43ab-8170-1b64d98bf367 req-6327dfc1-bf94-4eda-9872-7703e3465505 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:22:46 compute-0 nova_compute[186544]: 2025-11-22 08:22:46.388 186548 DEBUG oslo_concurrency.lockutils [req-fd21faac-f2bf-43ab-8170-1b64d98bf367 req-6327dfc1-bf94-4eda-9872-7703e3465505 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:22:46 compute-0 nova_compute[186544]: 2025-11-22 08:22:46.388 186548 DEBUG oslo_concurrency.lockutils [req-fd21faac-f2bf-43ab-8170-1b64d98bf367 req-6327dfc1-bf94-4eda-9872-7703e3465505 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:22:46 compute-0 nova_compute[186544]: 2025-11-22 08:22:46.388 186548 DEBUG oslo_concurrency.lockutils [req-fd21faac-f2bf-43ab-8170-1b64d98bf367 req-6327dfc1-bf94-4eda-9872-7703e3465505 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1ef73533-7ed2-422b-a432-c1f12dbc7323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:22:46 compute-0 nova_compute[186544]: 2025-11-22 08:22:46.389 186548 DEBUG nova.compute.manager [req-fd21faac-f2bf-43ab-8170-1b64d98bf367 req-6327dfc1-bf94-4eda-9872-7703e3465505 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] No waiting events found dispatching network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:22:46 compute-0 nova_compute[186544]: 2025-11-22 08:22:46.389 186548 WARNING nova.compute.manager [req-fd21faac-f2bf-43ab-8170-1b64d98bf367 req-6327dfc1-bf94-4eda-9872-7703e3465505 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Received unexpected event network-vif-plugged-9887acef-e389-49e2-87d8-70796da43759 for instance with vm_state deleted and task_state None.
Nov 22 08:22:48 compute-0 nova_compute[186544]: 2025-11-22 08:22:48.409 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:51 compute-0 nova_compute[186544]: 2025-11-22 08:22:51.203 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:53 compute-0 nova_compute[186544]: 2025-11-22 08:22:53.412 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:54 compute-0 nova_compute[186544]: 2025-11-22 08:22:54.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:54 compute-0 nova_compute[186544]: 2025-11-22 08:22:54.381 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:56 compute-0 nova_compute[186544]: 2025-11-22 08:22:56.205 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:22:58 compute-0 nova_compute[186544]: 2025-11-22 08:22:58.373 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799763.373291, 1ef73533-7ed2-422b-a432-c1f12dbc7323 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:22:58 compute-0 nova_compute[186544]: 2025-11-22 08:22:58.374 186548 INFO nova.compute.manager [-] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] VM Stopped (Lifecycle Event)
Nov 22 08:22:58 compute-0 nova_compute[186544]: 2025-11-22 08:22:58.394 186548 DEBUG nova.compute.manager [None req-97444888-2ab3-48a9-9c4f-450025e1962b - - - - - -] [instance: 1ef73533-7ed2-422b-a432-c1f12dbc7323] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:22:58 compute-0 nova_compute[186544]: 2025-11-22 08:22:58.415 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:01 compute-0 nova_compute[186544]: 2025-11-22 08:23:01.206 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:02 compute-0 podman[243175]: 2025-11-22 08:23:02.414618291 +0000 UTC m=+0.055672076 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:23:02 compute-0 podman[243174]: 2025-11-22 08:23:02.429489659 +0000 UTC m=+0.072018860 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:23:02 compute-0 podman[243173]: 2025-11-22 08:23:02.432402171 +0000 UTC m=+0.075885955 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:23:02 compute-0 podman[243176]: 2025-11-22 08:23:02.488188089 +0000 UTC m=+0.123482762 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:23:03 compute-0 nova_compute[186544]: 2025-11-22 08:23:03.416 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.192 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.418 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.419 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5694MB free_disk=73.1367073059082GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.419 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.419 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.477 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.478 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.507 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:23:04 compute-0 nova_compute[186544]: 2025-11-22 08:23:04.518 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:23:05 compute-0 nova_compute[186544]: 2025-11-22 08:23:05.156 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:23:05 compute-0 nova_compute[186544]: 2025-11-22 08:23:05.157 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:06 compute-0 nova_compute[186544]: 2025-11-22 08:23:06.209 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:08 compute-0 nova_compute[186544]: 2025-11-22 08:23:08.419 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:10 compute-0 nova_compute[186544]: 2025-11-22 08:23:10.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:10 compute-0 nova_compute[186544]: 2025-11-22 08:23:10.158 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:23:10 compute-0 nova_compute[186544]: 2025-11-22 08:23:10.158 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:23:10 compute-0 nova_compute[186544]: 2025-11-22 08:23:10.172 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:23:11 compute-0 nova_compute[186544]: 2025-11-22 08:23:11.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:11 compute-0 nova_compute[186544]: 2025-11-22 08:23:11.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:11 compute-0 nova_compute[186544]: 2025-11-22 08:23:11.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:23:11 compute-0 nova_compute[186544]: 2025-11-22 08:23:11.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:12 compute-0 nova_compute[186544]: 2025-11-22 08:23:12.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:12 compute-0 nova_compute[186544]: 2025-11-22 08:23:12.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:13 compute-0 podman[243262]: 2025-11-22 08:23:13.410199196 +0000 UTC m=+0.060244339 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 22 08:23:13 compute-0 nova_compute[186544]: 2025-11-22 08:23:13.420 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:16 compute-0 nova_compute[186544]: 2025-11-22 08:23:16.213 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:16.246 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:f1:bb 10.100.0.2 2001:db8::f816:3eff:fe10:f1bb'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe10:f1bb/64', 'neutron:device_id': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c110aad-90e5-4caa-b631-3c18861eaadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e1bc69f6-ec55-4040-be0d-44f334cbe3a6) old=Port_Binding(mac=['fa:16:3e:10:f1:bb 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:23:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:16.247 103805 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e1bc69f6-ec55-4040-be0d-44f334cbe3a6 in datapath 326c0814-77d4-416b-a5a1-28be00b61ecd updated
Nov 22 08:23:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:16.248 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 326c0814-77d4-416b-a5a1-28be00b61ecd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:23:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:16.249 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3a6f1aeb-8b18-42bc-9882-8b272c4f7ce2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:16 compute-0 podman[243282]: 2025-11-22 08:23:16.406293836 +0000 UTC m=+0.056822714 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:23:16 compute-0 podman[243283]: 2025-11-22 08:23:16.417137284 +0000 UTC m=+0.063855268 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41)
Nov 22 08:23:17 compute-0 nova_compute[186544]: 2025-11-22 08:23:17.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:17 compute-0 nova_compute[186544]: 2025-11-22 08:23:17.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:18 compute-0 nova_compute[186544]: 2025-11-22 08:23:18.423 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:20 compute-0 nova_compute[186544]: 2025-11-22 08:23:20.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:23:21 compute-0 nova_compute[186544]: 2025-11-22 08:23:21.215 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:23 compute-0 nova_compute[186544]: 2025-11-22 08:23:23.424 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:26.122 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.123 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:26.124 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.216 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.584 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "60603ba6-4ebe-48b0-94b1-5656695c9ded" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.584 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.604 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.754 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.755 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.761 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.762 186548 INFO nova.compute.claims [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.915 186548 DEBUG nova.compute.provider_tree [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:23:26 compute-0 nova_compute[186544]: 2025-11-22 08:23:26.927 186548 DEBUG nova.scheduler.client.report [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.037 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.038 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.126 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.127 186548 DEBUG nova.network.neutron [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.157 186548 INFO nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.177 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.283 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.284 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.284 186548 INFO nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Creating image(s)
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.285 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "/var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.285 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.286 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "/var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.300 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.352 186548 DEBUG nova.policy [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '809b865601654264af5bff7f49127cea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.367 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.368 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.369 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.381 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.451 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.452 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.828 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk 1073741824" returned: 0 in 0.377s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.830 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.830 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.893 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.894 186548 DEBUG nova.virt.disk.api [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Checking if we can resize image /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.895 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.962 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.963 186548 DEBUG nova.virt.disk.api [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Cannot resize image /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.964 186548 DEBUG nova.objects.instance [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'migration_context' on Instance uuid 60603ba6-4ebe-48b0-94b1-5656695c9ded obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.978 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.979 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Ensure instance console log exists: /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.980 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.980 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:27 compute-0 nova_compute[186544]: 2025-11-22 08:23:27.981 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:28 compute-0 nova_compute[186544]: 2025-11-22 08:23:28.427 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:28 compute-0 nova_compute[186544]: 2025-11-22 08:23:28.785 186548 DEBUG nova.network.neutron [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Successfully created port: 60cf0e4a-972c-4c7e-ab24-65a810d650bb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:23:30 compute-0 nova_compute[186544]: 2025-11-22 08:23:30.226 186548 DEBUG nova.network.neutron [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Successfully updated port: 60cf0e4a-972c-4c7e-ab24-65a810d650bb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:23:30 compute-0 nova_compute[186544]: 2025-11-22 08:23:30.254 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:23:30 compute-0 nova_compute[186544]: 2025-11-22 08:23:30.254 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquired lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:23:30 compute-0 nova_compute[186544]: 2025-11-22 08:23:30.255 186548 DEBUG nova.network.neutron [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:23:30 compute-0 nova_compute[186544]: 2025-11-22 08:23:30.331 186548 DEBUG nova.compute.manager [req-2bf2002f-f6d0-4282-9c82-61c0c3412aea req-d7aa6674-f5ab-406f-872f-aa47444ae931 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-changed-60cf0e4a-972c-4c7e-ab24-65a810d650bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:23:30 compute-0 nova_compute[186544]: 2025-11-22 08:23:30.332 186548 DEBUG nova.compute.manager [req-2bf2002f-f6d0-4282-9c82-61c0c3412aea req-d7aa6674-f5ab-406f-872f-aa47444ae931 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Refreshing instance network info cache due to event network-changed-60cf0e4a-972c-4c7e-ab24-65a810d650bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:23:30 compute-0 nova_compute[186544]: 2025-11-22 08:23:30.332 186548 DEBUG oslo_concurrency.lockutils [req-2bf2002f-f6d0-4282-9c82-61c0c3412aea req-d7aa6674-f5ab-406f-872f-aa47444ae931 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:23:30 compute-0 nova_compute[186544]: 2025-11-22 08:23:30.763 186548 DEBUG nova.network.neutron [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:23:31 compute-0 nova_compute[186544]: 2025-11-22 08:23:31.218 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:33 compute-0 podman[243344]: 2025-11-22 08:23:33.40297159 +0000 UTC m=+0.048534609 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:23:33 compute-0 podman[243343]: 2025-11-22 08:23:33.403220017 +0000 UTC m=+0.052246042 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:23:33 compute-0 podman[243342]: 2025-11-22 08:23:33.413150382 +0000 UTC m=+0.064471003 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:23:33 compute-0 nova_compute[186544]: 2025-11-22 08:23:33.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:33 compute-0 podman[243345]: 2025-11-22 08:23:33.440040256 +0000 UTC m=+0.080710435 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:23:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:34.126 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.251 186548 DEBUG nova.network.neutron [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updating instance_info_cache with network_info: [{"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.277 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Releasing lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.278 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Instance network_info: |[{"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.278 186548 DEBUG oslo_concurrency.lockutils [req-2bf2002f-f6d0-4282-9c82-61c0c3412aea req-d7aa6674-f5ab-406f-872f-aa47444ae931 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.279 186548 DEBUG nova.network.neutron [req-2bf2002f-f6d0-4282-9c82-61c0c3412aea req-d7aa6674-f5ab-406f-872f-aa47444ae931 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Refreshing network info cache for port 60cf0e4a-972c-4c7e-ab24-65a810d650bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.281 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Start _get_guest_xml network_info=[{"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.286 186548 WARNING nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.290 186548 DEBUG nova.virt.libvirt.host [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.290 186548 DEBUG nova.virt.libvirt.host [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.296 186548 DEBUG nova.virt.libvirt.host [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.296 186548 DEBUG nova.virt.libvirt.host [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.299 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.299 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.300 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.300 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.300 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.301 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.301 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.301 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.302 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.302 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.302 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.302 186548 DEBUG nova.virt.hardware [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.306 186548 DEBUG nova.virt.libvirt.vif [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:23:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-454565421',display_name='tempest-TestGettingAddress-server-454565421',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-454565421',id=158,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDo9IOjv1g7RgDVY2BCJ+mmy9Na2DHnkZ9mE8BRZMtj9iwqLsOttn1FTaKP3MHHav783x9ncOoaXWlLdYlMz7ANRW+4cS5nXI1iryizo5WsS7HuJMsWvhJA60lnbjcIyew==',key_name='tempest-TestGettingAddress-814265221',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-owca6uo7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:23:27Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=60603ba6-4ebe-48b0-94b1-5656695c9ded,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.306 186548 DEBUG nova.network.os_vif_util [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.307 186548 DEBUG nova.network.os_vif_util [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:d5:7b,bridge_name='br-int',has_traffic_filtering=True,id=60cf0e4a-972c-4c7e-ab24-65a810d650bb,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cf0e4a-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.308 186548 DEBUG nova.objects.instance [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'pci_devices' on Instance uuid 60603ba6-4ebe-48b0-94b1-5656695c9ded obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.323 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <uuid>60603ba6-4ebe-48b0-94b1-5656695c9ded</uuid>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <name>instance-0000009e</name>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <nova:name>tempest-TestGettingAddress-server-454565421</nova:name>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:23:35</nova:creationTime>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:23:35 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:23:35 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:23:35 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:23:35 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:23:35 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:23:35 compute-0 nova_compute[186544]:         <nova:user uuid="809b865601654264af5bff7f49127cea">tempest-TestGettingAddress-25838038-project-member</nova:user>
Nov 22 08:23:35 compute-0 nova_compute[186544]:         <nova:project uuid="c4200f1d1fbb44a5aaf5e3578f6354ae">tempest-TestGettingAddress-25838038</nova:project>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:23:35 compute-0 nova_compute[186544]:         <nova:port uuid="60cf0e4a-972c-4c7e-ab24-65a810d650bb">
Nov 22 08:23:35 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe92:d57b" ipVersion="6"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <system>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <entry name="serial">60603ba6-4ebe-48b0-94b1-5656695c9ded</entry>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <entry name="uuid">60603ba6-4ebe-48b0-94b1-5656695c9ded</entry>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </system>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <os>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   </os>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <features>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   </features>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.config"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:92:d5:7b"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <target dev="tap60cf0e4a-97"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/console.log" append="off"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <video>
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </video>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:23:35 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:23:35 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:23:35 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:23:35 compute-0 nova_compute[186544]: </domain>
Nov 22 08:23:35 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.324 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Preparing to wait for external event network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.324 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.324 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.325 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.325 186548 DEBUG nova.virt.libvirt.vif [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:23:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-454565421',display_name='tempest-TestGettingAddress-server-454565421',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-454565421',id=158,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDo9IOjv1g7RgDVY2BCJ+mmy9Na2DHnkZ9mE8BRZMtj9iwqLsOttn1FTaKP3MHHav783x9ncOoaXWlLdYlMz7ANRW+4cS5nXI1iryizo5WsS7HuJMsWvhJA60lnbjcIyew==',key_name='tempest-TestGettingAddress-814265221',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-owca6uo7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:23:27Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=60603ba6-4ebe-48b0-94b1-5656695c9ded,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.326 186548 DEBUG nova.network.os_vif_util [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.326 186548 DEBUG nova.network.os_vif_util [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:d5:7b,bridge_name='br-int',has_traffic_filtering=True,id=60cf0e4a-972c-4c7e-ab24-65a810d650bb,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cf0e4a-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.327 186548 DEBUG os_vif [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:d5:7b,bridge_name='br-int',has_traffic_filtering=True,id=60cf0e4a-972c-4c7e-ab24-65a810d650bb,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cf0e4a-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.327 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.327 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.328 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.330 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.330 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60cf0e4a-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.330 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60cf0e4a-97, col_values=(('external_ids', {'iface-id': '60cf0e4a-972c-4c7e-ab24-65a810d650bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:d5:7b', 'vm-uuid': '60603ba6-4ebe-48b0-94b1-5656695c9ded'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:23:35 compute-0 NetworkManager[55036]: <info>  [1763799815.3327] manager: (tap60cf0e4a-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.334 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.337 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.338 186548 INFO os_vif [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:d5:7b,bridge_name='br-int',has_traffic_filtering=True,id=60cf0e4a-972c-4c7e-ab24-65a810d650bb,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cf0e4a-97')
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.416 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.417 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.417 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] No VIF found with MAC fa:16:3e:92:d5:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.417 186548 INFO nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Using config drive
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.831 186548 INFO nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Creating config drive at /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.config
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.835 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjfpmyhgj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:23:35 compute-0 nova_compute[186544]: 2025-11-22 08:23:35.959 186548 DEBUG oslo_concurrency.processutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjfpmyhgj" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:23:36 compute-0 kernel: tap60cf0e4a-97: entered promiscuous mode
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.028 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:36 compute-0 NetworkManager[55036]: <info>  [1763799816.0290] manager: (tap60cf0e4a-97): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Nov 22 08:23:36 compute-0 ovn_controller[94843]: 2025-11-22T08:23:36Z|00733|binding|INFO|Claiming lport 60cf0e4a-972c-4c7e-ab24-65a810d650bb for this chassis.
Nov 22 08:23:36 compute-0 ovn_controller[94843]: 2025-11-22T08:23:36Z|00734|binding|INFO|60cf0e4a-972c-4c7e-ab24-65a810d650bb: Claiming fa:16:3e:92:d5:7b 10.100.0.9 2001:db8::f816:3eff:fe92:d57b
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.036 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.051 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:d5:7b 10.100.0.9 2001:db8::f816:3eff:fe92:d57b'], port_security=['fa:16:3e:92:d5:7b 10.100.0.9 2001:db8::f816:3eff:fe92:d57b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8::f816:3eff:fe92:d57b/64', 'neutron:device_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a58d74bd-bc51-4723-b0e3-c855953c364c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c110aad-90e5-4caa-b631-3c18861eaadf, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=60cf0e4a-972c-4c7e-ab24-65a810d650bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.053 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 60cf0e4a-972c-4c7e-ab24-65a810d650bb in datapath 326c0814-77d4-416b-a5a1-28be00b61ecd bound to our chassis
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.054 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 326c0814-77d4-416b-a5a1-28be00b61ecd
Nov 22 08:23:36 compute-0 systemd-udevd[243447]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.067 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[40661cf3-5aee-4ac0-968b-06c15fbe50c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.068 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap326c0814-71 in ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.070 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap326c0814-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.070 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8fdf02-5a03-4f9d-ae41-22191f99ef3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.071 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ac90d6c7-ed35-4437-8ccd-aa91202696e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 systemd-machined[152872]: New machine qemu-84-instance-0000009e.
Nov 22 08:23:36 compute-0 NetworkManager[55036]: <info>  [1763799816.0825] device (tap60cf0e4a-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:23:36 compute-0 NetworkManager[55036]: <info>  [1763799816.0833] device (tap60cf0e4a-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.083 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[2a23cfe6-95f5-4d04-b04b-dbaff7ffd64b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.087 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:36 compute-0 systemd[1]: Started Virtual Machine qemu-84-instance-0000009e.
Nov 22 08:23:36 compute-0 ovn_controller[94843]: 2025-11-22T08:23:36Z|00735|binding|INFO|Setting lport 60cf0e4a-972c-4c7e-ab24-65a810d650bb ovn-installed in OVS
Nov 22 08:23:36 compute-0 ovn_controller[94843]: 2025-11-22T08:23:36Z|00736|binding|INFO|Setting lport 60cf0e4a-972c-4c7e-ab24-65a810d650bb up in Southbound
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.095 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.099 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d8190329-5282-4ca1-b66a-4b13721a9df2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.129 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[55d9d464-bcbe-4828-8bf1-9c5006df78a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.134 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1d370118-f39b-4c98-b011-370e11a45938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 NetworkManager[55036]: <info>  [1763799816.1353] manager: (tap326c0814-70): new Veth device (/org/freedesktop/NetworkManager/Devices/337)
Nov 22 08:23:36 compute-0 systemd-udevd[243452]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.164 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[bef0af64-9f7a-423b-b757-672f9aaab6ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.167 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0fe8b7c4-6911-4eab-b641-045b5b334b29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 NetworkManager[55036]: <info>  [1763799816.1908] device (tap326c0814-70): carrier: link connected
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.196 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d04e31a1-21cd-4826-aa30-7ffeb435ce03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.214 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f3fa19af-c271-456f-92c6-8093b37dcf36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap326c0814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655682, 'reachable_time': 42591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243481, 'error': None, 'target': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.220 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.229 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4b466f20-5b0f-44fd-9260-808f2cb26f22]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:f1bb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655682, 'tstamp': 655682}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243482, 'error': None, 'target': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.245 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a31e18-79a0-423d-a7ff-c240bcd4a2d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap326c0814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 220], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655682, 'reachable_time': 42591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243483, 'error': None, 'target': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.280 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[65ba9932-cc50-40e2-97f3-fd2e84312641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.295 186548 DEBUG nova.compute.manager [req-81b5ae6d-2259-48f5-bdfb-4ca79ce408be req-cc69d1fb-813e-40b4-bd3e-67816ad9b6b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.296 186548 DEBUG oslo_concurrency.lockutils [req-81b5ae6d-2259-48f5-bdfb-4ca79ce408be req-cc69d1fb-813e-40b4-bd3e-67816ad9b6b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.296 186548 DEBUG oslo_concurrency.lockutils [req-81b5ae6d-2259-48f5-bdfb-4ca79ce408be req-cc69d1fb-813e-40b4-bd3e-67816ad9b6b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.296 186548 DEBUG oslo_concurrency.lockutils [req-81b5ae6d-2259-48f5-bdfb-4ca79ce408be req-cc69d1fb-813e-40b4-bd3e-67816ad9b6b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.297 186548 DEBUG nova.compute.manager [req-81b5ae6d-2259-48f5-bdfb-4ca79ce408be req-cc69d1fb-813e-40b4-bd3e-67816ad9b6b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Processing event network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.358 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5cab4767-1209-4306-bfc8-69725fc21a38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.359 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap326c0814-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.359 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.360 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap326c0814-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:23:36 compute-0 NetworkManager[55036]: <info>  [1763799816.3621] manager: (tap326c0814-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Nov 22 08:23:36 compute-0 kernel: tap326c0814-70: entered promiscuous mode
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.362 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.365 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap326c0814-70, col_values=(('external_ids', {'iface-id': 'e1bc69f6-ec55-4040-be0d-44f334cbe3a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:23:36 compute-0 ovn_controller[94843]: 2025-11-22T08:23:36Z|00737|binding|INFO|Releasing lport e1bc69f6-ec55-4040-be0d-44f334cbe3a6 from this chassis (sb_readonly=0)
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.377 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.378 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/326c0814-77d4-416b-a5a1-28be00b61ecd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/326c0814-77d4-416b-a5a1-28be00b61ecd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.379 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d76fbd0c-e6ac-4def-b5de-100b737836a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.380 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-326c0814-77d4-416b-a5a1-28be00b61ecd
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/326c0814-77d4-416b-a5a1-28be00b61ecd.pid.haproxy
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 326c0814-77d4-416b-a5a1-28be00b61ecd
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:23:36 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:36.381 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'env', 'PROCESS_TAG=haproxy-326c0814-77d4-416b-a5a1-28be00b61ecd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/326c0814-77d4-416b-a5a1-28be00b61ecd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.811 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.812 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799816.8107243, 60603ba6-4ebe-48b0-94b1-5656695c9ded => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.813 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] VM Started (Lifecycle Event)
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.818 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:23:36 compute-0 podman[243516]: 2025-11-22 08:23:36.726579489 +0000 UTC m=+0.022805254 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.822 186548 INFO nova.virt.libvirt.driver [-] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Instance spawned successfully.
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.823 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.842 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.847 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.851 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.851 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.852 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.852 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.853 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.853 186548 DEBUG nova.virt.libvirt.driver [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.887 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.887 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799816.810869, 60603ba6-4ebe-48b0-94b1-5656695c9ded => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.887 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] VM Paused (Lifecycle Event)
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.908 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.911 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763799816.8139055, 60603ba6-4ebe-48b0-94b1-5656695c9ded => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.911 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] VM Resumed (Lifecycle Event)
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.927 186548 INFO nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Took 9.64 seconds to spawn the instance on the hypervisor.
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.928 186548 DEBUG nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.934 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.936 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.963 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:23:36 compute-0 nova_compute[186544]: 2025-11-22 08:23:36.991 186548 INFO nova.compute.manager [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Took 10.28 seconds to build instance.
Nov 22 08:23:37 compute-0 nova_compute[186544]: 2025-11-22 08:23:37.004 186548 DEBUG oslo_concurrency.lockutils [None req-784d65d6-5b03-44f0-b46e-949734caeeee 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:37 compute-0 podman[243516]: 2025-11-22 08:23:37.301906139 +0000 UTC m=+0.598131884 container create f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:23:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:37.349 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:37.351 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:23:37.351 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:37 compute-0 systemd[1]: Started libpod-conmon-f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829.scope.
Nov 22 08:23:37 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:23:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db34ff5b1144b85d571e2071fae137b4bb682833cde66e4de5960f06c13d1f71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:23:37 compute-0 nova_compute[186544]: 2025-11-22 08:23:37.732 186548 DEBUG nova.network.neutron [req-2bf2002f-f6d0-4282-9c82-61c0c3412aea req-d7aa6674-f5ab-406f-872f-aa47444ae931 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updated VIF entry in instance network info cache for port 60cf0e4a-972c-4c7e-ab24-65a810d650bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:23:37 compute-0 nova_compute[186544]: 2025-11-22 08:23:37.733 186548 DEBUG nova.network.neutron [req-2bf2002f-f6d0-4282-9c82-61c0c3412aea req-d7aa6674-f5ab-406f-872f-aa47444ae931 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updating instance_info_cache with network_info: [{"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:23:37 compute-0 podman[243516]: 2025-11-22 08:23:37.741060526 +0000 UTC m=+1.037286291 container init f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 08:23:37 compute-0 podman[243516]: 2025-11-22 08:23:37.746770377 +0000 UTC m=+1.042996122 container start f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 08:23:37 compute-0 nova_compute[186544]: 2025-11-22 08:23:37.748 186548 DEBUG oslo_concurrency.lockutils [req-2bf2002f-f6d0-4282-9c82-61c0c3412aea req-d7aa6674-f5ab-406f-872f-aa47444ae931 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:23:37 compute-0 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[243537]: [NOTICE]   (243541) : New worker (243543) forked
Nov 22 08:23:37 compute-0 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[243537]: [NOTICE]   (243541) : Loading success.
Nov 22 08:23:38 compute-0 nova_compute[186544]: 2025-11-22 08:23:38.434 186548 DEBUG nova.compute.manager [req-e585efc3-42c4-48d2-b56f-87ba4060f3d5 req-9e35f5f4-1f2a-40d4-830b-ad1ba892f535 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:23:38 compute-0 nova_compute[186544]: 2025-11-22 08:23:38.434 186548 DEBUG oslo_concurrency.lockutils [req-e585efc3-42c4-48d2-b56f-87ba4060f3d5 req-9e35f5f4-1f2a-40d4-830b-ad1ba892f535 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:23:38 compute-0 nova_compute[186544]: 2025-11-22 08:23:38.435 186548 DEBUG oslo_concurrency.lockutils [req-e585efc3-42c4-48d2-b56f-87ba4060f3d5 req-9e35f5f4-1f2a-40d4-830b-ad1ba892f535 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:23:38 compute-0 nova_compute[186544]: 2025-11-22 08:23:38.435 186548 DEBUG oslo_concurrency.lockutils [req-e585efc3-42c4-48d2-b56f-87ba4060f3d5 req-9e35f5f4-1f2a-40d4-830b-ad1ba892f535 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:23:38 compute-0 nova_compute[186544]: 2025-11-22 08:23:38.435 186548 DEBUG nova.compute.manager [req-e585efc3-42c4-48d2-b56f-87ba4060f3d5 req-9e35f5f4-1f2a-40d4-830b-ad1ba892f535 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] No waiting events found dispatching network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:23:38 compute-0 nova_compute[186544]: 2025-11-22 08:23:38.436 186548 WARNING nova.compute.manager [req-e585efc3-42c4-48d2-b56f-87ba4060f3d5 req-9e35f5f4-1f2a-40d4-830b-ad1ba892f535 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received unexpected event network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb for instance with vm_state active and task_state None.
Nov 22 08:23:40 compute-0 nova_compute[186544]: 2025-11-22 08:23:40.334 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:41 compute-0 nova_compute[186544]: 2025-11-22 08:23:41.222 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:42 compute-0 nova_compute[186544]: 2025-11-22 08:23:42.365 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:42 compute-0 NetworkManager[55036]: <info>  [1763799822.3663] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Nov 22 08:23:42 compute-0 NetworkManager[55036]: <info>  [1763799822.3675] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Nov 22 08:23:42 compute-0 nova_compute[186544]: 2025-11-22 08:23:42.432 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:42 compute-0 ovn_controller[94843]: 2025-11-22T08:23:42Z|00738|binding|INFO|Releasing lport e1bc69f6-ec55-4040-be0d-44f334cbe3a6 from this chassis (sb_readonly=0)
Nov 22 08:23:42 compute-0 nova_compute[186544]: 2025-11-22 08:23:42.442 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:42 compute-0 nova_compute[186544]: 2025-11-22 08:23:42.658 186548 DEBUG nova.compute.manager [req-08b5169a-6ce7-45dc-845f-a42fb68d503f req-d76d242f-0012-4f26-bffe-9b0e3ad19fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-changed-60cf0e4a-972c-4c7e-ab24-65a810d650bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:23:42 compute-0 nova_compute[186544]: 2025-11-22 08:23:42.658 186548 DEBUG nova.compute.manager [req-08b5169a-6ce7-45dc-845f-a42fb68d503f req-d76d242f-0012-4f26-bffe-9b0e3ad19fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Refreshing instance network info cache due to event network-changed-60cf0e4a-972c-4c7e-ab24-65a810d650bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:23:42 compute-0 nova_compute[186544]: 2025-11-22 08:23:42.658 186548 DEBUG oslo_concurrency.lockutils [req-08b5169a-6ce7-45dc-845f-a42fb68d503f req-d76d242f-0012-4f26-bffe-9b0e3ad19fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:23:42 compute-0 nova_compute[186544]: 2025-11-22 08:23:42.659 186548 DEBUG oslo_concurrency.lockutils [req-08b5169a-6ce7-45dc-845f-a42fb68d503f req-d76d242f-0012-4f26-bffe-9b0e3ad19fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:23:42 compute-0 nova_compute[186544]: 2025-11-22 08:23:42.659 186548 DEBUG nova.network.neutron [req-08b5169a-6ce7-45dc-845f-a42fb68d503f req-d76d242f-0012-4f26-bffe-9b0e3ad19fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Refreshing network info cache for port 60cf0e4a-972c-4c7e-ab24-65a810d650bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:23:44 compute-0 podman[243553]: 2025-11-22 08:23:44.41163427 +0000 UTC m=+0.056299691 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:23:44 compute-0 nova_compute[186544]: 2025-11-22 08:23:44.612 186548 DEBUG nova.network.neutron [req-08b5169a-6ce7-45dc-845f-a42fb68d503f req-d76d242f-0012-4f26-bffe-9b0e3ad19fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updated VIF entry in instance network info cache for port 60cf0e4a-972c-4c7e-ab24-65a810d650bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:23:44 compute-0 nova_compute[186544]: 2025-11-22 08:23:44.613 186548 DEBUG nova.network.neutron [req-08b5169a-6ce7-45dc-845f-a42fb68d503f req-d76d242f-0012-4f26-bffe-9b0e3ad19fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updating instance_info_cache with network_info: [{"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:23:44 compute-0 nova_compute[186544]: 2025-11-22 08:23:44.632 186548 DEBUG oslo_concurrency.lockutils [req-08b5169a-6ce7-45dc-845f-a42fb68d503f req-d76d242f-0012-4f26-bffe-9b0e3ad19fb7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:23:45 compute-0 nova_compute[186544]: 2025-11-22 08:23:45.338 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:46 compute-0 nova_compute[186544]: 2025-11-22 08:23:46.227 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:47 compute-0 podman[243573]: 2025-11-22 08:23:47.40245579 +0000 UTC m=+0.053865301 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:23:47 compute-0 podman[243574]: 2025-11-22 08:23:47.407847733 +0000 UTC m=+0.055215455 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Nov 22 08:23:50 compute-0 nova_compute[186544]: 2025-11-22 08:23:50.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:51 compute-0 nova_compute[186544]: 2025-11-22 08:23:51.226 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:53 compute-0 ovn_controller[94843]: 2025-11-22T08:23:53Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:92:d5:7b 10.100.0.9
Nov 22 08:23:53 compute-0 ovn_controller[94843]: 2025-11-22T08:23:53Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:92:d5:7b 10.100.0.9
Nov 22 08:23:55 compute-0 nova_compute[186544]: 2025-11-22 08:23:55.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:23:56 compute-0 nova_compute[186544]: 2025-11-22 08:23:56.228 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:00 compute-0 nova_compute[186544]: 2025-11-22 08:24:00.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:01 compute-0 nova_compute[186544]: 2025-11-22 08:24:01.229 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:04 compute-0 podman[243640]: 2025-11-22 08:24:04.412177577 +0000 UTC m=+0.049102544 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:24:04 compute-0 podman[243638]: 2025-11-22 08:24:04.423589088 +0000 UTC m=+0.065413097 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 08:24:04 compute-0 podman[243639]: 2025-11-22 08:24:04.446954105 +0000 UTC m=+0.084153580 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:24:04 compute-0 podman[243641]: 2025-11-22 08:24:04.454707297 +0000 UTC m=+0.087975554 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.197 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.197 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.294 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.345 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.359 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.360 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.420 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.568 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.570 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5493MB free_disk=73.10802459716797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.570 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.571 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.904 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 60603ba6-4ebe-48b0-94b1-5656695c9ded actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.904 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:24:05 compute-0 nova_compute[186544]: 2025-11-22 08:24:05.905 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:24:06 compute-0 nova_compute[186544]: 2025-11-22 08:24:06.030 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:24:06 compute-0 nova_compute[186544]: 2025-11-22 08:24:06.051 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:24:06 compute-0 nova_compute[186544]: 2025-11-22 08:24:06.082 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:24:06 compute-0 nova_compute[186544]: 2025-11-22 08:24:06.083 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:24:06 compute-0 nova_compute[186544]: 2025-11-22 08:24:06.231 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:24:07.604 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:24:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:24:07.605 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:24:07 compute-0 nova_compute[186544]: 2025-11-22 08:24:07.609 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:10 compute-0 nova_compute[186544]: 2025-11-22 08:24:10.347 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:10 compute-0 ovn_controller[94843]: 2025-11-22T08:24:10Z|00739|binding|INFO|Releasing lport e1bc69f6-ec55-4040-be0d-44f334cbe3a6 from this chassis (sb_readonly=0)
Nov 22 08:24:10 compute-0 nova_compute[186544]: 2025-11-22 08:24:10.636 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:11 compute-0 nova_compute[186544]: 2025-11-22 08:24:11.083 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:11 compute-0 nova_compute[186544]: 2025-11-22 08:24:11.083 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:24:11 compute-0 nova_compute[186544]: 2025-11-22 08:24:11.083 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:24:11 compute-0 nova_compute[186544]: 2025-11-22 08:24:11.233 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:11 compute-0 nova_compute[186544]: 2025-11-22 08:24:11.457 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:24:11 compute-0 nova_compute[186544]: 2025-11-22 08:24:11.458 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:24:11 compute-0 nova_compute[186544]: 2025-11-22 08:24:11.458 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:24:11 compute-0 nova_compute[186544]: 2025-11-22 08:24:11.458 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 60603ba6-4ebe-48b0-94b1-5656695c9ded obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:24:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:24:12.607 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:24:15 compute-0 nova_compute[186544]: 2025-11-22 08:24:15.350 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:15 compute-0 podman[243726]: 2025-11-22 08:24:15.426491263 +0000 UTC m=+0.072679907 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 08:24:15 compute-0 nova_compute[186544]: 2025-11-22 08:24:15.750 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updating instance_info_cache with network_info: [{"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:24:15 compute-0 nova_compute[186544]: 2025-11-22 08:24:15.769 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:24:15 compute-0 nova_compute[186544]: 2025-11-22 08:24:15.769 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:24:15 compute-0 nova_compute[186544]: 2025-11-22 08:24:15.770 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:15 compute-0 nova_compute[186544]: 2025-11-22 08:24:15.770 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:15 compute-0 nova_compute[186544]: 2025-11-22 08:24:15.771 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:15 compute-0 nova_compute[186544]: 2025-11-22 08:24:15.771 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:24:16 compute-0 nova_compute[186544]: 2025-11-22 08:24:16.236 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:17 compute-0 nova_compute[186544]: 2025-11-22 08:24:17.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:17 compute-0 nova_compute[186544]: 2025-11-22 08:24:17.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:18 compute-0 nova_compute[186544]: 2025-11-22 08:24:18.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:18 compute-0 podman[243747]: 2025-11-22 08:24:18.410769041 +0000 UTC m=+0.053926773 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:24:18 compute-0 podman[243748]: 2025-11-22 08:24:18.425384332 +0000 UTC m=+0.066029472 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64)
Nov 22 08:24:20 compute-0 nova_compute[186544]: 2025-11-22 08:24:20.352 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:21 compute-0 nova_compute[186544]: 2025-11-22 08:24:21.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:21 compute-0 nova_compute[186544]: 2025-11-22 08:24:21.238 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:24 compute-0 nova_compute[186544]: 2025-11-22 08:24:24.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:25 compute-0 nova_compute[186544]: 2025-11-22 08:24:25.353 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:26 compute-0 nova_compute[186544]: 2025-11-22 08:24:26.239 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:29 compute-0 nova_compute[186544]: 2025-11-22 08:24:29.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:29 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 22 08:24:30 compute-0 nova_compute[186544]: 2025-11-22 08:24:30.354 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:31 compute-0 nova_compute[186544]: 2025-11-22 08:24:31.242 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:33 compute-0 nova_compute[186544]: 2025-11-22 08:24:33.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:35 compute-0 nova_compute[186544]: 2025-11-22 08:24:35.355 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:35 compute-0 podman[243794]: 2025-11-22 08:24:35.415811684 +0000 UTC m=+0.054677241 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 22 08:24:35 compute-0 podman[243793]: 2025-11-22 08:24:35.424373126 +0000 UTC m=+0.064121685 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 08:24:35 compute-0 podman[243795]: 2025-11-22 08:24:35.431321647 +0000 UTC m=+0.058782853 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:24:35 compute-0 podman[243796]: 2025-11-22 08:24:35.490180511 +0000 UTC m=+0.119101943 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 08:24:36 compute-0 nova_compute[186544]: 2025-11-22 08:24:36.244 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.602 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'name': 'tempest-TestGettingAddress-server-454565421', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'user_id': '809b865601654264af5bff7f49127cea', 'hostId': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.603 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.614 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.615 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60db9f32-6912-4ccf-9e61-adefa9c4cc6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.603234', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae828a72-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.295029665, 'message_signature': '829499a64464f9a93bf1c0a1931d6f912357610cac4ac309a593c14ebcdf0e34'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 
'60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.603234', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae82a34a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.295029665, 'message_signature': 'd33405b9f6cfce958454f3561cc5279732d2c9d24ae8e26f25ae4791bcb1caf6'}]}, 'timestamp': '2025-11-22 08:24:36.615791', '_unique_id': 'f035a5fb7de34779b89de17234a34ffc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.617 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.619 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.622 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 60603ba6-4ebe-48b0-94b1-5656695c9ded / tap60cf0e4a-97 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.623 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07c6f822-8af1-422c-a3cf-9a7422392672', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.619890', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae83e174-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': '26dfb1998aec89271d034c9b953e0866c7acadd255276ca221ee5ca0d0359dc5'}]}, 'timestamp': '2025-11-22 08:24:36.623864', '_unique_id': 'dd4231dd9e1d49dc9b36af523f143644'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.625 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.626 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.626 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.incoming.bytes volume: 4247 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaa583bd-a040-4250-9eb3-92b3d6f3afaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4247, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.626536', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae845fdc-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': '7948cc82f211b99d81406b82501a68f4580b47e1b71d6d42afe71b32d622bbe3'}]}, 'timestamp': '2025-11-22 08:24:36.627088', '_unique_id': '2848ab38098b45a9a43e8ee0fca8538e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.628 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.629 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.629 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f0a6304-64a2-47d0-8ad2-df78b98fbf60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.629576', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae84d5c0-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': 'bcac640d2cbb1aba09fb2d7923fb491e5ceb58d9a730bfe02a22843bec56d6bf'}]}, 'timestamp': '2025-11-22 08:24:36.630126', '_unique_id': '358bb09e8f91449a82d674645d66e35c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.631 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.632 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.632 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61321d20-b1fa-4e35-ae08-8fec646068ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.632856', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae855590-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': 'ecab552f15f4077c64d6d62fa5afe96171f5681545a6e73dbb3bfed1977f5ba4'}]}, 'timestamp': '2025-11-22 08:24:36.633439', '_unique_id': 'decb8856ca044f4e8638a8654c96522a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.664 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.write.requests volume: 334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.665 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4eb4ffb4-b257-43c4-abfb-95afda348ff6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 334, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.635806', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae8a3362-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': 'cb2365860c0bcae2dfd848ad4075a00b3e365f65f0bdad973a3b03060f34ffd6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.635806', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae8a3fba-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': 'e1a790e88d4c02fa93a85a286c3b46a919f0243bee2a560dfcc27e204a631659'}]}, 'timestamp': '2025-11-22 08:24:36.665451', '_unique_id': '20d995b044c44dc794ef73499add8742'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19f2cd37-36d9-427a-a30e-b1471738e078', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.667167', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae8a8d26-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': '6a6b11b75dfd0d5fd10655a9f80f043028bca69a74b1d3fd2ff5c673b107c12c'}]}, 'timestamp': '2025-11-22 08:24:36.667438', '_unique_id': '435457897be947d9b6b77865071265eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.667 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.668 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.668 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.read.bytes volume: 31250944 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.668 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bf30913-b59d-451c-bdf4-60053deabaf9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31250944, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.668552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae8ac200-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': 'a20004d0222a005b13fd96b6c9e58e0ddfd00bc94ae6f0d0c3f71f373198edc7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.668552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae8aca52-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': 'b04c26ad32748d71e09de3438f64b9c8f1fd8f356a0eeca32097583d38261b1f'}]}, 'timestamp': '2025-11-22 08:24:36.668980', '_unique_id': '70ab940b1bb144269936902267ce427c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.670 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-454565421>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-454565421>]
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.670 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.670 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.outgoing.bytes volume: 3638 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '392be90f-8458-486f-ae9b-35fad32d2506', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3638, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.670449', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae8b0c56-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': '16ea3960b572e69df2b887f639b76acd89b2a1eb4ffd4a07f4b7220f6137e625'}]}, 'timestamp': '2025-11-22 08:24:36.670685', '_unique_id': '9e0f6d7e624d415eaa55bad86d28a376'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.671 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f74692fc-d3de-4a28-a0c7-7da35cd20774', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.671797', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae8b40fe-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': '1f864aec9c00e710ff2e99fe0cef588c89d697cb72dc5f14cf4bdbb64b6f4a78'}]}, 'timestamp': '2025-11-22 08:24:36.672059', '_unique_id': 'ee81dd375a874afb8459b5f3c4bb741e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.672 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.673 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.693 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/memory.usage volume: 42.70703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f591f68d-d558-48c8-a54b-156639c3148b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.70703125, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'timestamp': '2025-11-22T08:24:36.673106', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ae8e8b9c-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.384771762, 'message_signature': '9a53260ea395bcd642836a91705db739e760c6e1d84aec8d5e28cbacbf9f99cf'}]}, 'timestamp': '2025-11-22 08:24:36.693658', '_unique_id': '77fcc3e3ff4a48ed8fc3f1874a945d60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.695 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.read.requests volume: 1144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.695 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a144f647-6dbd-43d3-8367-5f62d767fb84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1144, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.695501', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae8ee038-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': '532d4c56e6f9e1f25c3489613148d2e7b4936f27aa51479e7f05f15ebb8d6263'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.695501', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae8ee948-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': '00537d91cfc1e7e5d31a012a2a5c347550cd56ad0e5dc02914f35fa8da2e583e'}]}, 'timestamp': '2025-11-22 08:24:36.696001', '_unique_id': '452f1bf14bf8434a941fe4ce62211a34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.697 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.697 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-454565421>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-454565421>]
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.697 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.697 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-454565421>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-454565421>]
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.697 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'abbd447e-076e-4738-aacb-cd8848c3f94c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.697978', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae8f3f9c-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.295029665, 'message_signature': 'fe9afaa67a31ccace29778eabf7b1d9c41e78ad579ee7079dac289a3c7b9f212'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.697978', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae8f4a46-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.295029665, 'message_signature': '276d9d32e1cacf8c6562004451ed5e8bada696bb44a9fef69c6e2491a9e7173a'}]}, 'timestamp': '2025-11-22 08:24:36.698474', '_unique_id': '1309428a243c4408964574977f971aac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.698 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.699 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.699 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.read.latency volume: 2831742378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.read.latency volume: 536171953 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d149908-cd90-490b-872c-489759bd3bc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2831742378, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.699796', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae8f865a-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': 'c4f222ebf9ca9f3aa817e06f54858d09f373dac5aa84d23cbc2cc9d07c146c80'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 536171953, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.699796', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae8f9136-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': '1c2442c9fe5983daefb294083db1b25a544829dbe2d43b01f51cbd652bb8d959'}]}, 'timestamp': '2025-11-22 08:24:36.700325', '_unique_id': '689b61979e234496a828e768fa1ba5e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.701 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95d9c41f-1e26-4878-981b-d0907bbe66aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.701459', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae8fc78c-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': 'aa123ca6b8a6dd00583d0037446eb6867a35d68f1c23622c518990574c43b87f'}]}, 'timestamp': '2025-11-22 08:24:36.701691', '_unique_id': '739ae8e9b8014064b8e101aa0315e6da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.702 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-454565421>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-454565421>]
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.703 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.write.latency volume: 58855866657 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.703 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c60db753-204b-4b25-82fd-b1a45575d731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 58855866657, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.703080', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae900666-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': '9b7cafc7fc01534e08f281b144e258aa7433dccedd495f9c0c272285923670f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 
'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.703080', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae9010de-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': 'f72aad42f92e5f06049044c852d416cb90bb8eb81af7875235b39c4dd3a0a71b'}]}, 'timestamp': '2025-11-22 08:24:36.703588', '_unique_id': '0e665b5acd6841aabc294826307f093e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.704 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b572a106-2bdc-4f85-aa66-3644209cb4dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.704791', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae904b08-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': '79c9d2a427f426db3e36cf95c286ac9a1fdf56aa41e96e86b13ba7875ef332b4'}]}, 'timestamp': '2025-11-22 08:24:36.705077', '_unique_id': '78c5b558beff43159c3cd32784825a31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.705 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.706 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.706 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.write.bytes volume: 72970240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.706 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a5a0a5-c588-4c25-9b46-1a29197d73d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72970240, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.706196', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae908258-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': '814dc477f31bbad100f70f8908a76c173d98898a2dcbaf88272d64e61e821aef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.706196', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae908cc6-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.32763659, 'message_signature': '027e4903279d1f9c5ad6d237ee9ec4580f66e5bc8802a8497480c1da7ee0eb55'}]}, 'timestamp': '2025-11-22 08:24:36.706731', '_unique_id': 'c557e6b8fccd437caf46cea798ee554f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.707 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/network.outgoing.packets volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25982817-72f8-45ef-8b09-cd648ba5e8bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 30, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': 'instance-0000009e-60603ba6-4ebe-48b0-94b1-5656695c9ded-tap60cf0e4a-97', 'timestamp': '2025-11-22T08:24:36.707882', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'tap60cf0e4a-97', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:92:d5:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap60cf0e4a-97'}, 'message_id': 'ae90c420-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.311724998, 'message_signature': 'e06abf380acd5f42f8fe970482b0e25354caf0b3a0e3b7f105064469a09d1964'}]}, 'timestamp': '2025-11-22 08:24:36.708196', '_unique_id': 'bec5b8bc3b9d4995b592bb457a050a6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.708 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.709 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.709 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/cpu volume: 14490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95223f43-cf4a-4e74-aec6-4ae09f214c93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14490000000, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'timestamp': '2025-11-22T08:24:36.709349', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ae90fcb0-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.384771762, 'message_signature': 'fc8a03c05adb39a5a6074be0d8b6d1d16932b2812fcba65d2fc0da3d0f5d1c24'}]}, 'timestamp': '2025-11-22 08:24:36.709633', '_unique_id': '48c7fef2a08d4e0b80df0b0c84111b61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.710 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 DEBUG ceilometer.compute.pollsters [-] 60603ba6-4ebe-48b0-94b1-5656695c9ded/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59a115c6-98a3-4874-a670-37f85d16385a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-vda', 'timestamp': '2025-11-22T08:24:36.710722', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ae91323e-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.295029665, 'message_signature': 'db1f468ee3182b7a90e2cebf7fb99b80d3a5922d462c0b87d76f619dc83ac8bb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '809b865601654264af5bff7f49127cea', 'user_name': None, 'project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'project_name': None, 'resource_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded-sda', 'timestamp': '2025-11-22T08:24:36.710722', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-454565421', 'name': 'instance-0000009e', 'instance_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'instance_type': 'm1.nano', 'host': '5c19ece3d34525b28ab04b416bbc791a2101edc4df56e429a7c03bbb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ae913c66-c77c-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 6617.295029665, 'message_signature': 'dd5eee7f686a4c139c4b6437336cb1c2cd70c6d0cf4ea91dda4bd1d2dfbdcbf7'}]}, 'timestamp': '2025-11-22 08:24:36.711225', '_unique_id': 'ff9c506c8e8447aab60b38cc31bf56da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:24:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:24:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:24:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:24:37.349 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:24:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:24:37.350 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:24:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:24:37.351 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:24:39 compute-0 nova_compute[186544]: 2025-11-22 08:24:39.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:40 compute-0 nova_compute[186544]: 2025-11-22 08:24:40.357 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:41 compute-0 nova_compute[186544]: 2025-11-22 08:24:41.246 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:44 compute-0 nova_compute[186544]: 2025-11-22 08:24:44.177 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:44 compute-0 nova_compute[186544]: 2025-11-22 08:24:44.178 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:24:44 compute-0 nova_compute[186544]: 2025-11-22 08:24:44.194 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:24:45 compute-0 nova_compute[186544]: 2025-11-22 08:24:45.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:24:45 compute-0 nova_compute[186544]: 2025-11-22 08:24:45.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:24:45 compute-0 nova_compute[186544]: 2025-11-22 08:24:45.359 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:46 compute-0 nova_compute[186544]: 2025-11-22 08:24:46.249 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:46 compute-0 podman[243879]: 2025-11-22 08:24:46.4145647 +0000 UTC m=+0.063454798 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Nov 22 08:24:49 compute-0 podman[243899]: 2025-11-22 08:24:49.411672886 +0000 UTC m=+0.059640705 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:24:49 compute-0 podman[243900]: 2025-11-22 08:24:49.4175278 +0000 UTC m=+0.060749881 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 08:24:50 compute-0 nova_compute[186544]: 2025-11-22 08:24:50.361 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:51 compute-0 nova_compute[186544]: 2025-11-22 08:24:51.251 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:55 compute-0 nova_compute[186544]: 2025-11-22 08:24:55.363 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:56 compute-0 nova_compute[186544]: 2025-11-22 08:24:56.252 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:56 compute-0 nova_compute[186544]: 2025-11-22 08:24:56.855 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:24:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:24:56.856 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:24:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:24:56.857 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:25:00 compute-0 nova_compute[186544]: 2025-11-22 08:25:00.365 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.254 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.593 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "60603ba6-4ebe-48b0-94b1-5656695c9ded" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.593 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.593 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.594 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.594 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.601 186548 INFO nova.compute.manager [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Terminating instance
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.615 186548 DEBUG nova.compute.manager [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:25:01 compute-0 kernel: tap60cf0e4a-97 (unregistering): left promiscuous mode
Nov 22 08:25:01 compute-0 NetworkManager[55036]: <info>  [1763799901.6425] device (tap60cf0e4a-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.651 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 ovn_controller[94843]: 2025-11-22T08:25:01Z|00740|binding|INFO|Releasing lport 60cf0e4a-972c-4c7e-ab24-65a810d650bb from this chassis (sb_readonly=0)
Nov 22 08:25:01 compute-0 ovn_controller[94843]: 2025-11-22T08:25:01Z|00741|binding|INFO|Setting lport 60cf0e4a-972c-4c7e-ab24-65a810d650bb down in Southbound
Nov 22 08:25:01 compute-0 ovn_controller[94843]: 2025-11-22T08:25:01Z|00742|binding|INFO|Removing iface tap60cf0e4a-97 ovn-installed in OVS
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.656 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.672 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Nov 22 08:25:01 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d0000009e.scope: Consumed 19.027s CPU time.
Nov 22 08:25:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:01.700 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:d5:7b 10.100.0.9 2001:db8::f816:3eff:fe92:d57b'], port_security=['fa:16:3e:92:d5:7b 10.100.0.9 2001:db8::f816:3eff:fe92:d57b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8::f816:3eff:fe92:d57b/64', 'neutron:device_id': '60603ba6-4ebe-48b0-94b1-5656695c9ded', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326c0814-77d4-416b-a5a1-28be00b61ecd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4200f1d1fbb44a5aaf5e3578f6354ae', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a58d74bd-bc51-4723-b0e3-c855953c364c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c110aad-90e5-4caa-b631-3c18861eaadf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=60cf0e4a-972c-4c7e-ab24-65a810d650bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:25:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:01.701 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 60cf0e4a-972c-4c7e-ab24-65a810d650bb in datapath 326c0814-77d4-416b-a5a1-28be00b61ecd unbound from our chassis
Nov 22 08:25:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:01.702 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 326c0814-77d4-416b-a5a1-28be00b61ecd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:25:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:01.704 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2d01334c-1a29-4a3f-9fca-9c111431afb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:25:01 compute-0 systemd-machined[152872]: Machine qemu-84-instance-0000009e terminated.
Nov 22 08:25:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:01.704 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd namespace which is not needed anymore
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.836 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.840 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.884 186548 INFO nova.virt.libvirt.driver [-] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Instance destroyed successfully.
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.884 186548 DEBUG nova.objects.instance [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lazy-loading 'resources' on Instance uuid 60603ba6-4ebe-48b0-94b1-5656695c9ded obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:25:01 compute-0 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[243537]: [NOTICE]   (243541) : haproxy version is 2.8.14-c23fe91
Nov 22 08:25:01 compute-0 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[243537]: [NOTICE]   (243541) : path to executable is /usr/sbin/haproxy
Nov 22 08:25:01 compute-0 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[243537]: [WARNING]  (243541) : Exiting Master process...
Nov 22 08:25:01 compute-0 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[243537]: [ALERT]    (243541) : Current worker (243543) exited with code 143 (Terminated)
Nov 22 08:25:01 compute-0 neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd[243537]: [WARNING]  (243541) : All workers exited. Exiting... (0)
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.903 186548 DEBUG nova.virt.libvirt.vif [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:23:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-454565421',display_name='tempest-TestGettingAddress-server-454565421',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-454565421',id=158,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDo9IOjv1g7RgDVY2BCJ+mmy9Na2DHnkZ9mE8BRZMtj9iwqLsOttn1FTaKP3MHHav783x9ncOoaXWlLdYlMz7ANRW+4cS5nXI1iryizo5WsS7HuJMsWvhJA60lnbjcIyew==',key_name='tempest-TestGettingAddress-814265221',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:23:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4200f1d1fbb44a5aaf5e3578f6354ae',ramdisk_id='',reservation_id='r-owca6uo7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-25838038',owner_user_name='tempest-TestGettingAddress-25838038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:23:36Z,user_data=None,user_id='809b865601654264af5bff7f49127cea',uuid=60603ba6-4ebe-48b0-94b1-5656695c9ded,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.904 186548 DEBUG nova.network.os_vif_util [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converting VIF {"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:25:01 compute-0 systemd[1]: libpod-f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829.scope: Deactivated successfully.
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.905 186548 DEBUG nova.network.os_vif_util [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:d5:7b,bridge_name='br-int',has_traffic_filtering=True,id=60cf0e4a-972c-4c7e-ab24-65a810d650bb,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cf0e4a-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.905 186548 DEBUG os_vif [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:d5:7b,bridge_name='br-int',has_traffic_filtering=True,id=60cf0e4a-972c-4c7e-ab24-65a810d650bb,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cf0e4a-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.907 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.908 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60cf0e4a-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.909 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.912 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:25:01 compute-0 podman[243969]: 2025-11-22 08:25:01.912693916 +0000 UTC m=+0.122601419 container died f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.914 186548 INFO os_vif [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:d5:7b,bridge_name='br-int',has_traffic_filtering=True,id=60cf0e4a-972c-4c7e-ab24-65a810d650bb,network=Network(326c0814-77d4-416b-a5a1-28be00b61ecd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60cf0e4a-97')
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.915 186548 INFO nova.virt.libvirt.driver [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Deleting instance files /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded_del
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.916 186548 INFO nova.virt.libvirt.driver [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Deletion of /var/lib/nova/instances/60603ba6-4ebe-48b0-94b1-5656695c9ded_del complete
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.934 186548 DEBUG nova.compute.manager [req-6ef3f196-4bb6-46cf-ac09-44efb775a4f6 req-84ca2a08-4dba-4dba-a6f3-728217adced7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-vif-unplugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.934 186548 DEBUG oslo_concurrency.lockutils [req-6ef3f196-4bb6-46cf-ac09-44efb775a4f6 req-84ca2a08-4dba-4dba-a6f3-728217adced7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.935 186548 DEBUG oslo_concurrency.lockutils [req-6ef3f196-4bb6-46cf-ac09-44efb775a4f6 req-84ca2a08-4dba-4dba-a6f3-728217adced7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.935 186548 DEBUG oslo_concurrency.lockutils [req-6ef3f196-4bb6-46cf-ac09-44efb775a4f6 req-84ca2a08-4dba-4dba-a6f3-728217adced7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.935 186548 DEBUG nova.compute.manager [req-6ef3f196-4bb6-46cf-ac09-44efb775a4f6 req-84ca2a08-4dba-4dba-a6f3-728217adced7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] No waiting events found dispatching network-vif-unplugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:25:01 compute-0 nova_compute[186544]: 2025-11-22 08:25:01.935 186548 DEBUG nova.compute.manager [req-6ef3f196-4bb6-46cf-ac09-44efb775a4f6 req-84ca2a08-4dba-4dba-a6f3-728217adced7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-vif-unplugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:25:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-db34ff5b1144b85d571e2071fae137b4bb682833cde66e4de5960f06c13d1f71-merged.mount: Deactivated successfully.
Nov 22 08:25:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829-userdata-shm.mount: Deactivated successfully.
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.034 186548 DEBUG nova.compute.manager [req-d30cfb81-83a4-4a99-8ead-2e7270cfd5f2 req-1f16ead0-6d83-4122-b6bd-e6aa62e43d88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-changed-60cf0e4a-972c-4c7e-ab24-65a810d650bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.035 186548 DEBUG nova.compute.manager [req-d30cfb81-83a4-4a99-8ead-2e7270cfd5f2 req-1f16ead0-6d83-4122-b6bd-e6aa62e43d88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Refreshing instance network info cache due to event network-changed-60cf0e4a-972c-4c7e-ab24-65a810d650bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.035 186548 DEBUG oslo_concurrency.lockutils [req-d30cfb81-83a4-4a99-8ead-2e7270cfd5f2 req-1f16ead0-6d83-4122-b6bd-e6aa62e43d88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.035 186548 DEBUG oslo_concurrency.lockutils [req-d30cfb81-83a4-4a99-8ead-2e7270cfd5f2 req-1f16ead0-6d83-4122-b6bd-e6aa62e43d88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.036 186548 DEBUG nova.network.neutron [req-d30cfb81-83a4-4a99-8ead-2e7270cfd5f2 req-1f16ead0-6d83-4122-b6bd-e6aa62e43d88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Refreshing network info cache for port 60cf0e4a-972c-4c7e-ab24-65a810d650bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.042 186548 INFO nova.compute.manager [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Took 0.43 seconds to destroy the instance on the hypervisor.
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.043 186548 DEBUG oslo.service.loopingcall [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.043 186548 DEBUG nova.compute.manager [-] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.044 186548 DEBUG nova.network.neutron [-] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:25:02 compute-0 podman[243969]: 2025-11-22 08:25:02.200511555 +0000 UTC m=+0.410419038 container cleanup f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:25:02 compute-0 systemd[1]: libpod-conmon-f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829.scope: Deactivated successfully.
Nov 22 08:25:02 compute-0 podman[244016]: 2025-11-22 08:25:02.409569709 +0000 UTC m=+0.186407185 container remove f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.416 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6a08b30a-c6d1-43fc-9637-b7e69656b745]: (4, ('Sat Nov 22 08:25:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd (f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829)\nf95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829\nSat Nov 22 08:25:02 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd (f95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829)\nf95104086a86e4dd31f0e42e96bcd07c5c0d2375f8c670c8241679ede0363829\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.418 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fd7a8b7c-e358-47d7-9587-da82aa6f0db5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.419 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap326c0814-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.421 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:02 compute-0 kernel: tap326c0814-70: left promiscuous mode
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.425 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[92cdc677-8948-4ce6-9ec7-4a24919c242b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:25:02 compute-0 nova_compute[186544]: 2025-11-22 08:25:02.436 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.444 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[87e2bb98-643d-4675-bd80-d6e9af134e40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.446 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[978a7d87-2686-40de-8066-afb8faf9a249]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.465 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8076df-524d-48c1-aad4-aceb6adedbe8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655676, 'reachable_time': 29243, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244030, 'error': None, 'target': 'ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.468 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-326c0814-77d4-416b-a5a1-28be00b61ecd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:25:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:02.468 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[80c43b7a-8cb0-46c5-8f14-427f5d87e3e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:25:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d326c0814\x2d77d4\x2d416b\x2da5a1\x2d28be00b61ecd.mount: Deactivated successfully.
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.279 186548 DEBUG nova.network.neutron [-] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.329 186548 INFO nova.compute.manager [-] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Took 1.28 seconds to deallocate network for instance.
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.349 186548 DEBUG nova.compute.manager [req-f0a131b0-f272-4844-980a-2cc6d50c03f2 req-b26caaaf-bf5b-49b1-a6d7-8ea9896d55d9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-vif-deleted-60cf0e4a-972c-4c7e-ab24-65a810d650bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.477 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.477 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.620 186548 DEBUG nova.compute.provider_tree [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.636 186548 DEBUG nova.scheduler.client.report [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.670 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.739 186548 DEBUG nova.network.neutron [req-d30cfb81-83a4-4a99-8ead-2e7270cfd5f2 req-1f16ead0-6d83-4122-b6bd-e6aa62e43d88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updated VIF entry in instance network info cache for port 60cf0e4a-972c-4c7e-ab24-65a810d650bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.740 186548 DEBUG nova.network.neutron [req-d30cfb81-83a4-4a99-8ead-2e7270cfd5f2 req-1f16ead0-6d83-4122-b6bd-e6aa62e43d88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Updating instance_info_cache with network_info: [{"id": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "address": "fa:16:3e:92:d5:7b", "network": {"id": "326c0814-77d4-416b-a5a1-28be00b61ecd", "bridge": "br-int", "label": "tempest-network-smoke--641682246", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe92:d57b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4200f1d1fbb44a5aaf5e3578f6354ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60cf0e4a-97", "ovs_interfaceid": "60cf0e4a-972c-4c7e-ab24-65a810d650bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.763 186548 INFO nova.scheduler.client.report [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Deleted allocations for instance 60603ba6-4ebe-48b0-94b1-5656695c9ded
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.788 186548 DEBUG oslo_concurrency.lockutils [req-d30cfb81-83a4-4a99-8ead-2e7270cfd5f2 req-1f16ead0-6d83-4122-b6bd-e6aa62e43d88 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-60603ba6-4ebe-48b0-94b1-5656695c9ded" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:25:03 compute-0 nova_compute[186544]: 2025-11-22 08:25:03.955 186548 DEBUG oslo_concurrency.lockutils [None req-043ca72b-2f5a-4537-b9cd-93ad2c6ce86c 809b865601654264af5bff7f49127cea c4200f1d1fbb44a5aaf5e3578f6354ae - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:25:04 compute-0 nova_compute[186544]: 2025-11-22 08:25:04.005 186548 DEBUG nova.compute.manager [req-76d4d9ac-08f3-4df8-befa-5803a898098e req-43f889ad-7131-4b11-ac53-897f5445fc2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received event network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:25:04 compute-0 nova_compute[186544]: 2025-11-22 08:25:04.005 186548 DEBUG oslo_concurrency.lockutils [req-76d4d9ac-08f3-4df8-befa-5803a898098e req-43f889ad-7131-4b11-ac53-897f5445fc2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:25:04 compute-0 nova_compute[186544]: 2025-11-22 08:25:04.005 186548 DEBUG oslo_concurrency.lockutils [req-76d4d9ac-08f3-4df8-befa-5803a898098e req-43f889ad-7131-4b11-ac53-897f5445fc2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:25:04 compute-0 nova_compute[186544]: 2025-11-22 08:25:04.006 186548 DEBUG oslo_concurrency.lockutils [req-76d4d9ac-08f3-4df8-befa-5803a898098e req-43f889ad-7131-4b11-ac53-897f5445fc2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "60603ba6-4ebe-48b0-94b1-5656695c9ded-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:25:04 compute-0 nova_compute[186544]: 2025-11-22 08:25:04.006 186548 DEBUG nova.compute.manager [req-76d4d9ac-08f3-4df8-befa-5803a898098e req-43f889ad-7131-4b11-ac53-897f5445fc2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] No waiting events found dispatching network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:25:04 compute-0 nova_compute[186544]: 2025-11-22 08:25:04.006 186548 WARNING nova.compute.manager [req-76d4d9ac-08f3-4df8-befa-5803a898098e req-43f889ad-7131-4b11-ac53-897f5445fc2b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Received unexpected event network-vif-plugged-60cf0e4a-972c-4c7e-ab24-65a810d650bb for instance with vm_state deleted and task_state None.
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.181 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.206 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.207 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.207 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.207 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.393 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.394 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5723MB free_disk=73.13666152954102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.395 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.395 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.454 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.455 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.481 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.494 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.533 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:25:05 compute-0 nova_compute[186544]: 2025-11-22 08:25:05.534 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:25:06 compute-0 nova_compute[186544]: 2025-11-22 08:25:06.256 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:06 compute-0 podman[244033]: 2025-11-22 08:25:06.415172562 +0000 UTC m=+0.058823814 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:25:06 compute-0 podman[244034]: 2025-11-22 08:25:06.415536041 +0000 UTC m=+0.055449970 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:25:06 compute-0 podman[244032]: 2025-11-22 08:25:06.446236169 +0000 UTC m=+0.092159977 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 08:25:06 compute-0 podman[244035]: 2025-11-22 08:25:06.478630559 +0000 UTC m=+0.115620956 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 08:25:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:06.859 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:25:06 compute-0 nova_compute[186544]: 2025-11-22 08:25:06.910 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:09 compute-0 nova_compute[186544]: 2025-11-22 08:25:09.496 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:10 compute-0 nova_compute[186544]: 2025-11-22 08:25:10.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:10 compute-0 nova_compute[186544]: 2025-11-22 08:25:10.580 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.174 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.175 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.175 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.175 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.258 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:11 compute-0 nova_compute[186544]: 2025-11-22 08:25:11.913 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:15 compute-0 nova_compute[186544]: 2025-11-22 08:25:15.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:15 compute-0 nova_compute[186544]: 2025-11-22 08:25:15.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:16 compute-0 nova_compute[186544]: 2025-11-22 08:25:16.260 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:16 compute-0 nova_compute[186544]: 2025-11-22 08:25:16.882 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763799901.8809896, 60603ba6-4ebe-48b0-94b1-5656695c9ded => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:25:16 compute-0 nova_compute[186544]: 2025-11-22 08:25:16.883 186548 INFO nova.compute.manager [-] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] VM Stopped (Lifecycle Event)
Nov 22 08:25:16 compute-0 nova_compute[186544]: 2025-11-22 08:25:16.900 186548 DEBUG nova.compute.manager [None req-caf17053-fcb5-49ef-bf30-7eaa16115701 - - - - - -] [instance: 60603ba6-4ebe-48b0-94b1-5656695c9ded] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:25:16 compute-0 nova_compute[186544]: 2025-11-22 08:25:16.916 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:17 compute-0 podman[244115]: 2025-11-22 08:25:17.403108019 +0000 UTC m=+0.052372895 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 08:25:18 compute-0 nova_compute[186544]: 2025-11-22 08:25:18.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:18 compute-0 nova_compute[186544]: 2025-11-22 08:25:18.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:20 compute-0 podman[244135]: 2025-11-22 08:25:20.409096353 +0000 UTC m=+0.056546818 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:25:20 compute-0 podman[244136]: 2025-11-22 08:25:20.418988228 +0000 UTC m=+0.064662098 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:25:21 compute-0 nova_compute[186544]: 2025-11-22 08:25:21.261 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:21 compute-0 nova_compute[186544]: 2025-11-22 08:25:21.918 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:23 compute-0 nova_compute[186544]: 2025-11-22 08:25:23.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:25:26 compute-0 nova_compute[186544]: 2025-11-22 08:25:26.263 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:26 compute-0 nova_compute[186544]: 2025-11-22 08:25:26.920 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:31 compute-0 nova_compute[186544]: 2025-11-22 08:25:31.265 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:31 compute-0 nova_compute[186544]: 2025-11-22 08:25:31.923 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:36 compute-0 nova_compute[186544]: 2025-11-22 08:25:36.268 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:36 compute-0 nova_compute[186544]: 2025-11-22 08:25:36.926 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:37.350 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:37.350 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:25:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:37.350 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:25:37 compute-0 podman[244179]: 2025-11-22 08:25:37.424203482 +0000 UTC m=+0.061713269 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:25:37 compute-0 podman[244180]: 2025-11-22 08:25:37.436171207 +0000 UTC m=+0.071462659 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:25:37 compute-0 podman[244181]: 2025-11-22 08:25:37.44726316 +0000 UTC m=+0.077664883 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Nov 22 08:25:37 compute-0 podman[244178]: 2025-11-22 08:25:37.447339442 +0000 UTC m=+0.086777996 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 08:25:41 compute-0 nova_compute[186544]: 2025-11-22 08:25:41.269 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:41 compute-0 nova_compute[186544]: 2025-11-22 08:25:41.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:46 compute-0 nova_compute[186544]: 2025-11-22 08:25:46.271 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:46 compute-0 nova_compute[186544]: 2025-11-22 08:25:46.930 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:47.893 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:25:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:47.894 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:25:47 compute-0 nova_compute[186544]: 2025-11-22 08:25:47.894 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:48 compute-0 podman[244262]: 2025-11-22 08:25:48.418215656 +0000 UTC m=+0.067755689 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 08:25:51 compute-0 nova_compute[186544]: 2025-11-22 08:25:51.273 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:51 compute-0 podman[244283]: 2025-11-22 08:25:51.399142409 +0000 UTC m=+0.049374556 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:25:51 compute-0 podman[244284]: 2025-11-22 08:25:51.412088707 +0000 UTC m=+0.056609224 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, architecture=x86_64, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Nov 22 08:25:51 compute-0 nova_compute[186544]: 2025-11-22 08:25:51.932 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:25:55.896 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:25:56 compute-0 nova_compute[186544]: 2025-11-22 08:25:56.276 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:25:56 compute-0 nova_compute[186544]: 2025-11-22 08:25:56.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:01 compute-0 nova_compute[186544]: 2025-11-22 08:26:01.278 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:01 compute-0 nova_compute[186544]: 2025-11-22 08:26:01.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:06 compute-0 nova_compute[186544]: 2025-11-22 08:26:06.279 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:06 compute-0 nova_compute[186544]: 2025-11-22 08:26:06.939 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.193 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.370 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.371 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5732MB free_disk=73.13667678833008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.372 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.372 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.513 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.513 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.532 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.557 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.557 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.593 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.612 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.633 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.644 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.645 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:26:07 compute-0 nova_compute[186544]: 2025-11-22 08:26:07.646 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:08 compute-0 podman[244328]: 2025-11-22 08:26:08.401325007 +0000 UTC m=+0.050182016 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:26:08 compute-0 podman[244327]: 2025-11-22 08:26:08.431425208 +0000 UTC m=+0.084000048 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 08:26:08 compute-0 podman[244329]: 2025-11-22 08:26:08.431594272 +0000 UTC m=+0.078598135 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:26:08 compute-0 podman[244330]: 2025-11-22 08:26:08.439438315 +0000 UTC m=+0.082525332 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:26:11 compute-0 nova_compute[186544]: 2025-11-22 08:26:11.280 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:11 compute-0 nova_compute[186544]: 2025-11-22 08:26:11.646 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:11 compute-0 nova_compute[186544]: 2025-11-22 08:26:11.647 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:26:11 compute-0 nova_compute[186544]: 2025-11-22 08:26:11.940 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:12 compute-0 nova_compute[186544]: 2025-11-22 08:26:12.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:12 compute-0 nova_compute[186544]: 2025-11-22 08:26:12.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:26:12 compute-0 nova_compute[186544]: 2025-11-22 08:26:12.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:26:12 compute-0 nova_compute[186544]: 2025-11-22 08:26:12.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:26:12 compute-0 nova_compute[186544]: 2025-11-22 08:26:12.178 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:15 compute-0 nova_compute[186544]: 2025-11-22 08:26:15.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:15 compute-0 nova_compute[186544]: 2025-11-22 08:26:15.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:16 compute-0 nova_compute[186544]: 2025-11-22 08:26:16.283 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:16 compute-0 nova_compute[186544]: 2025-11-22 08:26:16.943 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:19 compute-0 nova_compute[186544]: 2025-11-22 08:26:19.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:19 compute-0 podman[244412]: 2025-11-22 08:26:19.4030681 +0000 UTC m=+0.057361152 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:26:20 compute-0 nova_compute[186544]: 2025-11-22 08:26:20.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:21 compute-0 nova_compute[186544]: 2025-11-22 08:26:21.285 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:21 compute-0 nova_compute[186544]: 2025-11-22 08:26:21.944 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:22 compute-0 podman[244433]: 2025-11-22 08:26:22.402082959 +0000 UTC m=+0.051590171 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public)
Nov 22 08:26:22 compute-0 podman[244432]: 2025-11-22 08:26:22.423611438 +0000 UTC m=+0.076002901 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:26:23 compute-0 nova_compute[186544]: 2025-11-22 08:26:23.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:26 compute-0 nova_compute[186544]: 2025-11-22 08:26:26.286 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:26 compute-0 nova_compute[186544]: 2025-11-22 08:26:26.946 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:31 compute-0 nova_compute[186544]: 2025-11-22 08:26:31.287 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:31 compute-0 nova_compute[186544]: 2025-11-22 08:26:31.948 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:33 compute-0 nova_compute[186544]: 2025-11-22 08:26:33.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:26:34 compute-0 ovn_controller[94843]: 2025-11-22T08:26:34Z|00743|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 22 08:26:36 compute-0 nova_compute[186544]: 2025-11-22 08:26:36.289 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:26:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:26:36 compute-0 nova_compute[186544]: 2025-11-22 08:26:36.951 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:37.351 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:37.351 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:37.351 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:39 compute-0 podman[244477]: 2025-11-22 08:26:39.413644768 +0000 UTC m=+0.049240503 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:26:39 compute-0 podman[244475]: 2025-11-22 08:26:39.414095378 +0000 UTC m=+0.056345267 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 08:26:39 compute-0 podman[244476]: 2025-11-22 08:26:39.43610434 +0000 UTC m=+0.075315855 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 08:26:39 compute-0 podman[244479]: 2025-11-22 08:26:39.463197587 +0000 UTC m=+0.089498194 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.015 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.016 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.041 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.172 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.172 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.180 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.180 186548 INFO nova.compute.claims [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.350 186548 DEBUG nova.compute.provider_tree [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.367 186548 DEBUG nova.scheduler.client.report [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.439 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.440 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.556 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.556 186548 DEBUG nova.network.neutron [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.599 186548 INFO nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.678 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.800 186548 DEBUG nova.policy [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.846 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.848 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.848 186548 INFO nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Creating image(s)
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.849 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.849 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.850 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.862 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.918 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.919 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.919 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.930 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.983 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:26:40 compute-0 nova_compute[186544]: 2025-11-22 08:26:40.984 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.155 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk 1073741824" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.156 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.157 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.231 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.233 186548 DEBUG nova.virt.disk.api [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.234 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.292 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.295 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.295 186548 DEBUG nova.virt.disk.api [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.296 186548 DEBUG nova.objects.instance [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.322 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.322 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Ensure instance console log exists: /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.323 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.323 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.323 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:41 compute-0 nova_compute[186544]: 2025-11-22 08:26:41.954 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:42 compute-0 nova_compute[186544]: 2025-11-22 08:26:42.618 186548 DEBUG nova.network.neutron [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Successfully created port: 9791d20b-bf49-4625-a399-3764afd01f2b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:26:45 compute-0 nova_compute[186544]: 2025-11-22 08:26:45.925 186548 DEBUG nova.network.neutron [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Successfully updated port: 9791d20b-bf49-4625-a399-3764afd01f2b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:26:45 compute-0 nova_compute[186544]: 2025-11-22 08:26:45.942 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:26:45 compute-0 nova_compute[186544]: 2025-11-22 08:26:45.943 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:26:45 compute-0 nova_compute[186544]: 2025-11-22 08:26:45.943 186548 DEBUG nova.network.neutron [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:26:46 compute-0 nova_compute[186544]: 2025-11-22 08:26:46.092 186548 DEBUG nova.compute.manager [req-46456b3a-3629-4247-8411-10ed92d1741c req-6b9bf06a-aa96-4e47-b54e-af5815834125 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-changed-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:26:46 compute-0 nova_compute[186544]: 2025-11-22 08:26:46.093 186548 DEBUG nova.compute.manager [req-46456b3a-3629-4247-8411-10ed92d1741c req-6b9bf06a-aa96-4e47-b54e-af5815834125 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Refreshing instance network info cache due to event network-changed-9791d20b-bf49-4625-a399-3764afd01f2b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:26:46 compute-0 nova_compute[186544]: 2025-11-22 08:26:46.093 186548 DEBUG oslo_concurrency.lockutils [req-46456b3a-3629-4247-8411-10ed92d1741c req-6b9bf06a-aa96-4e47-b54e-af5815834125 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:26:46 compute-0 nova_compute[186544]: 2025-11-22 08:26:46.213 186548 DEBUG nova.network.neutron [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:26:46 compute-0 nova_compute[186544]: 2025-11-22 08:26:46.293 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:46 compute-0 nova_compute[186544]: 2025-11-22 08:26:46.955 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.599 186548 DEBUG nova.network.neutron [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updating instance_info_cache with network_info: [{"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.622 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.623 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance network_info: |[{"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.624 186548 DEBUG oslo_concurrency.lockutils [req-46456b3a-3629-4247-8411-10ed92d1741c req-6b9bf06a-aa96-4e47-b54e-af5815834125 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.624 186548 DEBUG nova.network.neutron [req-46456b3a-3629-4247-8411-10ed92d1741c req-6b9bf06a-aa96-4e47-b54e-af5815834125 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Refreshing network info cache for port 9791d20b-bf49-4625-a399-3764afd01f2b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.627 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Start _get_guest_xml network_info=[{"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.634 186548 WARNING nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.646 186548 DEBUG nova.virt.libvirt.host [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.647 186548 DEBUG nova.virt.libvirt.host [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.653 186548 DEBUG nova.virt.libvirt.host [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.654 186548 DEBUG nova.virt.libvirt.host [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.656 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.656 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.657 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.657 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.657 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.657 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.658 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.658 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.658 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.658 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.659 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.659 186548 DEBUG nova.virt.hardware [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.663 186548 DEBUG nova.virt.libvirt.vif [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:26:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-415050896',display_name='tempest-TestNetworkAdvancedServerOps-server-415050896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-415050896',id=161,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE10HET3xVG9kYYEFJbNoeXgZQQGPJCraywaCGccNjoZ4Ok7GHB8Zbg7DUabYTyv6kCKJW3QTAclno8OGt86G2miChntJewa0KwQEOoXH+w0t853r2FKBAA8hWK6fNplag==',key_name='tempest-TestNetworkAdvancedServerOps-1677817121',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-ic6oxu06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:26:40Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1fd3f032-9ff9-42af-aa81-ca9677695572,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.664 186548 DEBUG nova.network.os_vif_util [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.665 186548 DEBUG nova.network.os_vif_util [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.666 186548 DEBUG nova.objects.instance [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.687 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <uuid>1fd3f032-9ff9-42af-aa81-ca9677695572</uuid>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <name>instance-000000a1</name>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-415050896</nova:name>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:26:48</nova:creationTime>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:26:48 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:26:48 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:26:48 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:26:48 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:26:48 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:26:48 compute-0 nova_compute[186544]:         <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 08:26:48 compute-0 nova_compute[186544]:         <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:26:48 compute-0 nova_compute[186544]:         <nova:port uuid="9791d20b-bf49-4625-a399-3764afd01f2b">
Nov 22 08:26:48 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <system>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <entry name="serial">1fd3f032-9ff9-42af-aa81-ca9677695572</entry>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <entry name="uuid">1fd3f032-9ff9-42af-aa81-ca9677695572</entry>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </system>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <os>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   </os>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <features>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   </features>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.config"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:cf:9d:45"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <target dev="tap9791d20b-bf"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/console.log" append="off"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <video>
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </video>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:26:48 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:26:48 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:26:48 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:26:48 compute-0 nova_compute[186544]: </domain>
Nov 22 08:26:48 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.689 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Preparing to wait for external event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.689 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.689 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.690 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.690 186548 DEBUG nova.virt.libvirt.vif [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:26:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-415050896',display_name='tempest-TestNetworkAdvancedServerOps-server-415050896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-415050896',id=161,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE10HET3xVG9kYYEFJbNoeXgZQQGPJCraywaCGccNjoZ4Ok7GHB8Zbg7DUabYTyv6kCKJW3QTAclno8OGt86G2miChntJewa0KwQEOoXH+w0t853r2FKBAA8hWK6fNplag==',key_name='tempest-TestNetworkAdvancedServerOps-1677817121',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-ic6oxu06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:26:40Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1fd3f032-9ff9-42af-aa81-ca9677695572,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.691 186548 DEBUG nova.network.os_vif_util [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.692 186548 DEBUG nova.network.os_vif_util [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.692 186548 DEBUG os_vif [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.692 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.693 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.693 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.696 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.697 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9791d20b-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.697 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9791d20b-bf, col_values=(('external_ids', {'iface-id': '9791d20b-bf49-4625-a399-3764afd01f2b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:9d:45', 'vm-uuid': '1fd3f032-9ff9-42af-aa81-ca9677695572'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:26:48 compute-0 NetworkManager[55036]: <info>  [1763800008.7028] manager: (tap9791d20b-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.703 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.708 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.709 186548 INFO os_vif [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf')
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.779 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.780 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.780 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:cf:9d:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:26:48 compute-0 nova_compute[186544]: 2025-11-22 08:26:48.781 186548 INFO nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Using config drive
Nov 22 08:26:49 compute-0 nova_compute[186544]: 2025-11-22 08:26:49.730 186548 INFO nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Creating config drive at /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.config
Nov 22 08:26:49 compute-0 nova_compute[186544]: 2025-11-22 08:26:49.735 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv4pvzyqe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:26:49 compute-0 nova_compute[186544]: 2025-11-22 08:26:49.861 186548 DEBUG oslo_concurrency.processutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv4pvzyqe" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:26:49 compute-0 kernel: tap9791d20b-bf: entered promiscuous mode
Nov 22 08:26:49 compute-0 ovn_controller[94843]: 2025-11-22T08:26:49Z|00744|binding|INFO|Claiming lport 9791d20b-bf49-4625-a399-3764afd01f2b for this chassis.
Nov 22 08:26:49 compute-0 ovn_controller[94843]: 2025-11-22T08:26:49Z|00745|binding|INFO|9791d20b-bf49-4625-a399-3764afd01f2b: Claiming fa:16:3e:cf:9d:45 10.100.0.11
Nov 22 08:26:49 compute-0 NetworkManager[55036]: <info>  [1763800009.9463] manager: (tap9791d20b-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/342)
Nov 22 08:26:49 compute-0 nova_compute[186544]: 2025-11-22 08:26:49.944 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:49 compute-0 nova_compute[186544]: 2025-11-22 08:26:49.950 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:49 compute-0 systemd-udevd[244604]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:26:49 compute-0 NetworkManager[55036]: <info>  [1763800009.9886] device (tap9791d20b-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:26:49 compute-0 NetworkManager[55036]: <info>  [1763800009.9895] device (tap9791d20b-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:26:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:49.996 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:9d:45 10.100.0.11'], port_security=['fa:16:3e:cf:9d:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fd3f032-9ff9-42af-aa81-ca9677695572', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-803d80ea-824c-483f-b6fb-c444b8aec93a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': '06bbe5b1-e2a7-4761-b674-e4ef7686889b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ac404e1-df81-45d0-a211-7eca3c22b09e, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=9791d20b-bf49-4625-a399-3764afd01f2b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:26:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:49.997 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 9791d20b-bf49-4625-a399-3764afd01f2b in datapath 803d80ea-824c-483f-b6fb-c444b8aec93a bound to our chassis
Nov 22 08:26:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:49.998 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 803d80ea-824c-483f-b6fb-c444b8aec93a
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:50 compute-0 systemd-machined[152872]: New machine qemu-85-instance-000000a1.
Nov 22 08:26:50 compute-0 ovn_controller[94843]: 2025-11-22T08:26:50Z|00746|binding|INFO|Setting lport 9791d20b-bf49-4625-a399-3764afd01f2b ovn-installed in OVS
Nov 22 08:26:50 compute-0 ovn_controller[94843]: 2025-11-22T08:26:50Z|00747|binding|INFO|Setting lport 9791d20b-bf49-4625-a399-3764afd01f2b up in Southbound
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.010 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.010 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9b565bbc-aa4d-40b2-b2e2-ca9964adf825]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.012 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap803d80ea-81 in ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.013 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap803d80ea-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.014 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed11dca-4d23-4070-94a7-8287866ffab6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.016 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f584e079-50a2-4664-8630-613a181f21a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 systemd[1]: Started Virtual Machine qemu-85-instance-000000a1.
Nov 22 08:26:50 compute-0 podman[244589]: 2025-11-22 08:26:50.025652438 +0000 UTC m=+0.088745605 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.032 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[194486ba-094e-4237-9105-cb169b324e8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.056 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fdec71f8-ffac-4f1c-aa41-e026dd609d92]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.086 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f7f9e8-3986-4bbb-941a-ffece597877e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.091 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f11a1f67-93c4-4137-b231-80ee56cabd82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 NetworkManager[55036]: <info>  [1763800010.0924] manager: (tap803d80ea-80): new Veth device (/org/freedesktop/NetworkManager/Devices/343)
Nov 22 08:26:50 compute-0 systemd-udevd[244613]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.128 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c769653e-c189-40e3-ad62-0116a6aaeab7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.133 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[48981cee-805a-45a2-99b0-38f4af711135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 NetworkManager[55036]: <info>  [1763800010.1573] device (tap803d80ea-80): carrier: link connected
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.164 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[ac04c98c-9e1d-45d1-a3c5-4388c3ddd246]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.181 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e4d227-e684-4546-9681-1fe9575d29b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap803d80ea-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:32:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675079, 'reachable_time': 37706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244649, 'error': None, 'target': 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.198 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[33561102-3359-4c75-b236-aebd71e5aec8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:32f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675079, 'tstamp': 675079}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244650, 'error': None, 'target': 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.215 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6094cf1c-be29-4a18-abca-c11a551cd05f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap803d80ea-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:32:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675079, 'reachable_time': 37706, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244651, 'error': None, 'target': 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.248 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6e311361-cc88-43e2-b407-a312964564e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.316 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1b89447c-f8e4-4c53-a983-429165d6fad3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.319 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap803d80ea-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.320 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.320 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap803d80ea-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:26:50 compute-0 NetworkManager[55036]: <info>  [1763800010.3232] manager: (tap803d80ea-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Nov 22 08:26:50 compute-0 kernel: tap803d80ea-80: entered promiscuous mode
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.326 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap803d80ea-80, col_values=(('external_ids', {'iface-id': '2559dc1e-bdcf-4850-9e0f-c2d46823e533'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:26:50 compute-0 ovn_controller[94843]: 2025-11-22T08:26:50Z|00748|binding|INFO|Releasing lport 2559dc1e-bdcf-4850-9e0f-c2d46823e533 from this chassis (sb_readonly=0)
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.341 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/803d80ea-824c-483f-b6fb-c444b8aec93a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/803d80ea-824c-483f-b6fb-c444b8aec93a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.342 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6d3b2dba-ff4e-4ca7-9237-7d553690d273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.343 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-803d80ea-824c-483f-b6fb-c444b8aec93a
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/803d80ea-824c-483f-b6fb-c444b8aec93a.pid.haproxy
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 803d80ea-824c-483f-b6fb-c444b8aec93a
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.343 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'env', 'PROCESS_TAG=haproxy-803d80ea-824c-483f-b6fb-c444b8aec93a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/803d80ea-824c-483f-b6fb-c444b8aec93a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.398 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800010.397628, 1fd3f032-9ff9-42af-aa81-ca9677695572 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.398 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] VM Started (Lifecycle Event)
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.419 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.422 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800010.3978686, 1fd3f032-9ff9-42af-aa81-ca9677695572 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.423 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] VM Paused (Lifecycle Event)
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.447 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.451 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.471 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.586 186548 DEBUG nova.compute.manager [req-e4630f97-a1c1-44dd-b846-31261f9b4937 req-8a9315a6-d026-4420-ad22-6f316c64b558 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.589 186548 DEBUG oslo_concurrency.lockutils [req-e4630f97-a1c1-44dd-b846-31261f9b4937 req-8a9315a6-d026-4420-ad22-6f316c64b558 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.590 186548 DEBUG oslo_concurrency.lockutils [req-e4630f97-a1c1-44dd-b846-31261f9b4937 req-8a9315a6-d026-4420-ad22-6f316c64b558 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.590 186548 DEBUG oslo_concurrency.lockutils [req-e4630f97-a1c1-44dd-b846-31261f9b4937 req-8a9315a6-d026-4420-ad22-6f316c64b558 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.591 186548 DEBUG nova.compute.manager [req-e4630f97-a1c1-44dd-b846-31261f9b4937 req-8a9315a6-d026-4420-ad22-6f316c64b558 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Processing event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.591 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.595 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800010.5948205, 1fd3f032-9ff9-42af-aa81-ca9677695572 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.595 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] VM Resumed (Lifecycle Event)
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.597 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.600 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance spawned successfully.
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.600 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.621 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.625 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.626 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.626 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.627 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.627 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.628 186548 DEBUG nova.virt.libvirt.driver [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.632 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.668 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.716 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:50.716 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.720 186548 INFO nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Took 9.87 seconds to spawn the instance on the hypervisor.
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.720 186548 DEBUG nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:26:50 compute-0 podman[244689]: 2025-11-22 08:26:50.687682911 +0000 UTC m=+0.020839134 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.810 186548 INFO nova.compute.manager [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Took 10.69 seconds to build instance.
Nov 22 08:26:50 compute-0 nova_compute[186544]: 2025-11-22 08:26:50.842 186548 DEBUG oslo_concurrency.lockutils [None req-132a46d9-41bf-4520-9c0e-b7b6a3c42530 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:50 compute-0 podman[244689]: 2025-11-22 08:26:50.885655763 +0000 UTC m=+0.218811986 container create 8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:26:50 compute-0 systemd[1]: Started libpod-conmon-8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096.scope.
Nov 22 08:26:50 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:26:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20b4ab5f7307199035129b8e7bf9b934f7de92daccf788c2de340cca9a2a833d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:26:51 compute-0 podman[244689]: 2025-11-22 08:26:51.052166201 +0000 UTC m=+0.385322444 container init 8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:26:51 compute-0 podman[244689]: 2025-11-22 08:26:51.059694407 +0000 UTC m=+0.392850630 container start 8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:26:51 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[244704]: [NOTICE]   (244708) : New worker (244710) forked
Nov 22 08:26:51 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[244704]: [NOTICE]   (244708) : Loading success.
Nov 22 08:26:51 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:26:51.194 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:26:51 compute-0 nova_compute[186544]: 2025-11-22 08:26:51.296 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:51 compute-0 nova_compute[186544]: 2025-11-22 08:26:51.620 186548 DEBUG nova.network.neutron [req-46456b3a-3629-4247-8411-10ed92d1741c req-6b9bf06a-aa96-4e47-b54e-af5815834125 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updated VIF entry in instance network info cache for port 9791d20b-bf49-4625-a399-3764afd01f2b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:26:51 compute-0 nova_compute[186544]: 2025-11-22 08:26:51.620 186548 DEBUG nova.network.neutron [req-46456b3a-3629-4247-8411-10ed92d1741c req-6b9bf06a-aa96-4e47-b54e-af5815834125 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updating instance_info_cache with network_info: [{"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:26:51 compute-0 nova_compute[186544]: 2025-11-22 08:26:51.635 186548 DEBUG oslo_concurrency.lockutils [req-46456b3a-3629-4247-8411-10ed92d1741c req-6b9bf06a-aa96-4e47-b54e-af5815834125 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:26:52 compute-0 nova_compute[186544]: 2025-11-22 08:26:52.699 186548 DEBUG nova.compute.manager [req-593eb908-59da-4eaf-8bcc-a437015f481a req-165d8351-2617-4aa8-95ee-ba722334738e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:26:52 compute-0 nova_compute[186544]: 2025-11-22 08:26:52.700 186548 DEBUG oslo_concurrency.lockutils [req-593eb908-59da-4eaf-8bcc-a437015f481a req-165d8351-2617-4aa8-95ee-ba722334738e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:26:52 compute-0 nova_compute[186544]: 2025-11-22 08:26:52.700 186548 DEBUG oslo_concurrency.lockutils [req-593eb908-59da-4eaf-8bcc-a437015f481a req-165d8351-2617-4aa8-95ee-ba722334738e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:26:52 compute-0 nova_compute[186544]: 2025-11-22 08:26:52.700 186548 DEBUG oslo_concurrency.lockutils [req-593eb908-59da-4eaf-8bcc-a437015f481a req-165d8351-2617-4aa8-95ee-ba722334738e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:26:52 compute-0 nova_compute[186544]: 2025-11-22 08:26:52.700 186548 DEBUG nova.compute.manager [req-593eb908-59da-4eaf-8bcc-a437015f481a req-165d8351-2617-4aa8-95ee-ba722334738e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] No waiting events found dispatching network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:26:52 compute-0 nova_compute[186544]: 2025-11-22 08:26:52.700 186548 WARNING nova.compute.manager [req-593eb908-59da-4eaf-8bcc-a437015f481a req-165d8351-2617-4aa8-95ee-ba722334738e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received unexpected event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b for instance with vm_state active and task_state None.
Nov 22 08:26:53 compute-0 podman[244719]: 2025-11-22 08:26:53.405406776 +0000 UTC m=+0.057787912 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:26:53 compute-0 podman[244720]: 2025-11-22 08:26:53.414207243 +0000 UTC m=+0.063456093 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 08:26:53 compute-0 nova_compute[186544]: 2025-11-22 08:26:53.701 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:53 compute-0 nova_compute[186544]: 2025-11-22 08:26:53.869 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:53 compute-0 ovn_controller[94843]: 2025-11-22T08:26:53Z|00749|binding|INFO|Releasing lport 2559dc1e-bdcf-4850-9e0f-c2d46823e533 from this chassis (sb_readonly=0)
Nov 22 08:26:53 compute-0 NetworkManager[55036]: <info>  [1763800013.8868] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Nov 22 08:26:53 compute-0 NetworkManager[55036]: <info>  [1763800013.8875] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Nov 22 08:26:53 compute-0 nova_compute[186544]: 2025-11-22 08:26:53.900 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:53 compute-0 ovn_controller[94843]: 2025-11-22T08:26:53Z|00750|binding|INFO|Releasing lport 2559dc1e-bdcf-4850-9e0f-c2d46823e533 from this chassis (sb_readonly=0)
Nov 22 08:26:53 compute-0 nova_compute[186544]: 2025-11-22 08:26:53.905 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:54 compute-0 nova_compute[186544]: 2025-11-22 08:26:54.350 186548 DEBUG nova.compute.manager [req-17bb88dc-660e-421e-852b-5d0620486bb3 req-e34ac3c7-3f93-4b87-a03d-89ba023e1af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-changed-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:26:54 compute-0 nova_compute[186544]: 2025-11-22 08:26:54.350 186548 DEBUG nova.compute.manager [req-17bb88dc-660e-421e-852b-5d0620486bb3 req-e34ac3c7-3f93-4b87-a03d-89ba023e1af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Refreshing instance network info cache due to event network-changed-9791d20b-bf49-4625-a399-3764afd01f2b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:26:54 compute-0 nova_compute[186544]: 2025-11-22 08:26:54.350 186548 DEBUG oslo_concurrency.lockutils [req-17bb88dc-660e-421e-852b-5d0620486bb3 req-e34ac3c7-3f93-4b87-a03d-89ba023e1af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:26:54 compute-0 nova_compute[186544]: 2025-11-22 08:26:54.351 186548 DEBUG oslo_concurrency.lockutils [req-17bb88dc-660e-421e-852b-5d0620486bb3 req-e34ac3c7-3f93-4b87-a03d-89ba023e1af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:26:54 compute-0 nova_compute[186544]: 2025-11-22 08:26:54.351 186548 DEBUG nova.network.neutron [req-17bb88dc-660e-421e-852b-5d0620486bb3 req-e34ac3c7-3f93-4b87-a03d-89ba023e1af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Refreshing network info cache for port 9791d20b-bf49-4625-a399-3764afd01f2b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:26:55 compute-0 nova_compute[186544]: 2025-11-22 08:26:55.668 186548 DEBUG nova.network.neutron [req-17bb88dc-660e-421e-852b-5d0620486bb3 req-e34ac3c7-3f93-4b87-a03d-89ba023e1af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updated VIF entry in instance network info cache for port 9791d20b-bf49-4625-a399-3764afd01f2b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:26:55 compute-0 nova_compute[186544]: 2025-11-22 08:26:55.669 186548 DEBUG nova.network.neutron [req-17bb88dc-660e-421e-852b-5d0620486bb3 req-e34ac3c7-3f93-4b87-a03d-89ba023e1af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updating instance_info_cache with network_info: [{"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:26:55 compute-0 nova_compute[186544]: 2025-11-22 08:26:55.718 186548 DEBUG oslo_concurrency.lockutils [req-17bb88dc-660e-421e-852b-5d0620486bb3 req-e34ac3c7-3f93-4b87-a03d-89ba023e1af3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:26:56 compute-0 nova_compute[186544]: 2025-11-22 08:26:56.298 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:26:58 compute-0 nova_compute[186544]: 2025-11-22 08:26:58.705 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:00.198 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:01 compute-0 nova_compute[186544]: 2025-11-22 08:27:01.300 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:03 compute-0 nova_compute[186544]: 2025-11-22 08:27:03.708 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:06 compute-0 nova_compute[186544]: 2025-11-22 08:27:06.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:07 compute-0 ovn_controller[94843]: 2025-11-22T08:27:07Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cf:9d:45 10.100.0.11
Nov 22 08:27:07 compute-0 ovn_controller[94843]: 2025-11-22T08:27:07Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cf:9d:45 10.100.0.11
Nov 22 08:27:08 compute-0 nova_compute[186544]: 2025-11-22 08:27:08.710 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.192 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.270 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.343 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.344 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.404 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.626 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.627 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5489MB free_disk=73.10773468017578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.628 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.628 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.707 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 1fd3f032-9ff9-42af-aa81-ca9677695572 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.709 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.709 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.755 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.770 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.796 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:27:09 compute-0 nova_compute[186544]: 2025-11-22 08:27:09.797 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:10 compute-0 podman[244793]: 2025-11-22 08:27:10.423145046 +0000 UTC m=+0.055687591 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:27:10 compute-0 podman[244791]: 2025-11-22 08:27:10.429483783 +0000 UTC m=+0.069285177 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:27:10 compute-0 podman[244792]: 2025-11-22 08:27:10.430630711 +0000 UTC m=+0.067344319 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:27:10 compute-0 podman[244799]: 2025-11-22 08:27:10.497308922 +0000 UTC m=+0.123539241 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:27:11 compute-0 nova_compute[186544]: 2025-11-22 08:27:11.307 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:11 compute-0 nova_compute[186544]: 2025-11-22 08:27:11.798 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:11 compute-0 nova_compute[186544]: 2025-11-22 08:27:11.798 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:27:13 compute-0 nova_compute[186544]: 2025-11-22 08:27:13.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:13 compute-0 nova_compute[186544]: 2025-11-22 08:27:13.713 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:14 compute-0 nova_compute[186544]: 2025-11-22 08:27:14.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:14 compute-0 nova_compute[186544]: 2025-11-22 08:27:14.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:27:14 compute-0 nova_compute[186544]: 2025-11-22 08:27:14.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:27:14 compute-0 nova_compute[186544]: 2025-11-22 08:27:14.476 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:27:14 compute-0 nova_compute[186544]: 2025-11-22 08:27:14.477 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:27:14 compute-0 nova_compute[186544]: 2025-11-22 08:27:14.477 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:27:14 compute-0 nova_compute[186544]: 2025-11-22 08:27:14.477 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:15 compute-0 nova_compute[186544]: 2025-11-22 08:27:15.784 186548 INFO nova.compute.manager [None req-a3613891-0c0a-425a-8d82-bbe4ec282f3c d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Get console output
Nov 22 08:27:15 compute-0 nova_compute[186544]: 2025-11-22 08:27:15.789 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:27:15 compute-0 nova_compute[186544]: 2025-11-22 08:27:15.964 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updating instance_info_cache with network_info: [{"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:27:15 compute-0 nova_compute[186544]: 2025-11-22 08:27:15.984 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:27:15 compute-0 nova_compute[186544]: 2025-11-22 08:27:15.984 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:27:16 compute-0 nova_compute[186544]: 2025-11-22 08:27:16.307 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:16 compute-0 nova_compute[186544]: 2025-11-22 08:27:16.802 186548 INFO nova.compute.manager [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Rebuilding instance
Nov 22 08:27:16 compute-0 nova_compute[186544]: 2025-11-22 08:27:16.978 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:17 compute-0 nova_compute[186544]: 2025-11-22 08:27:17.001 186548 DEBUG nova.compute.manager [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:27:17 compute-0 nova_compute[186544]: 2025-11-22 08:27:17.069 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_requests' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:17 compute-0 nova_compute[186544]: 2025-11-22 08:27:17.079 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:17 compute-0 nova_compute[186544]: 2025-11-22 08:27:17.087 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:17 compute-0 nova_compute[186544]: 2025-11-22 08:27:17.093 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:17 compute-0 nova_compute[186544]: 2025-11-22 08:27:17.099 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:27:17 compute-0 nova_compute[186544]: 2025-11-22 08:27:17.101 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 22 08:27:17 compute-0 nova_compute[186544]: 2025-11-22 08:27:17.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:18 compute-0 nova_compute[186544]: 2025-11-22 08:27:18.715 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:20 compute-0 kernel: tap9791d20b-bf (unregistering): left promiscuous mode
Nov 22 08:27:20 compute-0 NetworkManager[55036]: <info>  [1763800040.2669] device (tap9791d20b-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:27:20 compute-0 ovn_controller[94843]: 2025-11-22T08:27:20Z|00751|binding|INFO|Releasing lport 9791d20b-bf49-4625-a399-3764afd01f2b from this chassis (sb_readonly=0)
Nov 22 08:27:20 compute-0 ovn_controller[94843]: 2025-11-22T08:27:20Z|00752|binding|INFO|Setting lport 9791d20b-bf49-4625-a399-3764afd01f2b down in Southbound
Nov 22 08:27:20 compute-0 ovn_controller[94843]: 2025-11-22T08:27:20Z|00753|binding|INFO|Removing iface tap9791d20b-bf ovn-installed in OVS
Nov 22 08:27:20 compute-0 nova_compute[186544]: 2025-11-22 08:27:20.276 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:20 compute-0 nova_compute[186544]: 2025-11-22 08:27:20.294 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:20 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Nov 22 08:27:20 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000a1.scope: Consumed 15.303s CPU time.
Nov 22 08:27:20 compute-0 systemd-machined[152872]: Machine qemu-85-instance-000000a1 terminated.
Nov 22 08:27:20 compute-0 podman[244871]: 2025-11-22 08:27:20.382061835 +0000 UTC m=+0.074079594 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 08:27:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:20.494 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:9d:45 10.100.0.11'], port_security=['fa:16:3e:cf:9d:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fd3f032-9ff9-42af-aa81-ca9677695572', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-803d80ea-824c-483f-b6fb-c444b8aec93a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '06bbe5b1-e2a7-4761-b674-e4ef7686889b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ac404e1-df81-45d0-a211-7eca3c22b09e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=9791d20b-bf49-4625-a399-3764afd01f2b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:27:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:20.495 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 9791d20b-bf49-4625-a399-3764afd01f2b in datapath 803d80ea-824c-483f-b6fb-c444b8aec93a unbound from our chassis
Nov 22 08:27:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:20.497 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 803d80ea-824c-483f-b6fb-c444b8aec93a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:27:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:20.498 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[277dbda9-1136-4024-b741-25ea1ba6818c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:20.498 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a namespace which is not needed anymore
Nov 22 08:27:20 compute-0 nova_compute[186544]: 2025-11-22 08:27:20.508 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:20 compute-0 nova_compute[186544]: 2025-11-22 08:27:20.511 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:20 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[244704]: [NOTICE]   (244708) : haproxy version is 2.8.14-c23fe91
Nov 22 08:27:20 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[244704]: [NOTICE]   (244708) : path to executable is /usr/sbin/haproxy
Nov 22 08:27:20 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[244704]: [WARNING]  (244708) : Exiting Master process...
Nov 22 08:27:20 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[244704]: [ALERT]    (244708) : Current worker (244710) exited with code 143 (Terminated)
Nov 22 08:27:20 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[244704]: [WARNING]  (244708) : All workers exited. Exiting... (0)
Nov 22 08:27:20 compute-0 systemd[1]: libpod-8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096.scope: Deactivated successfully.
Nov 22 08:27:20 compute-0 podman[244931]: 2025-11-22 08:27:20.874214987 +0000 UTC m=+0.276273500 container died 8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:27:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096-userdata-shm.mount: Deactivated successfully.
Nov 22 08:27:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-20b4ab5f7307199035129b8e7bf9b934f7de92daccf788c2de340cca9a2a833d-merged.mount: Deactivated successfully.
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.007 186548 DEBUG nova.compute.manager [req-804f1f15-b5a6-4f83-86d6-74c3deb002fe req-99128128-0de9-4ad8-b08d-c51376045ce7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-unplugged-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.008 186548 DEBUG oslo_concurrency.lockutils [req-804f1f15-b5a6-4f83-86d6-74c3deb002fe req-99128128-0de9-4ad8-b08d-c51376045ce7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.008 186548 DEBUG oslo_concurrency.lockutils [req-804f1f15-b5a6-4f83-86d6-74c3deb002fe req-99128128-0de9-4ad8-b08d-c51376045ce7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.009 186548 DEBUG oslo_concurrency.lockutils [req-804f1f15-b5a6-4f83-86d6-74c3deb002fe req-99128128-0de9-4ad8-b08d-c51376045ce7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.009 186548 DEBUG nova.compute.manager [req-804f1f15-b5a6-4f83-86d6-74c3deb002fe req-99128128-0de9-4ad8-b08d-c51376045ce7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] No waiting events found dispatching network-vif-unplugged-9791d20b-bf49-4625-a399-3764afd01f2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.009 186548 WARNING nova.compute.manager [req-804f1f15-b5a6-4f83-86d6-74c3deb002fe req-99128128-0de9-4ad8-b08d-c51376045ce7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received unexpected event network-vif-unplugged-9791d20b-bf49-4625-a399-3764afd01f2b for instance with vm_state active and task_state rebuilding.
Nov 22 08:27:21 compute-0 podman[244931]: 2025-11-22 08:27:21.056977805 +0000 UTC m=+0.459036318 container cleanup 8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:27:21 compute-0 systemd[1]: libpod-conmon-8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096.scope: Deactivated successfully.
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.123 186548 INFO nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance shutdown successfully after 4 seconds.
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.130 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance destroyed successfully.
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.137 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance destroyed successfully.
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.138 186548 DEBUG nova.virt.libvirt.vif [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:26:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-415050896',display_name='tempest-TestNetworkAdvancedServerOps-server-415050896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-415050896',id=161,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE10HET3xVG9kYYEFJbNoeXgZQQGPJCraywaCGccNjoZ4Ok7GHB8Zbg7DUabYTyv6kCKJW3QTAclno8OGt86G2miChntJewa0KwQEOoXH+w0t853r2FKBAA8hWK6fNplag==',key_name='tempest-TestNetworkAdvancedServerOps-1677817121',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:26:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-ic6oxu06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:27:16Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1fd3f032-9ff9-42af-aa81-ca9677695572,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.139 186548 DEBUG nova.network.os_vif_util [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.140 186548 DEBUG nova.network.os_vif_util [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.140 186548 DEBUG os_vif [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.143 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9791d20b-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.146 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.149 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.156 186548 INFO os_vif [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf')
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.158 186548 INFO nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Deleting instance files /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572_del
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.159 186548 INFO nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Deletion of /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572_del complete
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.310 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:21 compute-0 podman[244960]: 2025-11-22 08:27:21.363816217 +0000 UTC m=+0.286583645 container remove 8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.371 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[68042943-f91c-4471-8546-9fd092f4313c]: (4, ('Sat Nov 22 08:27:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a (8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096)\n8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096\nSat Nov 22 08:27:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a (8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096)\n8feb8a28dbffeee42462bdae57dfdd0810be261d60c54868ed0a4c5bc5642096\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.374 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fe42e0d4-da4f-48ad-b737-8230daf4f9fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.375 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap803d80ea-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.378 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:21 compute-0 kernel: tap803d80ea-80: left promiscuous mode
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.391 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.394 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0afc30-063f-43d4-9e0c-b88ee21397fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.412 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5a0b87-300d-442d-83f8-8f053392b614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.414 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[85593f7b-2f9d-4222-bc00-6f5879336d66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.430 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a06ddf39-66ab-4b92-bbc9-3044343e3f43]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675071, 'reachable_time': 36277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244975, 'error': None, 'target': 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d803d80ea\x2d824c\x2d483f\x2db6fb\x2dc444b8aec93a.mount: Deactivated successfully.
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.435 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:27:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:21.435 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[8726f7eb-9164-4af4-8a61-3f1d62cc1767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.461 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.462 186548 INFO nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Creating image(s)
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.462 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.463 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.464 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.479 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.553 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.554 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "2882af3479446958b785a3f508ce087a26493f42" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.555 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.567 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.635 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.636 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.839 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42,backing_fmt=raw /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk 1073741824" returned: 0 in 0.202s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.840 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "2882af3479446958b785a3f508ce087a26493f42" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.841 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.897 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.898 186548 DEBUG nova.virt.disk.api [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.898 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.955 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.956 186548 DEBUG nova.virt.disk.api [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.957 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.957 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Ensure instance console log exists: /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.957 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.958 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.958 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.960 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Start _get_guest_xml network_info=[{"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.966 186548 WARNING nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.970 186548 DEBUG nova.virt.libvirt.host [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.971 186548 DEBUG nova.virt.libvirt.host [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.973 186548 DEBUG nova.virt.libvirt.host [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.974 186548 DEBUG nova.virt.libvirt.host [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.975 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.975 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:39:01Z,direct_url=<?>,disk_format='qcow2',id=360f90ca-2ddb-4e60-a48e-364e3b48bd96,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:02Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.976 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.976 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.976 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.977 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.977 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.977 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.977 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.978 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.978 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.978 186548 DEBUG nova.virt.hardware [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.978 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.991 186548 DEBUG nova.virt.libvirt.vif [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:26:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-415050896',display_name='tempest-TestNetworkAdvancedServerOps-server-415050896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-415050896',id=161,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE10HET3xVG9kYYEFJbNoeXgZQQGPJCraywaCGccNjoZ4Ok7GHB8Zbg7DUabYTyv6kCKJW3QTAclno8OGt86G2miChntJewa0KwQEOoXH+w0t853r2FKBAA8hWK6fNplag==',key_name='tempest-TestNetworkAdvancedServerOps-1677817121',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:26:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-ic6oxu06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:27:21Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1fd3f032-9ff9-42af-aa81-ca9677695572,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.992 186548 DEBUG nova.network.os_vif_util [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.993 186548 DEBUG nova.network.os_vif_util [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.995 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <uuid>1fd3f032-9ff9-42af-aa81-ca9677695572</uuid>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <name>instance-000000a1</name>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-415050896</nova:name>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:27:21</nova:creationTime>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:27:21 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:27:21 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:27:21 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:27:21 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:27:21 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:27:21 compute-0 nova_compute[186544]:         <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 08:27:21 compute-0 nova_compute[186544]:         <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="360f90ca-2ddb-4e60-a48e-364e3b48bd96"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:27:21 compute-0 nova_compute[186544]:         <nova:port uuid="9791d20b-bf49-4625-a399-3764afd01f2b">
Nov 22 08:27:21 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <system>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <entry name="serial">1fd3f032-9ff9-42af-aa81-ca9677695572</entry>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <entry name="uuid">1fd3f032-9ff9-42af-aa81-ca9677695572</entry>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </system>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <os>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   </os>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <features>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   </features>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.config"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:cf:9d:45"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <target dev="tap9791d20b-bf"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/console.log" append="off"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <video>
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </video>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:27:21 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:27:21 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:27:21 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:27:21 compute-0 nova_compute[186544]: </domain>
Nov 22 08:27:21 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.996 186548 DEBUG nova.virt.libvirt.vif [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:26:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-415050896',display_name='tempest-TestNetworkAdvancedServerOps-server-415050896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-415050896',id=161,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE10HET3xVG9kYYEFJbNoeXgZQQGPJCraywaCGccNjoZ4Ok7GHB8Zbg7DUabYTyv6kCKJW3QTAclno8OGt86G2miChntJewa0KwQEOoXH+w0t853r2FKBAA8hWK6fNplag==',key_name='tempest-TestNetworkAdvancedServerOps-1677817121',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:26:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-ic6oxu06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:27:21Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1fd3f032-9ff9-42af-aa81-ca9677695572,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.997 186548 DEBUG nova.network.os_vif_util [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.998 186548 DEBUG nova.network.os_vif_util [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.998 186548 DEBUG os_vif [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.999 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:21 compute-0 nova_compute[186544]: 2025-11-22 08:27:21.999 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.000 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.003 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.004 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9791d20b-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.004 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9791d20b-bf, col_values=(('external_ids', {'iface-id': '9791d20b-bf49-4625-a399-3764afd01f2b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:9d:45', 'vm-uuid': '1fd3f032-9ff9-42af-aa81-ca9677695572'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:22 compute-0 NetworkManager[55036]: <info>  [1763800042.0075] manager: (tap9791d20b-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.007 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.009 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.016 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.017 186548 INFO os_vif [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf')
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.085 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.085 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.085 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:cf:9d:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.086 186548 INFO nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Using config drive
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.096 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.119 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'keypairs' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.616 186548 INFO nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Creating config drive at /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.config
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.622 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo_w84qd1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.749 186548 DEBUG oslo_concurrency.processutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo_w84qd1" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:27:22 compute-0 NetworkManager[55036]: <info>  [1763800042.8190] manager: (tap9791d20b-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/348)
Nov 22 08:27:22 compute-0 kernel: tap9791d20b-bf: entered promiscuous mode
Nov 22 08:27:22 compute-0 systemd-udevd[244880]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.821 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:22 compute-0 ovn_controller[94843]: 2025-11-22T08:27:22Z|00754|binding|INFO|Claiming lport 9791d20b-bf49-4625-a399-3764afd01f2b for this chassis.
Nov 22 08:27:22 compute-0 ovn_controller[94843]: 2025-11-22T08:27:22Z|00755|binding|INFO|9791d20b-bf49-4625-a399-3764afd01f2b: Claiming fa:16:3e:cf:9d:45 10.100.0.11
Nov 22 08:27:22 compute-0 ovn_controller[94843]: 2025-11-22T08:27:22Z|00756|binding|INFO|Setting lport 9791d20b-bf49-4625-a399-3764afd01f2b ovn-installed in OVS
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.836 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:22 compute-0 NetworkManager[55036]: <info>  [1763800042.8382] device (tap9791d20b-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:27:22 compute-0 nova_compute[186544]: 2025-11-22 08:27:22.837 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:22 compute-0 NetworkManager[55036]: <info>  [1763800042.8395] device (tap9791d20b-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:27:22 compute-0 systemd-machined[152872]: New machine qemu-86-instance-000000a1.
Nov 22 08:27:22 compute-0 systemd[1]: Started Virtual Machine qemu-86-instance-000000a1.
Nov 22 08:27:22 compute-0 ovn_controller[94843]: 2025-11-22T08:27:22Z|00757|binding|INFO|Setting lport 9791d20b-bf49-4625-a399-3764afd01f2b up in Southbound
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.910 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:9d:45 10.100.0.11'], port_security=['fa:16:3e:cf:9d:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fd3f032-9ff9-42af-aa81-ca9677695572', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-803d80ea-824c-483f-b6fb-c444b8aec93a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '5', 'neutron:security_group_ids': '06bbe5b1-e2a7-4761-b674-e4ef7686889b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ac404e1-df81-45d0-a211-7eca3c22b09e, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=9791d20b-bf49-4625-a399-3764afd01f2b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.912 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 9791d20b-bf49-4625-a399-3764afd01f2b in datapath 803d80ea-824c-483f-b6fb-c444b8aec93a bound to our chassis
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.913 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 803d80ea-824c-483f-b6fb-c444b8aec93a
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.927 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[603073ed-4814-4319-a360-782982ddbf69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.928 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap803d80ea-81 in ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.930 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap803d80ea-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.931 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[be2da877-6352-4ed8-9180-d3c52831ba9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.932 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fba50538-d177-428d-9b68-7a0b05b43221]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.943 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[25677e60-c52f-49c6-9a89-8af251773324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.960 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3f657d48-5558-475e-8eca-55c59ce1ea80]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.992 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[aa97c82c-fc1f-45fc-81be-c51c1880a709]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:22.997 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4c24c018-5979-4978-b2a5-bfda4e274889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:22 compute-0 NetworkManager[55036]: <info>  [1763800042.9989] manager: (tap803d80ea-80): new Veth device (/org/freedesktop/NetworkManager/Devices/349)
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.034 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[103c12e9-9bf0-4e53-8c74-4fa6edaa116f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.039 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d71fc8-d945-4cf9-a38b-b690fac6142d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 NetworkManager[55036]: <info>  [1763800043.0671] device (tap803d80ea-80): carrier: link connected
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.072 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b389a887-2204-47d4-8f79-abbca5b6310d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.087 186548 DEBUG nova.compute.manager [req-a349fa6e-081e-48ee-b7bb-57181ba3eae8 req-8ad9d340-bff2-4365-8b61-61dd1395f766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.088 186548 DEBUG oslo_concurrency.lockutils [req-a349fa6e-081e-48ee-b7bb-57181ba3eae8 req-8ad9d340-bff2-4365-8b61-61dd1395f766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.088 186548 DEBUG oslo_concurrency.lockutils [req-a349fa6e-081e-48ee-b7bb-57181ba3eae8 req-8ad9d340-bff2-4365-8b61-61dd1395f766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.088 186548 DEBUG oslo_concurrency.lockutils [req-a349fa6e-081e-48ee-b7bb-57181ba3eae8 req-8ad9d340-bff2-4365-8b61-61dd1395f766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.089 186548 DEBUG nova.compute.manager [req-a349fa6e-081e-48ee-b7bb-57181ba3eae8 req-8ad9d340-bff2-4365-8b61-61dd1395f766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] No waiting events found dispatching network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.089 186548 WARNING nova.compute.manager [req-a349fa6e-081e-48ee-b7bb-57181ba3eae8 req-8ad9d340-bff2-4365-8b61-61dd1395f766 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received unexpected event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b for instance with vm_state active and task_state rebuild_spawning.
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.095 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3c141e4c-7730-4481-896a-176483d85396]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap803d80ea-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:32:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678370, 'reachable_time': 37743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245045, 'error': None, 'target': 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.117 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f94a21ee-cffb-44d1-858e-4e7f0cfbb9f4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:32f7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 678370, 'tstamp': 678370}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245047, 'error': None, 'target': 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.137 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d99579f2-cadc-4fa2-b310-277817d659c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap803d80ea-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:32:f7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678370, 'reachable_time': 37743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245048, 'error': None, 'target': 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.178 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[052a1885-1b57-473c-91bf-ad1268a1c15e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.190 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 1fd3f032-9ff9-42af-aa81-ca9677695572 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.190 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800043.1898744, 1fd3f032-9ff9-42af-aa81-ca9677695572 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.191 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] VM Resumed (Lifecycle Event)
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.193 186548 DEBUG nova.compute.manager [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.193 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.196 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance spawned successfully.
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.197 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.211 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.214 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.222 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.222 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.223 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.223 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.223 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.224 186548 DEBUG nova.virt.libvirt.driver [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.237 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a1f0d5-70f0-4534-9212-a63f3c99a066]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.239 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap803d80ea-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.239 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.240 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap803d80ea-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:23 compute-0 NetworkManager[55036]: <info>  [1763800043.2425] manager: (tap803d80ea-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Nov 22 08:27:23 compute-0 kernel: tap803d80ea-80: entered promiscuous mode
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.243 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.245 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap803d80ea-80, col_values=(('external_ids', {'iface-id': '2559dc1e-bdcf-4850-9e0f-c2d46823e533'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.246 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:23 compute-0 ovn_controller[94843]: 2025-11-22T08:27:23Z|00758|binding|INFO|Releasing lport 2559dc1e-bdcf-4850-9e0f-c2d46823e533 from this chassis (sb_readonly=0)
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.248 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/803d80ea-824c-483f-b6fb-c444b8aec93a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/803d80ea-824c-483f-b6fb-c444b8aec93a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.248 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[99105d9d-e259-4483-9952-4ad8abe716c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.249 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-803d80ea-824c-483f-b6fb-c444b8aec93a
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/803d80ea-824c-483f-b6fb-c444b8aec93a.pid.haproxy
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 803d80ea-824c-483f-b6fb-c444b8aec93a
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:27:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:23.250 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'env', 'PROCESS_TAG=haproxy-803d80ea-824c-483f-b6fb-c444b8aec93a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/803d80ea-824c-483f-b6fb-c444b8aec93a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.256 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.257 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800043.1912374, 1fd3f032-9ff9-42af-aa81-ca9677695572 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.257 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] VM Started (Lifecycle Event)
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.259 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.277 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.280 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.310 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.630 186548 DEBUG nova.compute.manager [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:27:23 compute-0 podman[245081]: 2025-11-22 08:27:23.576309587 +0000 UTC m=+0.022723580 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:27:23 compute-0 podman[245081]: 2025-11-22 08:27:23.719800839 +0000 UTC m=+0.166214802 container create bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.819 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.820 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.820 186548 DEBUG nova.objects.instance [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 22 08:27:23 compute-0 systemd[1]: Started libpod-conmon-bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2.scope.
Nov 22 08:27:23 compute-0 podman[245094]: 2025-11-22 08:27:23.84863603 +0000 UTC m=+0.081593929 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:27:23 compute-0 podman[245095]: 2025-11-22 08:27:23.85717874 +0000 UTC m=+0.089084744 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:27:23 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:27:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb4beb14504536695aec55d5ad84cbfb741421a2d4675c2899ade043c06f5a36/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:27:23 compute-0 nova_compute[186544]: 2025-11-22 08:27:23.886 186548 DEBUG oslo_concurrency.lockutils [None req-112d6245-cb13-4d8f-93a6-f86878a75ea9 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:24 compute-0 podman[245081]: 2025-11-22 08:27:24.074111029 +0000 UTC m=+0.520525082 container init bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 08:27:24 compute-0 podman[245081]: 2025-11-22 08:27:24.08063048 +0000 UTC m=+0.527044443 container start bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 08:27:24 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[245131]: [NOTICE]   (245144) : New worker (245146) forked
Nov 22 08:27:24 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[245131]: [NOTICE]   (245144) : Loading success.
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.220 186548 DEBUG nova.compute.manager [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.221 186548 DEBUG oslo_concurrency.lockutils [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.221 186548 DEBUG oslo_concurrency.lockutils [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.221 186548 DEBUG oslo_concurrency.lockutils [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.221 186548 DEBUG nova.compute.manager [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] No waiting events found dispatching network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.222 186548 WARNING nova.compute.manager [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received unexpected event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b for instance with vm_state active and task_state None.
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.222 186548 DEBUG nova.compute.manager [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.222 186548 DEBUG oslo_concurrency.lockutils [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.222 186548 DEBUG oslo_concurrency.lockutils [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.223 186548 DEBUG oslo_concurrency.lockutils [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.223 186548 DEBUG nova.compute.manager [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] No waiting events found dispatching network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:27:25 compute-0 nova_compute[186544]: 2025-11-22 08:27:25.223 186548 WARNING nova.compute.manager [req-96b0f4d5-e4b0-43c7-b33e-fbb667d82b7d req-dcb57a69-4f4f-4718-ba09-6056ad491f4e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received unexpected event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b for instance with vm_state active and task_state None.
Nov 22 08:27:26 compute-0 nova_compute[186544]: 2025-11-22 08:27:26.313 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:27 compute-0 nova_compute[186544]: 2025-11-22 08:27:27.010 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:31 compute-0 nova_compute[186544]: 2025-11-22 08:27:31.313 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:32 compute-0 nova_compute[186544]: 2025-11-22 08:27:32.013 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:36 compute-0 nova_compute[186544]: 2025-11-22 08:27:36.315 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:37 compute-0 nova_compute[186544]: 2025-11-22 08:27:37.017 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:37.352 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:37.353 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:37.353 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:39 compute-0 ovn_controller[94843]: 2025-11-22T08:27:39Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cf:9d:45 10.100.0.11
Nov 22 08:27:39 compute-0 ovn_controller[94843]: 2025-11-22T08:27:39Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cf:9d:45 10.100.0.11
Nov 22 08:27:41 compute-0 nova_compute[186544]: 2025-11-22 08:27:41.318 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:41 compute-0 podman[245174]: 2025-11-22 08:27:41.410863049 +0000 UTC m=+0.051732174 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 08:27:41 compute-0 podman[245173]: 2025-11-22 08:27:41.436933431 +0000 UTC m=+0.084548903 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 08:27:41 compute-0 podman[245175]: 2025-11-22 08:27:41.452418002 +0000 UTC m=+0.090727014 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:27:41 compute-0 podman[245176]: 2025-11-22 08:27:41.453241052 +0000 UTC m=+0.088139861 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:27:42 compute-0 nova_compute[186544]: 2025-11-22 08:27:42.018 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:45 compute-0 nova_compute[186544]: 2025-11-22 08:27:45.581 186548 INFO nova.compute.manager [None req-a8d8eabf-8ba2-4709-a299-e6824c96ea3f d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Get console output
Nov 22 08:27:45 compute-0 nova_compute[186544]: 2025-11-22 08:27:45.587 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:27:46 compute-0 nova_compute[186544]: 2025-11-22 08:27:46.320 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:47 compute-0 nova_compute[186544]: 2025-11-22 08:27:47.022 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:49.619 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:27:49 compute-0 nova_compute[186544]: 2025-11-22 08:27:49.620 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:49 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:49.621 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:27:50 compute-0 ovn_controller[94843]: 2025-11-22T08:27:50Z|00759|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.322 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:51 compute-0 podman[245261]: 2025-11-22 08:27:51.410543221 +0000 UTC m=+0.059048774 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.418 186548 DEBUG nova.compute.manager [req-1b9d24c9-b0e0-4538-b8c5-42f6df114a47 req-7c6fb82f-19fc-4eb5-ae4a-4290690b79d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-changed-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.418 186548 DEBUG nova.compute.manager [req-1b9d24c9-b0e0-4538-b8c5-42f6df114a47 req-7c6fb82f-19fc-4eb5-ae4a-4290690b79d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Refreshing instance network info cache due to event network-changed-9791d20b-bf49-4625-a399-3764afd01f2b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.418 186548 DEBUG oslo_concurrency.lockutils [req-1b9d24c9-b0e0-4538-b8c5-42f6df114a47 req-7c6fb82f-19fc-4eb5-ae4a-4290690b79d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.418 186548 DEBUG oslo_concurrency.lockutils [req-1b9d24c9-b0e0-4538-b8c5-42f6df114a47 req-7c6fb82f-19fc-4eb5-ae4a-4290690b79d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.418 186548 DEBUG nova.network.neutron [req-1b9d24c9-b0e0-4538-b8c5-42f6df114a47 req-7c6fb82f-19fc-4eb5-ae4a-4290690b79d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Refreshing network info cache for port 9791d20b-bf49-4625-a399-3764afd01f2b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.798 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.799 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.799 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.800 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.801 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.809 186548 INFO nova.compute.manager [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Terminating instance
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.815 186548 DEBUG nova.compute.manager [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:27:51 compute-0 kernel: tap9791d20b-bf (unregistering): left promiscuous mode
Nov 22 08:27:51 compute-0 NetworkManager[55036]: <info>  [1763800071.8411] device (tap9791d20b-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.850 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:51 compute-0 ovn_controller[94843]: 2025-11-22T08:27:51Z|00760|binding|INFO|Releasing lport 9791d20b-bf49-4625-a399-3764afd01f2b from this chassis (sb_readonly=0)
Nov 22 08:27:51 compute-0 ovn_controller[94843]: 2025-11-22T08:27:51Z|00761|binding|INFO|Setting lport 9791d20b-bf49-4625-a399-3764afd01f2b down in Southbound
Nov 22 08:27:51 compute-0 ovn_controller[94843]: 2025-11-22T08:27:51Z|00762|binding|INFO|Removing iface tap9791d20b-bf ovn-installed in OVS
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.854 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:51 compute-0 nova_compute[186544]: 2025-11-22 08:27:51.869 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:51 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000a1.scope: Deactivated successfully.
Nov 22 08:27:51 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000a1.scope: Consumed 16.328s CPU time.
Nov 22 08:27:51 compute-0 systemd-machined[152872]: Machine qemu-86-instance-000000a1 terminated.
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.024 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:52.058 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:9d:45 10.100.0.11'], port_security=['fa:16:3e:cf:9d:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fd3f032-9ff9-42af-aa81-ca9677695572', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-803d80ea-824c-483f-b6fb-c444b8aec93a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '6', 'neutron:security_group_ids': '06bbe5b1-e2a7-4761-b674-e4ef7686889b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ac404e1-df81-45d0-a211-7eca3c22b09e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=9791d20b-bf49-4625-a399-3764afd01f2b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:27:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:52.059 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 9791d20b-bf49-4625-a399-3764afd01f2b in datapath 803d80ea-824c-483f-b6fb-c444b8aec93a unbound from our chassis
Nov 22 08:27:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:52.060 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 803d80ea-824c-483f-b6fb-c444b8aec93a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:27:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:52.061 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f00ce715-00c2-40c6-b476-2563423dc3f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:52.062 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a namespace which is not needed anymore
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.075 186548 INFO nova.virt.libvirt.driver [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Instance destroyed successfully.
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.075 186548 DEBUG nova.objects.instance [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 1fd3f032-9ff9-42af-aa81-ca9677695572 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.086 186548 DEBUG nova.virt.libvirt.vif [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-22T08:26:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-415050896',display_name='tempest-TestNetworkAdvancedServerOps-server-415050896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-415050896',id=161,image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE10HET3xVG9kYYEFJbNoeXgZQQGPJCraywaCGccNjoZ4Ok7GHB8Zbg7DUabYTyv6kCKJW3QTAclno8OGt86G2miChntJewa0KwQEOoXH+w0t853r2FKBAA8hWK6fNplag==',key_name='tempest-TestNetworkAdvancedServerOps-1677817121',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:27:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-ic6oxu06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='360f90ca-2ddb-4e60-a48e-364e3b48bd96',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:27:23Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1fd3f032-9ff9-42af-aa81-ca9677695572,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.086 186548 DEBUG nova.network.os_vif_util [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.087 186548 DEBUG nova.network.os_vif_util [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.087 186548 DEBUG os_vif [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.088 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9791d20b-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.090 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.093 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.095 186548 INFO os_vif [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:9d:45,bridge_name='br-int',has_traffic_filtering=True,id=9791d20b-bf49-4625-a399-3764afd01f2b,network=Network(803d80ea-824c-483f-b6fb-c444b8aec93a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9791d20b-bf')
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.096 186548 INFO nova.virt.libvirt.driver [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Deleting instance files /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572_del
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.097 186548 INFO nova.virt.libvirt.driver [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Deletion of /var/lib/nova/instances/1fd3f032-9ff9-42af-aa81-ca9677695572_del complete
Nov 22 08:27:52 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[245131]: [NOTICE]   (245144) : haproxy version is 2.8.14-c23fe91
Nov 22 08:27:52 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[245131]: [NOTICE]   (245144) : path to executable is /usr/sbin/haproxy
Nov 22 08:27:52 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[245131]: [WARNING]  (245144) : Exiting Master process...
Nov 22 08:27:52 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[245131]: [ALERT]    (245144) : Current worker (245146) exited with code 143 (Terminated)
Nov 22 08:27:52 compute-0 neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a[245131]: [WARNING]  (245144) : All workers exited. Exiting... (0)
Nov 22 08:27:52 compute-0 systemd[1]: libpod-bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2.scope: Deactivated successfully.
Nov 22 08:27:52 compute-0 podman[245320]: 2025-11-22 08:27:52.305079396 +0000 UTC m=+0.156169405 container died bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:27:52 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:52.623 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2-userdata-shm.mount: Deactivated successfully.
Nov 22 08:27:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-fb4beb14504536695aec55d5ad84cbfb741421a2d4675c2899ade043c06f5a36-merged.mount: Deactivated successfully.
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.705 186548 INFO nova.compute.manager [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Took 0.89 seconds to destroy the instance on the hypervisor.
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.707 186548 DEBUG oslo.service.loopingcall [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.707 186548 DEBUG nova.compute.manager [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.707 186548 DEBUG nova.network.neutron [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.885 186548 DEBUG nova.compute.manager [req-3026fca3-4ff8-4187-a8d7-3df3078ffb53 req-b7a0d9cd-e633-4bdf-abdf-3aa07c308d6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-unplugged-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.886 186548 DEBUG oslo_concurrency.lockutils [req-3026fca3-4ff8-4187-a8d7-3df3078ffb53 req-b7a0d9cd-e633-4bdf-abdf-3aa07c308d6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.886 186548 DEBUG oslo_concurrency.lockutils [req-3026fca3-4ff8-4187-a8d7-3df3078ffb53 req-b7a0d9cd-e633-4bdf-abdf-3aa07c308d6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.887 186548 DEBUG oslo_concurrency.lockutils [req-3026fca3-4ff8-4187-a8d7-3df3078ffb53 req-b7a0d9cd-e633-4bdf-abdf-3aa07c308d6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.887 186548 DEBUG nova.compute.manager [req-3026fca3-4ff8-4187-a8d7-3df3078ffb53 req-b7a0d9cd-e633-4bdf-abdf-3aa07c308d6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] No waiting events found dispatching network-vif-unplugged-9791d20b-bf49-4625-a399-3764afd01f2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:27:52 compute-0 nova_compute[186544]: 2025-11-22 08:27:52.887 186548 DEBUG nova.compute.manager [req-3026fca3-4ff8-4187-a8d7-3df3078ffb53 req-b7a0d9cd-e633-4bdf-abdf-3aa07c308d6a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-unplugged-9791d20b-bf49-4625-a399-3764afd01f2b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:27:53 compute-0 podman[245320]: 2025-11-22 08:27:53.277095709 +0000 UTC m=+1.128185748 container cleanup bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 08:27:53 compute-0 systemd[1]: libpod-conmon-bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2.scope: Deactivated successfully.
Nov 22 08:27:54 compute-0 podman[245351]: 2025-11-22 08:27:54.017570983 +0000 UTC m=+0.720128504 container remove bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.024 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b33fccb5-5794-4987-aea2-34672d66e000]: (4, ('Sat Nov 22 08:27:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a (bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2)\nbfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2\nSat Nov 22 08:27:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a (bfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2)\nbfe6492a72e4be443c62782ddb23d5ea8056dad7478c110b6be7db9b24ef41a2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.027 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f8966a3d-d0e9-4f51-8a3b-e904fc0ed705]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.029 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap803d80ea-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:27:54 compute-0 kernel: tap803d80ea-80: left promiscuous mode
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.033 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.050 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.056 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[da907da3-ef62-4bb3-9c8f-9fc9de6ce247]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.071 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb69379-8d64-4cb3-b58d-dba93dfeb8f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.073 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[95ef8d34-58b8-42b0-95d1-e7c4cfb923d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.090 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9ecb1f-e033-45b4-a15d-44cbf57f13cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678362, 'reachable_time': 39969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245379, 'error': None, 'target': 'ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.093 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-803d80ea-824c-483f-b6fb-c444b8aec93a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:27:54 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:27:54.093 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc03176-ab34-49b6-b28c-1dbfc4b1fc48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:27:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d803d80ea\x2d824c\x2d483f\x2db6fb\x2dc444b8aec93a.mount: Deactivated successfully.
Nov 22 08:27:54 compute-0 podman[245365]: 2025-11-22 08:27:54.128324688 +0000 UTC m=+0.060376857 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:27:54 compute-0 podman[245366]: 2025-11-22 08:27:54.131820455 +0000 UTC m=+0.062015258 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.711 186548 DEBUG nova.network.neutron [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.766 186548 INFO nova.compute.manager [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Took 2.06 seconds to deallocate network for instance.
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.793 186548 DEBUG nova.compute.manager [req-ad828710-2b24-449e-81ac-6434e202cf84 req-e494052d-6507-44f7-83cc-c21d2c8b628f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-deleted-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.815 186548 DEBUG nova.network.neutron [req-1b9d24c9-b0e0-4538-b8c5-42f6df114a47 req-7c6fb82f-19fc-4eb5-ae4a-4290690b79d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updated VIF entry in instance network info cache for port 9791d20b-bf49-4625-a399-3764afd01f2b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.815 186548 DEBUG nova.network.neutron [req-1b9d24c9-b0e0-4538-b8c5-42f6df114a47 req-7c6fb82f-19fc-4eb5-ae4a-4290690b79d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Updating instance_info_cache with network_info: [{"id": "9791d20b-bf49-4625-a399-3764afd01f2b", "address": "fa:16:3e:cf:9d:45", "network": {"id": "803d80ea-824c-483f-b6fb-c444b8aec93a", "bridge": "br-int", "label": "tempest-network-smoke--906185119", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9791d20b-bf", "ovs_interfaceid": "9791d20b-bf49-4625-a399-3764afd01f2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.890 186548 DEBUG oslo_concurrency.lockutils [req-1b9d24c9-b0e0-4538-b8c5-42f6df114a47 req-7c6fb82f-19fc-4eb5-ae4a-4290690b79d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1fd3f032-9ff9-42af-aa81-ca9677695572" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.916 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.916 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.987 186548 DEBUG nova.compute.provider_tree [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.993 186548 DEBUG nova.compute.manager [req-5cf96a45-d368-4acf-aa7d-fd2f5aa5da3c req-7bdc84fa-a6b4-4aa3-9a73-051be12fb1c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.993 186548 DEBUG oslo_concurrency.lockutils [req-5cf96a45-d368-4acf-aa7d-fd2f5aa5da3c req-7bdc84fa-a6b4-4aa3-9a73-051be12fb1c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.994 186548 DEBUG oslo_concurrency.lockutils [req-5cf96a45-d368-4acf-aa7d-fd2f5aa5da3c req-7bdc84fa-a6b4-4aa3-9a73-051be12fb1c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.994 186548 DEBUG oslo_concurrency.lockutils [req-5cf96a45-d368-4acf-aa7d-fd2f5aa5da3c req-7bdc84fa-a6b4-4aa3-9a73-051be12fb1c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.994 186548 DEBUG nova.compute.manager [req-5cf96a45-d368-4acf-aa7d-fd2f5aa5da3c req-7bdc84fa-a6b4-4aa3-9a73-051be12fb1c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] No waiting events found dispatching network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:27:54 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.994 186548 WARNING nova.compute.manager [req-5cf96a45-d368-4acf-aa7d-fd2f5aa5da3c req-7bdc84fa-a6b4-4aa3-9a73-051be12fb1c9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Received unexpected event network-vif-plugged-9791d20b-bf49-4625-a399-3764afd01f2b for instance with vm_state deleted and task_state None.
Nov 22 08:27:55 compute-0 nova_compute[186544]: 2025-11-22 08:27:54.999 186548 DEBUG nova.scheduler.client.report [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:27:55 compute-0 nova_compute[186544]: 2025-11-22 08:27:55.071 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:55 compute-0 nova_compute[186544]: 2025-11-22 08:27:55.232 186548 INFO nova.scheduler.client.report [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance 1fd3f032-9ff9-42af-aa81-ca9677695572
Nov 22 08:27:55 compute-0 nova_compute[186544]: 2025-11-22 08:27:55.400 186548 DEBUG oslo_concurrency.lockutils [None req-61f3062c-6fb1-43ef-881b-e3e19ef015fa d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1fd3f032-9ff9-42af-aa81-ca9677695572" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:27:56 compute-0 nova_compute[186544]: 2025-11-22 08:27:56.324 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:27:57 compute-0 nova_compute[186544]: 2025-11-22 08:27:57.091 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:01 compute-0 nova_compute[186544]: 2025-11-22 08:28:01.169 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:01 compute-0 nova_compute[186544]: 2025-11-22 08:28:01.241 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:01 compute-0 nova_compute[186544]: 2025-11-22 08:28:01.325 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:02 compute-0 nova_compute[186544]: 2025-11-22 08:28:02.094 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:06 compute-0 nova_compute[186544]: 2025-11-22 08:28:06.327 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:07 compute-0 nova_compute[186544]: 2025-11-22 08:28:07.074 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800072.07358, 1fd3f032-9ff9-42af-aa81-ca9677695572 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:28:07 compute-0 nova_compute[186544]: 2025-11-22 08:28:07.075 186548 INFO nova.compute.manager [-] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] VM Stopped (Lifecycle Event)
Nov 22 08:28:07 compute-0 nova_compute[186544]: 2025-11-22 08:28:07.092 186548 DEBUG nova.compute.manager [None req-684e33b8-61ca-49d7-9d8c-8cd68e23896c - - - - - -] [instance: 1fd3f032-9ff9-42af-aa81-ca9677695572] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:28:07 compute-0 nova_compute[186544]: 2025-11-22 08:28:07.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.188 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.188 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.417 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.418 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.13922119140625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.419 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.419 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.475 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.476 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.496 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.508 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.528 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:28:10 compute-0 nova_compute[186544]: 2025-11-22 08:28:10.529 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:28:11 compute-0 nova_compute[186544]: 2025-11-22 08:28:11.329 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:12 compute-0 nova_compute[186544]: 2025-11-22 08:28:12.098 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:12 compute-0 podman[245418]: 2025-11-22 08:28:12.428017991 +0000 UTC m=+0.060653404 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:28:12 compute-0 podman[245417]: 2025-11-22 08:28:12.447656944 +0000 UTC m=+0.086430459 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 08:28:12 compute-0 podman[245416]: 2025-11-22 08:28:12.454347519 +0000 UTC m=+0.097228905 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:28:12 compute-0 podman[245429]: 2025-11-22 08:28:12.460238853 +0000 UTC m=+0.086224893 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 08:28:12 compute-0 nova_compute[186544]: 2025-11-22 08:28:12.529 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:12 compute-0 nova_compute[186544]: 2025-11-22 08:28:12.530 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:28:13 compute-0 nova_compute[186544]: 2025-11-22 08:28:13.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:14 compute-0 nova_compute[186544]: 2025-11-22 08:28:14.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:14 compute-0 nova_compute[186544]: 2025-11-22 08:28:14.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:28:14 compute-0 nova_compute[186544]: 2025-11-22 08:28:14.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:28:14 compute-0 nova_compute[186544]: 2025-11-22 08:28:14.178 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:28:16 compute-0 nova_compute[186544]: 2025-11-22 08:28:16.331 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:17 compute-0 nova_compute[186544]: 2025-11-22 08:28:17.101 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:17 compute-0 nova_compute[186544]: 2025-11-22 08:28:17.172 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:19 compute-0 nova_compute[186544]: 2025-11-22 08:28:19.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:21 compute-0 nova_compute[186544]: 2025-11-22 08:28:21.332 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:22 compute-0 nova_compute[186544]: 2025-11-22 08:28:22.105 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:22 compute-0 podman[245501]: 2025-11-22 08:28:22.422172365 +0000 UTC m=+0.067872962 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:28:23 compute-0 nova_compute[186544]: 2025-11-22 08:28:23.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:23 compute-0 nova_compute[186544]: 2025-11-22 08:28:23.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:24 compute-0 nova_compute[186544]: 2025-11-22 08:28:24.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:24 compute-0 podman[245520]: 2025-11-22 08:28:24.414373985 +0000 UTC m=+0.061197848 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:28:24 compute-0 podman[245521]: 2025-11-22 08:28:24.427066347 +0000 UTC m=+0.066937469 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 22 08:28:26 compute-0 nova_compute[186544]: 2025-11-22 08:28:26.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:27 compute-0 nova_compute[186544]: 2025-11-22 08:28:27.108 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:31 compute-0 nova_compute[186544]: 2025-11-22 08:28:31.336 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:32 compute-0 nova_compute[186544]: 2025-11-22 08:28:32.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:34 compute-0 nova_compute[186544]: 2025-11-22 08:28:34.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:28:36 compute-0 nova_compute[186544]: 2025-11-22 08:28:36.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:28:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:28:37 compute-0 nova_compute[186544]: 2025-11-22 08:28:37.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:28:37.353 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:28:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:28:37.354 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:28:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:28:37.354 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:28:41 compute-0 nova_compute[186544]: 2025-11-22 08:28:41.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:42 compute-0 nova_compute[186544]: 2025-11-22 08:28:42.115 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:43 compute-0 podman[245561]: 2025-11-22 08:28:43.412188736 +0000 UTC m=+0.059172307 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 22 08:28:43 compute-0 podman[245562]: 2025-11-22 08:28:43.425223628 +0000 UTC m=+0.067575615 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:28:43 compute-0 podman[245563]: 2025-11-22 08:28:43.425504715 +0000 UTC m=+0.064076039 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:28:43 compute-0 podman[245569]: 2025-11-22 08:28:43.456292372 +0000 UTC m=+0.089615966 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Nov 22 08:28:46 compute-0 nova_compute[186544]: 2025-11-22 08:28:46.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:47 compute-0 nova_compute[186544]: 2025-11-22 08:28:47.118 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:51 compute-0 nova_compute[186544]: 2025-11-22 08:28:51.341 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:52 compute-0 nova_compute[186544]: 2025-11-22 08:28:52.120 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:53 compute-0 podman[245647]: 2025-11-22 08:28:53.400836305 +0000 UTC m=+0.054951243 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:28:54 compute-0 ovn_controller[94843]: 2025-11-22T08:28:54Z|00763|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 22 08:28:55 compute-0 podman[245668]: 2025-11-22 08:28:55.413104998 +0000 UTC m=+0.057214638 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:28:55 compute-0 podman[245669]: 2025-11-22 08:28:55.417151848 +0000 UTC m=+0.057908076 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, distribution-scope=public)
Nov 22 08:28:56 compute-0 nova_compute[186544]: 2025-11-22 08:28:56.342 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:28:57 compute-0 nova_compute[186544]: 2025-11-22 08:28:57.123 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:01 compute-0 nova_compute[186544]: 2025-11-22 08:29:01.344 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:02 compute-0 nova_compute[186544]: 2025-11-22 08:29:02.126 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:06 compute-0 nova_compute[186544]: 2025-11-22 08:29:06.346 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:07 compute-0 nova_compute[186544]: 2025-11-22 08:29:07.129 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.217 186548 DEBUG nova.compute.manager [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.440 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.440 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.594 186548 DEBUG nova.objects.instance [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_requests' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.607 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.608 186548 INFO nova.compute.claims [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.608 186548 DEBUG nova.objects.instance [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.626 186548 DEBUG nova.objects.instance [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.692 186548 INFO nova.compute.resource_tracker [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating resource usage from migration 379e335f-1bd3-4f90-85f8-6f71327f225f
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.693 186548 DEBUG nova.compute.resource_tracker [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Starting to track incoming migration 379e335f-1bd3-4f90-85f8-6f71327f225f with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.814 186548 DEBUG nova.compute.provider_tree [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.830 186548 DEBUG nova.scheduler.client.report [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.912 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:10 compute-0 nova_compute[186544]: 2025-11-22 08:29:10.913 186548 INFO nova.compute.manager [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Migrating
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.192 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.192 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.347 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.351 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.352 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5720MB free_disk=73.13048553466797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.352 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.352 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.390 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Migration for instance 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.406 186548 INFO nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating resource usage from migration 379e335f-1bd3-4f90-85f8-6f71327f225f
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.407 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Starting to track incoming migration 379e335f-1bd3-4f90-85f8-6f71327f225f with flavor 1c351edf-5b2d-477d-93d0-c380bdae83e7 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.460 186548 WARNING nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}.
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.460 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.461 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.498 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.519 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.520 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:29:11 compute-0 nova_compute[186544]: 2025-11-22 08:29:11.521 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:12 compute-0 nova_compute[186544]: 2025-11-22 08:29:12.131 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:13 compute-0 sshd-session[245712]: Accepted publickey for nova from 192.168.122.102 port 60870 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:29:13 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 22 08:29:13 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 22 08:29:13 compute-0 systemd-logind[821]: New session 63 of user nova.
Nov 22 08:29:13 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 22 08:29:13 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 22 08:29:13 compute-0 systemd[245716]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:29:13 compute-0 systemd[245716]: Queued start job for default target Main User Target.
Nov 22 08:29:13 compute-0 systemd[245716]: Created slice User Application Slice.
Nov 22 08:29:13 compute-0 systemd[245716]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 22 08:29:13 compute-0 systemd[245716]: Started Daily Cleanup of User's Temporary Directories.
Nov 22 08:29:13 compute-0 systemd[245716]: Reached target Paths.
Nov 22 08:29:13 compute-0 systemd[245716]: Reached target Timers.
Nov 22 08:29:13 compute-0 systemd[245716]: Starting D-Bus User Message Bus Socket...
Nov 22 08:29:13 compute-0 systemd[245716]: Starting Create User's Volatile Files and Directories...
Nov 22 08:29:13 compute-0 nova_compute[186544]: 2025-11-22 08:29:13.521 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:13 compute-0 nova_compute[186544]: 2025-11-22 08:29:13.522 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:29:13 compute-0 systemd[245716]: Finished Create User's Volatile Files and Directories.
Nov 22 08:29:13 compute-0 systemd[245716]: Listening on D-Bus User Message Bus Socket.
Nov 22 08:29:13 compute-0 systemd[245716]: Reached target Sockets.
Nov 22 08:29:13 compute-0 systemd[245716]: Reached target Basic System.
Nov 22 08:29:13 compute-0 systemd[245716]: Reached target Main User Target.
Nov 22 08:29:13 compute-0 systemd[245716]: Startup finished in 156ms.
Nov 22 08:29:13 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 22 08:29:13 compute-0 systemd[1]: Started Session 63 of User nova.
Nov 22 08:29:13 compute-0 sshd-session[245712]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:29:13 compute-0 podman[245734]: 2025-11-22 08:29:13.604623167 +0000 UTC m=+0.057826824 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:29:13 compute-0 sshd-session[245757]: Received disconnect from 192.168.122.102 port 60870:11: disconnected by user
Nov 22 08:29:13 compute-0 sshd-session[245757]: Disconnected from user nova 192.168.122.102 port 60870
Nov 22 08:29:13 compute-0 sshd-session[245712]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:29:13 compute-0 systemd[1]: session-63.scope: Deactivated successfully.
Nov 22 08:29:13 compute-0 systemd-logind[821]: Session 63 logged out. Waiting for processes to exit.
Nov 22 08:29:13 compute-0 systemd-logind[821]: Removed session 63.
Nov 22 08:29:13 compute-0 podman[245733]: 2025-11-22 08:29:13.627757087 +0000 UTC m=+0.083865795 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 08:29:13 compute-0 podman[245732]: 2025-11-22 08:29:13.636551203 +0000 UTC m=+0.092693262 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:29:13 compute-0 podman[245735]: 2025-11-22 08:29:13.6461833 +0000 UTC m=+0.096315571 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:29:13 compute-0 sshd-session[245817]: Accepted publickey for nova from 192.168.122.102 port 60876 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:29:13 compute-0 systemd-logind[821]: New session 65 of user nova.
Nov 22 08:29:13 compute-0 systemd[1]: Started Session 65 of User nova.
Nov 22 08:29:13 compute-0 sshd-session[245817]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:29:13 compute-0 sshd-session[245822]: Received disconnect from 192.168.122.102 port 60876:11: disconnected by user
Nov 22 08:29:13 compute-0 sshd-session[245822]: Disconnected from user nova 192.168.122.102 port 60876
Nov 22 08:29:13 compute-0 sshd-session[245817]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:29:13 compute-0 systemd[1]: session-65.scope: Deactivated successfully.
Nov 22 08:29:13 compute-0 systemd-logind[821]: Session 65 logged out. Waiting for processes to exit.
Nov 22 08:29:13 compute-0 systemd-logind[821]: Removed session 65.
Nov 22 08:29:14 compute-0 nova_compute[186544]: 2025-11-22 08:29:14.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:14 compute-0 nova_compute[186544]: 2025-11-22 08:29:14.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:29:14 compute-0 nova_compute[186544]: 2025-11-22 08:29:14.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:29:14 compute-0 nova_compute[186544]: 2025-11-22 08:29:14.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:29:14 compute-0 nova_compute[186544]: 2025-11-22 08:29:14.178 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:16 compute-0 nova_compute[186544]: 2025-11-22 08:29:16.349 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:17 compute-0 nova_compute[186544]: 2025-11-22 08:29:17.134 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:17 compute-0 nova_compute[186544]: 2025-11-22 08:29:17.937 186548 DEBUG nova.compute.manager [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:17 compute-0 nova_compute[186544]: 2025-11-22 08:29:17.937 186548 DEBUG oslo_concurrency.lockutils [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:17 compute-0 nova_compute[186544]: 2025-11-22 08:29:17.937 186548 DEBUG oslo_concurrency.lockutils [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:17 compute-0 nova_compute[186544]: 2025-11-22 08:29:17.938 186548 DEBUG oslo_concurrency.lockutils [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:17 compute-0 nova_compute[186544]: 2025-11-22 08:29:17.938 186548 DEBUG nova.compute.manager [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:17 compute-0 nova_compute[186544]: 2025-11-22 08:29:17.938 186548 WARNING nova.compute.manager [req-0c283e70-6514-46f8-b778-f1c44ee8b10f req-16103c8d-eb7e-4136-93a5-3468e12ed2ea 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.
Nov 22 08:29:18 compute-0 sshd-session[245824]: Accepted publickey for nova from 192.168.122.102 port 60884 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:29:18 compute-0 systemd-logind[821]: New session 66 of user nova.
Nov 22 08:29:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:18.135 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:29:18 compute-0 nova_compute[186544]: 2025-11-22 08:29:18.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:18.136 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:29:18 compute-0 systemd[1]: Started Session 66 of User nova.
Nov 22 08:29:18 compute-0 sshd-session[245824]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:29:18 compute-0 nova_compute[186544]: 2025-11-22 08:29:18.170 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:19 compute-0 sshd-session[245827]: Received disconnect from 192.168.122.102 port 60884:11: disconnected by user
Nov 22 08:29:19 compute-0 sshd-session[245827]: Disconnected from user nova 192.168.122.102 port 60884
Nov 22 08:29:19 compute-0 sshd-session[245824]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:29:19 compute-0 systemd[1]: session-66.scope: Deactivated successfully.
Nov 22 08:29:19 compute-0 systemd-logind[821]: Session 66 logged out. Waiting for processes to exit.
Nov 22 08:29:19 compute-0 systemd-logind[821]: Removed session 66.
Nov 22 08:29:19 compute-0 sshd-session[245829]: Accepted publickey for nova from 192.168.122.102 port 60900 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:29:19 compute-0 systemd-logind[821]: New session 67 of user nova.
Nov 22 08:29:19 compute-0 systemd[1]: Started Session 67 of User nova.
Nov 22 08:29:19 compute-0 sshd-session[245829]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:29:19 compute-0 sshd-session[245832]: Received disconnect from 192.168.122.102 port 60900:11: disconnected by user
Nov 22 08:29:19 compute-0 sshd-session[245832]: Disconnected from user nova 192.168.122.102 port 60900
Nov 22 08:29:19 compute-0 sshd-session[245829]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:29:19 compute-0 systemd[1]: session-67.scope: Deactivated successfully.
Nov 22 08:29:19 compute-0 systemd-logind[821]: Session 67 logged out. Waiting for processes to exit.
Nov 22 08:29:19 compute-0 systemd-logind[821]: Removed session 67.
Nov 22 08:29:19 compute-0 sshd-session[245834]: Accepted publickey for nova from 192.168.122.102 port 60906 ssh2: ECDSA SHA256:92zYFkcPWlh9+ZvK5fXQ5RiSCmUwy/g/KFplYnusFfI
Nov 22 08:29:19 compute-0 systemd-logind[821]: New session 68 of user nova.
Nov 22 08:29:19 compute-0 systemd[1]: Started Session 68 of User nova.
Nov 22 08:29:19 compute-0 sshd-session[245834]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.048 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.048 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.048 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.048 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.049 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.049 186548 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.049 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.049 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.049 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.049 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.050 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.050 186548 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.050 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.050 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.050 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.051 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.051 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.051 186548 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.051 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.051 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.051 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.052 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.052 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.052 186548 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.052 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.052 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.052 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.052 186548 DEBUG oslo_concurrency.lockutils [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.053 186548 DEBUG nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:20 compute-0 nova_compute[186544]: 2025-11-22 08:29:20.053 186548 WARNING nova.compute.manager [req-2e4e086b-1e23-4833-9319-cae0d5223952 req-7c65600d-65f9-46f1-8ab6-71a630908141 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_migrating.
Nov 22 08:29:20 compute-0 sshd-session[245837]: Received disconnect from 192.168.122.102 port 60906:11: disconnected by user
Nov 22 08:29:20 compute-0 sshd-session[245837]: Disconnected from user nova 192.168.122.102 port 60906
Nov 22 08:29:20 compute-0 sshd-session[245834]: pam_unix(sshd:session): session closed for user nova
Nov 22 08:29:20 compute-0 systemd[1]: session-68.scope: Deactivated successfully.
Nov 22 08:29:20 compute-0 systemd-logind[821]: Session 68 logged out. Waiting for processes to exit.
Nov 22 08:29:20 compute-0 systemd-logind[821]: Removed session 68.
Nov 22 08:29:21 compute-0 nova_compute[186544]: 2025-11-22 08:29:21.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:21 compute-0 nova_compute[186544]: 2025-11-22 08:29:21.351 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:21 compute-0 nova_compute[186544]: 2025-11-22 08:29:21.983 186548 INFO nova.network.neutron [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 22 08:29:22 compute-0 nova_compute[186544]: 2025-11-22 08:29:22.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:23.138 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:23 compute-0 nova_compute[186544]: 2025-11-22 08:29:23.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:24 compute-0 nova_compute[186544]: 2025-11-22 08:29:24.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:24 compute-0 podman[245839]: 2025-11-22 08:29:24.446183998 +0000 UTC m=+0.091802430 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 22 08:29:26 compute-0 nova_compute[186544]: 2025-11-22 08:29:26.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:26 compute-0 nova_compute[186544]: 2025-11-22 08:29:26.353 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:26 compute-0 podman[245859]: 2025-11-22 08:29:26.399666194 +0000 UTC m=+0.051296453 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:29:26 compute-0 podman[245860]: 2025-11-22 08:29:26.415140586 +0000 UTC m=+0.062629093 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm)
Nov 22 08:29:27 compute-0 nova_compute[186544]: 2025-11-22 08:29:27.138 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:27 compute-0 nova_compute[186544]: 2025-11-22 08:29:27.826 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:29:27 compute-0 nova_compute[186544]: 2025-11-22 08:29:27.826 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:29:27 compute-0 nova_compute[186544]: 2025-11-22 08:29:27.826 186548 DEBUG nova.network.neutron [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:29:27 compute-0 nova_compute[186544]: 2025-11-22 08:29:27.978 186548 DEBUG nova.compute.manager [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:27 compute-0 nova_compute[186544]: 2025-11-22 08:29:27.979 186548 DEBUG nova.compute.manager [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing instance network info cache due to event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:29:27 compute-0 nova_compute[186544]: 2025-11-22 08:29:27.979 186548 DEBUG oslo_concurrency.lockutils [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:29:30 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 22 08:29:30 compute-0 systemd[245716]: Activating special unit Exit the Session...
Nov 22 08:29:30 compute-0 systemd[245716]: Stopped target Main User Target.
Nov 22 08:29:30 compute-0 systemd[245716]: Stopped target Basic System.
Nov 22 08:29:30 compute-0 systemd[245716]: Stopped target Paths.
Nov 22 08:29:30 compute-0 systemd[245716]: Stopped target Sockets.
Nov 22 08:29:30 compute-0 systemd[245716]: Stopped target Timers.
Nov 22 08:29:30 compute-0 systemd[245716]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 22 08:29:30 compute-0 systemd[245716]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 22 08:29:30 compute-0 systemd[245716]: Closed D-Bus User Message Bus Socket.
Nov 22 08:29:30 compute-0 systemd[245716]: Stopped Create User's Volatile Files and Directories.
Nov 22 08:29:30 compute-0 systemd[245716]: Removed slice User Application Slice.
Nov 22 08:29:30 compute-0 systemd[245716]: Reached target Shutdown.
Nov 22 08:29:30 compute-0 systemd[245716]: Finished Exit the Session.
Nov 22 08:29:30 compute-0 systemd[245716]: Reached target Exit the Session.
Nov 22 08:29:30 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 22 08:29:30 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 22 08:29:30 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 22 08:29:30 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 22 08:29:30 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 22 08:29:30 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 22 08:29:30 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.354 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.798 186548 DEBUG nova.network.neutron [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.834 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.837 186548 DEBUG oslo_concurrency.lockutils [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.837 186548 DEBUG nova.network.neutron [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.952 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.954 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.954 186548 INFO nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Creating image(s)
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.955 186548 DEBUG nova.objects.instance [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:29:31 compute-0 nova_compute[186544]: 2025-11-22 08:29:31.967 186548 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.031 186548 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.033 186548 DEBUG nova.virt.disk.api [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.033 186548 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.090 186548 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.091 186548 DEBUG nova.virt.disk.api [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.108 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.108 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Ensure instance console log exists: /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.109 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.109 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.110 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.112 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Start _get_guest_xml network_info=[{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2141158938", "vif_mac": "fa:16:3e:09:92:29"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.117 186548 WARNING nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.124 186548 DEBUG nova.virt.libvirt.host [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.124 186548 DEBUG nova.virt.libvirt.host [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.132 186548 DEBUG nova.virt.libvirt.host [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.133 186548 DEBUG nova.virt.libvirt.host [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.135 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.135 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1c351edf-5b2d-477d-93d0-c380bdae83e7',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.135 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.136 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.136 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.136 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.136 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.137 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.137 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.137 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.137 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.138 186548 DEBUG nova.virt.hardware [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.138 186548 DEBUG nova.objects.instance [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.228 186548 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.283 186548 DEBUG oslo_concurrency.processutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.284 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.285 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.285 186548 DEBUG oslo_concurrency.lockutils [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.286 186548 DEBUG nova.virt.libvirt.vif [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-670921403',display_name='tempest-TestNetworkAdvancedServerOps-server-670921403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-670921403',id=162,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGYHYXp1Qqy6JOhGWOvuacfIk0P6wPolDsKlW4eLBP1reaf5YJ3b0p9NPF3wkmcarWaq/1pXj7o7/84igOB3Q0Y7op1kvlGqjlFnubXR8AIl2+F1RtClL7jm1Y/qEbrbsQ==',key_name='tempest-TestNetworkAdvancedServerOps-1145580001',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:28:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-804tq30l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:29:21Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2141158938", "vif_mac": "fa:16:3e:09:92:29"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.287 186548 DEBUG nova.network.os_vif_util [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2141158938", "vif_mac": "fa:16:3e:09:92:29"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.288 186548 DEBUG nova.network.os_vif_util [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.290 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <uuid>1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3</uuid>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <name>instance-000000a2</name>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <memory>196608</memory>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-670921403</nova:name>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:29:32</nova:creationTime>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <nova:flavor name="m1.micro">
Nov 22 08:29:32 compute-0 nova_compute[186544]:         <nova:memory>192</nova:memory>
Nov 22 08:29:32 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:29:32 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:29:32 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:29:32 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:29:32 compute-0 nova_compute[186544]:         <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 08:29:32 compute-0 nova_compute[186544]:         <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:29:32 compute-0 nova_compute[186544]:         <nova:port uuid="e6dd9383-6fd6-4da4-8c3b-126dd22ec505">
Nov 22 08:29:32 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <system>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <entry name="serial">1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3</entry>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <entry name="uuid">1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3</entry>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </system>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <os>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   </os>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <features>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   </features>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/disk.config"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:09:92:29"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <target dev="tape6dd9383-6f"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3/console.log" append="off"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <video>
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </video>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:29:32 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:29:32 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:29:32 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:29:32 compute-0 nova_compute[186544]: </domain>
Nov 22 08:29:32 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.292 186548 DEBUG nova.virt.libvirt.vif [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-670921403',display_name='tempest-TestNetworkAdvancedServerOps-server-670921403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-670921403',id=162,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGYHYXp1Qqy6JOhGWOvuacfIk0P6wPolDsKlW4eLBP1reaf5YJ3b0p9NPF3wkmcarWaq/1pXj7o7/84igOB3Q0Y7op1kvlGqjlFnubXR8AIl2+F1RtClL7jm1Y/qEbrbsQ==',key_name='tempest-TestNetworkAdvancedServerOps-1145580001',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:28:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-804tq30l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:29:21Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2141158938", "vif_mac": "fa:16:3e:09:92:29"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.292 186548 DEBUG nova.network.os_vif_util [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2141158938", "vif_mac": "fa:16:3e:09:92:29"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.292 186548 DEBUG nova.network.os_vif_util [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.293 186548 DEBUG os_vif [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.294 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.294 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.295 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.297 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.297 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6dd9383-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.297 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape6dd9383-6f, col_values=(('external_ids', {'iface-id': 'e6dd9383-6fd6-4da4-8c3b-126dd22ec505', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:92:29', 'vm-uuid': '1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.299 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 NetworkManager[55036]: <info>  [1763800172.2999] manager: (tape6dd9383-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.301 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.306 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.307 186548 INFO os_vif [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f')
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.609 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.609 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.609 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:09:92:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.610 186548 INFO nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Using config drive
Nov 22 08:29:32 compute-0 kernel: tape6dd9383-6f: entered promiscuous mode
Nov 22 08:29:32 compute-0 NetworkManager[55036]: <info>  [1763800172.6785] manager: (tape6dd9383-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/352)
Nov 22 08:29:32 compute-0 ovn_controller[94843]: 2025-11-22T08:29:32Z|00764|binding|INFO|Claiming lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for this chassis.
Nov 22 08:29:32 compute-0 ovn_controller[94843]: 2025-11-22T08:29:32Z|00765|binding|INFO|e6dd9383-6fd6-4da4-8c3b-126dd22ec505: Claiming fa:16:3e:09:92:29 10.100.0.7
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.677 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.687 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.691 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 NetworkManager[55036]: <info>  [1763800172.6920] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Nov 22 08:29:32 compute-0 NetworkManager[55036]: <info>  [1763800172.6927] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Nov 22 08:29:32 compute-0 systemd-udevd[245930]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:29:32 compute-0 NetworkManager[55036]: <info>  [1763800172.7167] device (tape6dd9383-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:29:32 compute-0 NetworkManager[55036]: <info>  [1763800172.7174] device (tape6dd9383-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:29:32 compute-0 systemd-machined[152872]: New machine qemu-87-instance-000000a2.
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.730 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:92:29 10.100.0.7'], port_security=['fa:16:3e:09:92:29 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '8', 'neutron:security_group_ids': '24e159b0-64cc-460c-86db-d34db1ef5f3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bbea3c6-adfe-4d7f-816a-93045d66e49e, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e6dd9383-6fd6-4da4-8c3b-126dd22ec505) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.732 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 in datapath 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef bound to our chassis
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.733 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.742 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[14fe4947-1db5-46fc-a52f-07afd846745f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.743 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9ec8d93f-61 in ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.745 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9ec8d93f-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.745 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f85b81-7f5a-4b1f-a06f-b23b614b9e9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.746 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6815b8a8-8c13-4082-b4c7-241b51acfe46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.756 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.758 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[30c27f48-a107-402a-9aa3-f3a9b8e304fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 systemd[1]: Started Virtual Machine qemu-87-instance-000000a2.
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.765 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 ovn_controller[94843]: 2025-11-22T08:29:32Z|00766|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 ovn-installed in OVS
Nov 22 08:29:32 compute-0 ovn_controller[94843]: 2025-11-22T08:29:32Z|00767|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 up in Southbound
Nov 22 08:29:32 compute-0 nova_compute[186544]: 2025-11-22 08:29:32.774 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.775 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b9bff3-de20-4463-bd8c-566b93bf0fd6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.804 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[6254794c-a6dd-480d-a3a8-75e1048a36c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 systemd-udevd[245933]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.810 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb110e4-18b8-420d-abb5-3aad3e33d8ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 NetworkManager[55036]: <info>  [1763800172.8115] manager: (tap9ec8d93f-60): new Veth device (/org/freedesktop/NetworkManager/Devices/355)
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.843 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[fb66a9e8-06c7-4b46-b4b2-03b537a0fbfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.846 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ed7c08-1a79-4cab-9a04-6c4628a13d57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 NetworkManager[55036]: <info>  [1763800172.8681] device (tap9ec8d93f-60): carrier: link connected
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.872 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[caa10601-821d-4299-9e4b-fb073c691a83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.890 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[adfef2ae-cca1-49a5-8292-b62cfb39afdd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ec8d93f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:a4:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691350, 'reachable_time': 16967, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245964, 'error': None, 'target': 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.906 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dad3a88b-8ad4-4569-8d43-a31560dd9098]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:a47c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 691350, 'tstamp': 691350}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245965, 'error': None, 'target': 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.924 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7f004b64-5f58-4300-bedf-3dc6bd6cc86d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9ec8d93f-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:a4:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691350, 'reachable_time': 16967, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245966, 'error': None, 'target': 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:32.954 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[17a4df6a-a945-47f5-8ace-8fea7d16da3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.013 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d399933b-f417-4771-8812-b6e7dfaf31a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.014 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ec8d93f-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.014 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.015 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ec8d93f-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:33 compute-0 NetworkManager[55036]: <info>  [1763800173.0173] manager: (tap9ec8d93f-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.016 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:33 compute-0 kernel: tap9ec8d93f-60: entered promiscuous mode
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.019 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.019 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9ec8d93f-60, col_values=(('external_ids', {'iface-id': '2f82bda6-e5f9-45ef-96c2-3856b66571d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.020 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:33 compute-0 ovn_controller[94843]: 2025-11-22T08:29:33Z|00768|binding|INFO|Releasing lport 2f82bda6-e5f9-45ef-96c2-3856b66571d8 from this chassis (sb_readonly=0)
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.033 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.034 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9ec8d93f-618c-42ae-9ef7-97cfef6c22ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9ec8d93f-618c-42ae-9ef7-97cfef6c22ef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.035 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[831511a6-ee9e-4b06-afe0-3eddd65d4209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.036 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/9ec8d93f-618c-42ae-9ef7-97cfef6c22ef.pid.haproxy
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:29:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:33.037 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'env', 'PROCESS_TAG=haproxy-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9ec8d93f-618c-42ae-9ef7-97cfef6c22ef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.383 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800173.383179, 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.385 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] VM Resumed (Lifecycle Event)
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.389 186548 DEBUG nova.compute.manager [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.395 186548 INFO nova.virt.libvirt.driver [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance running successfully.
Nov 22 08:29:33 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.397 186548 DEBUG nova.virt.libvirt.guest [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.398 186548 DEBUG nova.virt.libvirt.driver [None req-4ac97c84-9bc3-4f21-b966-7594038f6283 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.407 186548 DEBUG nova.compute.manager [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.407 186548 DEBUG oslo_concurrency.lockutils [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.407 186548 DEBUG oslo_concurrency.lockutils [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.408 186548 DEBUG oslo_concurrency.lockutils [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.408 186548 DEBUG nova.compute.manager [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.408 186548 WARNING nova.compute.manager [req-21b6e10d-ad8f-4e89-a8cf-9539953022c1 req-59e54c65-13c0-4a6d-bcbc-05475932d6c4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state active and task_state resize_finish.
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.431 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.434 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:29:33 compute-0 podman[246002]: 2025-11-22 08:29:33.373568577 +0000 UTC m=+0.023248992 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.475 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.476 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800173.3833175, 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.476 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] VM Started (Lifecycle Event)
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.498 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.501 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:29:33 compute-0 nova_compute[186544]: 2025-11-22 08:29:33.545 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 22 08:29:34 compute-0 podman[246002]: 2025-11-22 08:29:34.129713337 +0000 UTC m=+0.779393772 container create 5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 08:29:34 compute-0 systemd[1]: Started libpod-conmon-5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72.scope.
Nov 22 08:29:34 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:29:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea66879d0feb250fa0732d9e78d554bbe6f54c6acc70f1db2e9c697b88755496/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:29:34 compute-0 nova_compute[186544]: 2025-11-22 08:29:34.537 186548 DEBUG nova.network.neutron [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updated VIF entry in instance network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:29:34 compute-0 nova_compute[186544]: 2025-11-22 08:29:34.538 186548 DEBUG nova.network.neutron [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:29:34 compute-0 nova_compute[186544]: 2025-11-22 08:29:34.574 186548 DEBUG oslo_concurrency.lockutils [req-7018a9c1-7d77-434f-b4d4-5b87a2d6b8ab req-ada529bf-3f00-4b1d-8678-01ba16763546 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:29:34 compute-0 podman[246002]: 2025-11-22 08:29:34.9298715 +0000 UTC m=+1.579551905 container init 5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 08:29:34 compute-0 podman[246002]: 2025-11-22 08:29:34.940243065 +0000 UTC m=+1.589923470 container start 5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 22 08:29:34 compute-0 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[246019]: [NOTICE]   (246023) : New worker (246025) forked
Nov 22 08:29:34 compute-0 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[246019]: [NOTICE]   (246023) : Loading success.
Nov 22 08:29:35 compute-0 nova_compute[186544]: 2025-11-22 08:29:35.732 186548 DEBUG nova.compute.manager [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:35 compute-0 nova_compute[186544]: 2025-11-22 08:29:35.733 186548 DEBUG oslo_concurrency.lockutils [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:35 compute-0 nova_compute[186544]: 2025-11-22 08:29:35.733 186548 DEBUG oslo_concurrency.lockutils [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:35 compute-0 nova_compute[186544]: 2025-11-22 08:29:35.733 186548 DEBUG oslo_concurrency.lockutils [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:35 compute-0 nova_compute[186544]: 2025-11-22 08:29:35.733 186548 DEBUG nova.compute.manager [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:35 compute-0 nova_compute[186544]: 2025-11-22 08:29:35.733 186548 WARNING nova.compute.manager [req-e23b4135-e34a-44c6-9c85-971f043b3553 req-4410488b-0735-4253-a474-1016c152d77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state resized and task_state None.
Nov 22 08:29:36 compute-0 nova_compute[186544]: 2025-11-22 08:29:36.358 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:37 compute-0 nova_compute[186544]: 2025-11-22 08:29:37.303 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:37.356 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:37.357 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:37.358 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:41 compute-0 nova_compute[186544]: 2025-11-22 08:29:41.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:41 compute-0 nova_compute[186544]: 2025-11-22 08:29:41.358 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:42 compute-0 nova_compute[186544]: 2025-11-22 08:29:42.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:44 compute-0 podman[246035]: 2025-11-22 08:29:44.460286941 +0000 UTC m=+0.089614696 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 22 08:29:44 compute-0 podman[246036]: 2025-11-22 08:29:44.465784807 +0000 UTC m=+0.092861017 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:29:44 compute-0 podman[246034]: 2025-11-22 08:29:44.490791101 +0000 UTC m=+0.127458477 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:29:44 compute-0 podman[246038]: 2025-11-22 08:29:44.500251674 +0000 UTC m=+0.120602569 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 22 08:29:46 compute-0 nova_compute[186544]: 2025-11-22 08:29:46.191 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:46 compute-0 nova_compute[186544]: 2025-11-22 08:29:46.191 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:29:46 compute-0 nova_compute[186544]: 2025-11-22 08:29:46.208 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:29:46 compute-0 nova_compute[186544]: 2025-11-22 08:29:46.359 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:47 compute-0 nova_compute[186544]: 2025-11-22 08:29:47.312 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:49 compute-0 ovn_controller[94843]: 2025-11-22T08:29:49Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:92:29 10.100.0.7
Nov 22 08:29:49 compute-0 sshd-session[246134]: Connection closed by 120.157.53.102 port 60886
Nov 22 08:29:51 compute-0 nova_compute[186544]: 2025-11-22 08:29:51.361 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:52 compute-0 nova_compute[186544]: 2025-11-22 08:29:52.314 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:53 compute-0 nova_compute[186544]: 2025-11-22 08:29:53.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:29:53 compute-0 nova_compute[186544]: 2025-11-22 08:29:53.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:29:55 compute-0 podman[246137]: 2025-11-22 08:29:55.414716808 +0000 UTC m=+0.064198510 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:29:56 compute-0 nova_compute[186544]: 2025-11-22 08:29:56.362 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:57 compute-0 nova_compute[186544]: 2025-11-22 08:29:57.317 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:57 compute-0 nova_compute[186544]: 2025-11-22 08:29:57.359 186548 INFO nova.compute.manager [None req-fa5793fd-c358-41d2-97f7-c27eb9557328 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Get console output
Nov 22 08:29:57 compute-0 nova_compute[186544]: 2025-11-22 08:29:57.368 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:29:57 compute-0 podman[246159]: 2025-11-22 08:29:57.400475639 +0000 UTC m=+0.060612393 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:29:57 compute-0 podman[246160]: 2025-11-22 08:29:57.416188866 +0000 UTC m=+0.071938811 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, distribution-scope=public)
Nov 22 08:29:57 compute-0 sshd-session[246135]: Invalid user a from 120.157.53.102 port 60888
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.152 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.154 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.503 186548 DEBUG nova.compute.manager [req-c696fec0-d75f-4ed7-8405-96092809e245 req-ff2c3725-5b91-46b3-b6cd-7cd3df3b4ca3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.503 186548 DEBUG nova.compute.manager [req-c696fec0-d75f-4ed7-8405-96092809e245 req-ff2c3725-5b91-46b3-b6cd-7cd3df3b4ca3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing instance network info cache due to event network-changed-e6dd9383-6fd6-4da4-8c3b-126dd22ec505. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.503 186548 DEBUG oslo_concurrency.lockutils [req-c696fec0-d75f-4ed7-8405-96092809e245 req-ff2c3725-5b91-46b3-b6cd-7cd3df3b4ca3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.504 186548 DEBUG oslo_concurrency.lockutils [req-c696fec0-d75f-4ed7-8405-96092809e245 req-ff2c3725-5b91-46b3-b6cd-7cd3df3b4ca3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.504 186548 DEBUG nova.network.neutron [req-c696fec0-d75f-4ed7-8405-96092809e245 req-ff2c3725-5b91-46b3-b6cd-7cd3df3b4ca3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Refreshing network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.548 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.549 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.549 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.549 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.550 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.561 186548 INFO nova.compute.manager [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Terminating instance
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.568 186548 DEBUG nova.compute.manager [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:29:58 compute-0 kernel: tape6dd9383-6f (unregistering): left promiscuous mode
Nov 22 08:29:58 compute-0 NetworkManager[55036]: <info>  [1763800198.6075] device (tape6dd9383-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:29:58 compute-0 ovn_controller[94843]: 2025-11-22T08:29:58Z|00769|binding|INFO|Releasing lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 from this chassis (sb_readonly=0)
Nov 22 08:29:58 compute-0 ovn_controller[94843]: 2025-11-22T08:29:58Z|00770|binding|INFO|Setting lport e6dd9383-6fd6-4da4-8c3b-126dd22ec505 down in Southbound
Nov 22 08:29:58 compute-0 ovn_controller[94843]: 2025-11-22T08:29:58Z|00771|binding|INFO|Removing iface tape6dd9383-6f ovn-installed in OVS
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.617 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.629 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:92:29 10.100.0.7'], port_security=['fa:16:3e:09:92:29 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '10', 'neutron:security_group_ids': '24e159b0-64cc-460c-86db-d34db1ef5f3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bbea3c6-adfe-4d7f-816a-93045d66e49e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=e6dd9383-6fd6-4da4-8c3b-126dd22ec505) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.631 103805 INFO neutron.agent.ovn.metadata.agent [-] Port e6dd9383-6fd6-4da4-8c3b-126dd22ec505 in datapath 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef unbound from our chassis
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.632 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.633 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[359d602f-129e-4024-855a-ece2e710e841]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.634 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef namespace which is not needed anymore
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.640 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:58 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Nov 22 08:29:58 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000a2.scope: Consumed 16.682s CPU time.
Nov 22 08:29:58 compute-0 systemd-machined[152872]: Machine qemu-87-instance-000000a2 terminated.
Nov 22 08:29:58 compute-0 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[246019]: [NOTICE]   (246023) : haproxy version is 2.8.14-c23fe91
Nov 22 08:29:58 compute-0 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[246019]: [NOTICE]   (246023) : path to executable is /usr/sbin/haproxy
Nov 22 08:29:58 compute-0 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[246019]: [WARNING]  (246023) : Exiting Master process...
Nov 22 08:29:58 compute-0 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[246019]: [ALERT]    (246023) : Current worker (246025) exited with code 143 (Terminated)
Nov 22 08:29:58 compute-0 neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef[246019]: [WARNING]  (246023) : All workers exited. Exiting... (0)
Nov 22 08:29:58 compute-0 systemd[1]: libpod-5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72.scope: Deactivated successfully.
Nov 22 08:29:58 compute-0 podman[246228]: 2025-11-22 08:29:58.789019362 +0000 UTC m=+0.053689172 container died 5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:29:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-ea66879d0feb250fa0732d9e78d554bbe6f54c6acc70f1db2e9c697b88755496-merged.mount: Deactivated successfully.
Nov 22 08:29:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72-userdata-shm.mount: Deactivated successfully.
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.852 186548 INFO nova.virt.libvirt.driver [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Instance destroyed successfully.
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.853 186548 DEBUG nova.objects.instance [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:29:58 compute-0 podman[246228]: 2025-11-22 08:29:58.859491537 +0000 UTC m=+0.124161347 container cleanup 5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 08:29:58 compute-0 systemd[1]: libpod-conmon-5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72.scope: Deactivated successfully.
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.873 186548 DEBUG nova.virt.libvirt.vif [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:28:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-670921403',display_name='tempest-TestNetworkAdvancedServerOps-server-670921403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-670921403',id=162,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGYHYXp1Qqy6JOhGWOvuacfIk0P6wPolDsKlW4eLBP1reaf5YJ3b0p9NPF3wkmcarWaq/1pXj7o7/84igOB3Q0Y7op1kvlGqjlFnubXR8AIl2+F1RtClL7jm1Y/qEbrbsQ==',key_name='tempest-TestNetworkAdvancedServerOps-1145580001',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:29:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-804tq30l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:29:44Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.874 186548 DEBUG nova.network.os_vif_util [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.874 186548 DEBUG nova.network.os_vif_util [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.875 186548 DEBUG os_vif [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.877 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.877 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6dd9383-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.879 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.881 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.886 186548 INFO os_vif [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:92:29,bridge_name='br-int',has_traffic_filtering=True,id=e6dd9383-6fd6-4da4-8c3b-126dd22ec505,network=Network(9ec8d93f-618c-42ae-9ef7-97cfef6c22ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dd9383-6f')
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.887 186548 INFO nova.virt.libvirt.driver [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Deleting instance files /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_del
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.895 186548 INFO nova.virt.libvirt.driver [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Deletion of /var/lib/nova/instances/1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3_del complete
Nov 22 08:29:58 compute-0 podman[246273]: 2025-11-22 08:29:58.948047196 +0000 UTC m=+0.057181779 container remove 5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.960 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a2b19b-f148-4bcd-9d64-925c3fa238d1]: (4, ('Sat Nov 22 08:29:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef (5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72)\n5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72\nSat Nov 22 08:29:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef (5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72)\n5cd79a21195eeb489fb205e37d4721977e3a09d85bd999fd53da35c48fbe0b72\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.962 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[66815e79-17ed-4591-9e30-8a783fdb489b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.963 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ec8d93f-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.965 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:58 compute-0 kernel: tap9ec8d93f-60: left promiscuous mode
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.979 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:29:58 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:58.983 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1f7325ec-5967-431d-98e1-a5967cf446ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.985 186548 INFO nova.compute.manager [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.986 186548 DEBUG oslo.service.loopingcall [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.986 186548 DEBUG nova.compute.manager [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:29:58 compute-0 nova_compute[186544]: 2025-11-22 08:29:58.987 186548 DEBUG nova.network.neutron [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:29:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:59.002 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfb53f3-d068-445c-8670-4e1cf04aae0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:59.003 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[397d5541-9a48-4246-8455-f015dbee9f74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:59.024 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a98bbc7e-4f41-4dd5-b4c4-3cc0d88d8ed3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 691343, 'reachable_time': 33685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246288, 'error': None, 'target': 'ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:59 compute-0 sshd-session[246135]: Connection closed by invalid user a 120.157.53.102 port 60888 [preauth]
Nov 22 08:29:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:59.028 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9ec8d93f-618c-42ae-9ef7-97cfef6c22ef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:29:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:29:59.028 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[0b114450-b550-4b9e-b0e6-aec49473b769]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:29:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d9ec8d93f\x2d618c\x2d42ae\x2d9ef7\x2d97cfef6c22ef.mount: Deactivated successfully.
Nov 22 08:29:59 compute-0 nova_compute[186544]: 2025-11-22 08:29:59.058 186548 DEBUG nova.compute.manager [req-4e67c9ad-72bb-4ea8-b455-87027eff5191 req-88f76643-cb14-46f6-bfdd-191ee860f4a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:29:59 compute-0 nova_compute[186544]: 2025-11-22 08:29:59.059 186548 DEBUG oslo_concurrency.lockutils [req-4e67c9ad-72bb-4ea8-b455-87027eff5191 req-88f76643-cb14-46f6-bfdd-191ee860f4a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:29:59 compute-0 nova_compute[186544]: 2025-11-22 08:29:59.059 186548 DEBUG oslo_concurrency.lockutils [req-4e67c9ad-72bb-4ea8-b455-87027eff5191 req-88f76643-cb14-46f6-bfdd-191ee860f4a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:29:59 compute-0 nova_compute[186544]: 2025-11-22 08:29:59.059 186548 DEBUG oslo_concurrency.lockutils [req-4e67c9ad-72bb-4ea8-b455-87027eff5191 req-88f76643-cb14-46f6-bfdd-191ee860f4a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:29:59 compute-0 nova_compute[186544]: 2025-11-22 08:29:59.059 186548 DEBUG nova.compute.manager [req-4e67c9ad-72bb-4ea8-b455-87027eff5191 req-88f76643-cb14-46f6-bfdd-191ee860f4a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:29:59 compute-0 nova_compute[186544]: 2025-11-22 08:29:59.060 186548 DEBUG nova.compute.manager [req-4e67c9ad-72bb-4ea8-b455-87027eff5191 req-88f76643-cb14-46f6-bfdd-191ee860f4a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-unplugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.106 186548 DEBUG nova.network.neutron [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.148 186548 INFO nova.compute.manager [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Took 1.16 seconds to deallocate network for instance.
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.328 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.329 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.340 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.430 186548 INFO nova.scheduler.client.report [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.510 186548 DEBUG oslo_concurrency.lockutils [None req-45f375d7-060f-43e3-bad4-6f020809d0dc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.961s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.581 186548 DEBUG nova.network.neutron [req-c696fec0-d75f-4ed7-8405-96092809e245 req-ff2c3725-5b91-46b3-b6cd-7cd3df3b4ca3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updated VIF entry in instance network info cache for port e6dd9383-6fd6-4da4-8c3b-126dd22ec505. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.582 186548 DEBUG nova.network.neutron [req-c696fec0-d75f-4ed7-8405-96092809e245 req-ff2c3725-5b91-46b3-b6cd-7cd3df3b4ca3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Updating instance_info_cache with network_info: [{"id": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "address": "fa:16:3e:09:92:29", "network": {"id": "9ec8d93f-618c-42ae-9ef7-97cfef6c22ef", "bridge": "br-int", "label": "tempest-network-smoke--2141158938", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dd9383-6f", "ovs_interfaceid": "e6dd9383-6fd6-4da4-8c3b-126dd22ec505", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:30:00 compute-0 nova_compute[186544]: 2025-11-22 08:30:00.606 186548 DEBUG oslo_concurrency.lockutils [req-c696fec0-d75f-4ed7-8405-96092809e245 req-ff2c3725-5b91-46b3-b6cd-7cd3df3b4ca3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:30:01 compute-0 nova_compute[186544]: 2025-11-22 08:30:01.266 186548 DEBUG nova.compute.manager [req-95427ed5-d3af-4c3c-be29-00c04096d55b req-ce803578-8206-4f4f-bf6b-ba8d251a5bbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:30:01 compute-0 nova_compute[186544]: 2025-11-22 08:30:01.266 186548 DEBUG oslo_concurrency.lockutils [req-95427ed5-d3af-4c3c-be29-00c04096d55b req-ce803578-8206-4f4f-bf6b-ba8d251a5bbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:30:01 compute-0 nova_compute[186544]: 2025-11-22 08:30:01.266 186548 DEBUG oslo_concurrency.lockutils [req-95427ed5-d3af-4c3c-be29-00c04096d55b req-ce803578-8206-4f4f-bf6b-ba8d251a5bbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:30:01 compute-0 nova_compute[186544]: 2025-11-22 08:30:01.267 186548 DEBUG oslo_concurrency.lockutils [req-95427ed5-d3af-4c3c-be29-00c04096d55b req-ce803578-8206-4f4f-bf6b-ba8d251a5bbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:30:01 compute-0 nova_compute[186544]: 2025-11-22 08:30:01.267 186548 DEBUG nova.compute.manager [req-95427ed5-d3af-4c3c-be29-00c04096d55b req-ce803578-8206-4f4f-bf6b-ba8d251a5bbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] No waiting events found dispatching network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:30:01 compute-0 nova_compute[186544]: 2025-11-22 08:30:01.267 186548 WARNING nova.compute.manager [req-95427ed5-d3af-4c3c-be29-00c04096d55b req-ce803578-8206-4f4f-bf6b-ba8d251a5bbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received unexpected event network-vif-plugged-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 for instance with vm_state deleted and task_state None.
Nov 22 08:30:01 compute-0 nova_compute[186544]: 2025-11-22 08:30:01.267 186548 DEBUG nova.compute.manager [req-95427ed5-d3af-4c3c-be29-00c04096d55b req-ce803578-8206-4f4f-bf6b-ba8d251a5bbd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Received event network-vif-deleted-e6dd9383-6fd6-4da4-8c3b-126dd22ec505 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:30:01 compute-0 nova_compute[186544]: 2025-11-22 08:30:01.365 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:02 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:30:02.156 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:30:03 compute-0 nova_compute[186544]: 2025-11-22 08:30:03.880 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:06 compute-0 nova_compute[186544]: 2025-11-22 08:30:06.367 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:07 compute-0 nova_compute[186544]: 2025-11-22 08:30:07.969 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:08 compute-0 nova_compute[186544]: 2025-11-22 08:30:08.038 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:08 compute-0 nova_compute[186544]: 2025-11-22 08:30:08.888 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:11 compute-0 nova_compute[186544]: 2025-11-22 08:30:11.373 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.179 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.214 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.215 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.215 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.215 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.398 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.399 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.12996673583984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.399 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.399 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.562 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.562 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.629 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.640 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.667 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:30:12 compute-0 nova_compute[186544]: 2025-11-22 08:30:12.667 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:30:13 compute-0 nova_compute[186544]: 2025-11-22 08:30:13.651 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:13 compute-0 nova_compute[186544]: 2025-11-22 08:30:13.652 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:30:13 compute-0 nova_compute[186544]: 2025-11-22 08:30:13.850 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800198.8493714, 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:30:13 compute-0 nova_compute[186544]: 2025-11-22 08:30:13.851 186548 INFO nova.compute.manager [-] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] VM Stopped (Lifecycle Event)
Nov 22 08:30:13 compute-0 nova_compute[186544]: 2025-11-22 08:30:13.875 186548 DEBUG nova.compute.manager [None req-d2523281-3335-4f9e-afd2-e6ae368a2661 - - - - - -] [instance: 1f91d1f1-adf9-4f64-941f-9f35d3b7d7f3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:30:13 compute-0 nova_compute[186544]: 2025-11-22 08:30:13.890 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:15 compute-0 nova_compute[186544]: 2025-11-22 08:30:15.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:15 compute-0 nova_compute[186544]: 2025-11-22 08:30:15.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:30:15 compute-0 nova_compute[186544]: 2025-11-22 08:30:15.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:30:15 compute-0 nova_compute[186544]: 2025-11-22 08:30:15.183 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:30:15 compute-0 podman[246293]: 2025-11-22 08:30:15.426354681 +0000 UTC m=+0.067653146 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:30:15 compute-0 podman[246294]: 2025-11-22 08:30:15.426475344 +0000 UTC m=+0.064324164 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:30:15 compute-0 podman[246292]: 2025-11-22 08:30:15.427028058 +0000 UTC m=+0.067945174 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 08:30:15 compute-0 podman[246295]: 2025-11-22 08:30:15.478473803 +0000 UTC m=+0.111954596 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:30:16 compute-0 nova_compute[186544]: 2025-11-22 08:30:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:16 compute-0 nova_compute[186544]: 2025-11-22 08:30:16.376 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:18 compute-0 nova_compute[186544]: 2025-11-22 08:30:18.893 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:19 compute-0 nova_compute[186544]: 2025-11-22 08:30:19.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:21 compute-0 nova_compute[186544]: 2025-11-22 08:30:21.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:21 compute-0 nova_compute[186544]: 2025-11-22 08:30:21.379 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:23 compute-0 nova_compute[186544]: 2025-11-22 08:30:23.895 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:24 compute-0 nova_compute[186544]: 2025-11-22 08:30:24.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:24 compute-0 nova_compute[186544]: 2025-11-22 08:30:24.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:26 compute-0 nova_compute[186544]: 2025-11-22 08:30:26.379 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:26 compute-0 podman[246376]: 2025-11-22 08:30:26.430665286 +0000 UTC m=+0.079002025 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 08:30:28 compute-0 nova_compute[186544]: 2025-11-22 08:30:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:28 compute-0 podman[246396]: 2025-11-22 08:30:28.394152029 +0000 UTC m=+0.045639164 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:30:28 compute-0 podman[246397]: 2025-11-22 08:30:28.402693849 +0000 UTC m=+0.049837417 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 22 08:30:28 compute-0 nova_compute[186544]: 2025-11-22 08:30:28.896 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:31 compute-0 nova_compute[186544]: 2025-11-22 08:30:31.381 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:33 compute-0 nova_compute[186544]: 2025-11-22 08:30:33.899 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:34 compute-0 nova_compute[186544]: 2025-11-22 08:30:34.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:30:36 compute-0 nova_compute[186544]: 2025-11-22 08:30:36.383 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:30:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:30:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:30:37.358 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:30:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:30:37.359 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:30:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:30:37.359 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:30:38 compute-0 nova_compute[186544]: 2025-11-22 08:30:38.902 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:41 compute-0 nova_compute[186544]: 2025-11-22 08:30:41.386 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:43 compute-0 nova_compute[186544]: 2025-11-22 08:30:43.906 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:46 compute-0 nova_compute[186544]: 2025-11-22 08:30:46.387 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:46 compute-0 podman[246436]: 2025-11-22 08:30:46.404873887 +0000 UTC m=+0.052635896 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251118, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 08:30:46 compute-0 podman[246437]: 2025-11-22 08:30:46.405869712 +0000 UTC m=+0.050506874 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:30:46 compute-0 podman[246438]: 2025-11-22 08:30:46.429428282 +0000 UTC m=+0.064759205 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:30:46 compute-0 podman[246441]: 2025-11-22 08:30:46.462412173 +0000 UTC m=+0.094582148 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:30:48 compute-0 nova_compute[186544]: 2025-11-22 08:30:48.908 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:51 compute-0 nova_compute[186544]: 2025-11-22 08:30:51.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:53 compute-0 nova_compute[186544]: 2025-11-22 08:30:53.911 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:55 compute-0 ovn_controller[94843]: 2025-11-22T08:30:55Z|00772|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 22 08:30:56 compute-0 nova_compute[186544]: 2025-11-22 08:30:56.392 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:57 compute-0 podman[246519]: 2025-11-22 08:30:57.430539889 +0000 UTC m=+0.075645043 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 08:30:58 compute-0 nova_compute[186544]: 2025-11-22 08:30:58.913 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:30:59 compute-0 podman[246542]: 2025-11-22 08:30:59.410339183 +0000 UTC m=+0.060316505 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:30:59 compute-0 podman[246543]: 2025-11-22 08:30:59.419446357 +0000 UTC m=+0.067079811 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 08:31:01 compute-0 nova_compute[186544]: 2025-11-22 08:31:01.394 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:03 compute-0 nova_compute[186544]: 2025-11-22 08:31:03.915 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:06 compute-0 nova_compute[186544]: 2025-11-22 08:31:06.394 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:08 compute-0 nova_compute[186544]: 2025-11-22 08:31:08.918 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:11 compute-0 nova_compute[186544]: 2025-11-22 08:31:11.397 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.193 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.193 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.348 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.349 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5725MB free_disk=73.12996673583984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.349 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.350 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.494 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.494 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.509 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.537 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.538 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.551 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.570 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.588 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.600 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.602 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:31:12 compute-0 nova_compute[186544]: 2025-11-22 08:31:12.602 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:31:13 compute-0 nova_compute[186544]: 2025-11-22 08:31:13.920 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:14 compute-0 nova_compute[186544]: 2025-11-22 08:31:14.602 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:14 compute-0 nova_compute[186544]: 2025-11-22 08:31:14.603 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:31:16 compute-0 nova_compute[186544]: 2025-11-22 08:31:16.398 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:17 compute-0 nova_compute[186544]: 2025-11-22 08:31:17.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:17 compute-0 nova_compute[186544]: 2025-11-22 08:31:17.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:31:17 compute-0 nova_compute[186544]: 2025-11-22 08:31:17.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:31:17 compute-0 nova_compute[186544]: 2025-11-22 08:31:17.179 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:31:17 compute-0 podman[246588]: 2025-11-22 08:31:17.411202348 +0000 UTC m=+0.062814516 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 22 08:31:17 compute-0 podman[246590]: 2025-11-22 08:31:17.421182455 +0000 UTC m=+0.062050079 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:31:17 compute-0 podman[246589]: 2025-11-22 08:31:17.434066912 +0000 UTC m=+0.080470032 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:31:17 compute-0 podman[246596]: 2025-11-22 08:31:17.484476262 +0000 UTC m=+0.123189962 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:31:18 compute-0 nova_compute[186544]: 2025-11-22 08:31:18.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:18 compute-0 nova_compute[186544]: 2025-11-22 08:31:18.924 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:21 compute-0 nova_compute[186544]: 2025-11-22 08:31:21.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:21 compute-0 nova_compute[186544]: 2025-11-22 08:31:21.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:21 compute-0 nova_compute[186544]: 2025-11-22 08:31:21.401 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:23 compute-0 nova_compute[186544]: 2025-11-22 08:31:23.927 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:24 compute-0 nova_compute[186544]: 2025-11-22 08:31:24.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:25 compute-0 nova_compute[186544]: 2025-11-22 08:31:25.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:26 compute-0 nova_compute[186544]: 2025-11-22 08:31:26.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:28 compute-0 podman[246670]: 2025-11-22 08:31:28.42716047 +0000 UTC m=+0.072042403 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd)
Nov 22 08:31:28 compute-0 nova_compute[186544]: 2025-11-22 08:31:28.928 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:30 compute-0 nova_compute[186544]: 2025-11-22 08:31:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:31:30 compute-0 podman[246692]: 2025-11-22 08:31:30.405244613 +0000 UTC m=+0.055447236 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:31:30 compute-0 podman[246693]: 2025-11-22 08:31:30.412671596 +0000 UTC m=+0.057239960 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., release=1755695350, distribution-scope=public, name=ubi9-minimal, config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 08:31:31 compute-0 nova_compute[186544]: 2025-11-22 08:31:31.406 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:33 compute-0 nova_compute[186544]: 2025-11-22 08:31:33.930 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:36 compute-0 nova_compute[186544]: 2025-11-22 08:31:36.409 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:31:37.360 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:31:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:31:37.360 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:31:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:31:37.360 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:31:38 compute-0 nova_compute[186544]: 2025-11-22 08:31:38.933 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:41 compute-0 nova_compute[186544]: 2025-11-22 08:31:41.409 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:31:42.651 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:31:42 compute-0 nova_compute[186544]: 2025-11-22 08:31:42.651 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:31:42.652 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:31:43 compute-0 nova_compute[186544]: 2025-11-22 08:31:43.934 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:45 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:31:45.653 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:31:46 compute-0 nova_compute[186544]: 2025-11-22 08:31:46.410 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:48 compute-0 podman[246741]: 2025-11-22 08:31:48.426237473 +0000 UTC m=+0.057329822 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:31:48 compute-0 podman[246740]: 2025-11-22 08:31:48.448052829 +0000 UTC m=+0.081723762 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 08:31:48 compute-0 podman[246739]: 2025-11-22 08:31:48.44890848 +0000 UTC m=+0.087084674 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:31:48 compute-0 podman[246742]: 2025-11-22 08:31:48.46431612 +0000 UTC m=+0.090676983 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 08:31:48 compute-0 nova_compute[186544]: 2025-11-22 08:31:48.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:51 compute-0 nova_compute[186544]: 2025-11-22 08:31:51.412 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:53 compute-0 nova_compute[186544]: 2025-11-22 08:31:53.939 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:56 compute-0 nova_compute[186544]: 2025-11-22 08:31:56.414 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:58 compute-0 nova_compute[186544]: 2025-11-22 08:31:58.942 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:31:59 compute-0 podman[246822]: 2025-11-22 08:31:59.42457932 +0000 UTC m=+0.066122838 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:32:01 compute-0 podman[246843]: 2025-11-22 08:32:01.411143011 +0000 UTC m=+0.057415194 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:32:01 compute-0 nova_compute[186544]: 2025-11-22 08:32:01.416 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:01 compute-0 podman[246844]: 2025-11-22 08:32:01.428498178 +0000 UTC m=+0.070765592 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Nov 22 08:32:03 compute-0 nova_compute[186544]: 2025-11-22 08:32:03.944 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:06 compute-0 nova_compute[186544]: 2025-11-22 08:32:06.417 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:07 compute-0 nova_compute[186544]: 2025-11-22 08:32:07.477 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:07 compute-0 nova_compute[186544]: 2025-11-22 08:32:07.478 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:07 compute-0 nova_compute[186544]: 2025-11-22 08:32:07.490 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:32:07 compute-0 nova_compute[186544]: 2025-11-22 08:32:07.906 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:07 compute-0 nova_compute[186544]: 2025-11-22 08:32:07.907 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:07 compute-0 nova_compute[186544]: 2025-11-22 08:32:07.915 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:32:07 compute-0 nova_compute[186544]: 2025-11-22 08:32:07.915 186548 INFO nova.compute.claims [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.107 186548 DEBUG nova.compute.provider_tree [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.120 186548 DEBUG nova.scheduler.client.report [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.140 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.141 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.184 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.184 186548 DEBUG nova.network.neutron [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.199 186548 INFO nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.229 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.334 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.335 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.336 186548 INFO nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Creating image(s)
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.337 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "/var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.337 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.338 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "/var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.356 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.407 186548 DEBUG nova.policy [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.412 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.413 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.414 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.424 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.508 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.509 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.948 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.993 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk 1073741824" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.994 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:08 compute-0 nova_compute[186544]: 2025-11-22 08:32:08.995 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.048 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.049 186548 DEBUG nova.virt.disk.api [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Checking if we can resize image /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.049 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.102 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.103 186548 DEBUG nova.virt.disk.api [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Cannot resize image /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.104 186548 DEBUG nova.objects.instance [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'migration_context' on Instance uuid 1b601a4d-995a-4ca0-8fbb-622b7d9b174a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.152 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.153 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Ensure instance console log exists: /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.153 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.153 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.154 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:09 compute-0 nova_compute[186544]: 2025-11-22 08:32:09.507 186548 DEBUG nova.network.neutron [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Successfully created port: 6e6336f7-e62a-4e83-9e9a-e28f87900f3d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:32:10 compute-0 nova_compute[186544]: 2025-11-22 08:32:10.480 186548 DEBUG nova.network.neutron [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Successfully updated port: 6e6336f7-e62a-4e83-9e9a-e28f87900f3d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:32:10 compute-0 nova_compute[186544]: 2025-11-22 08:32:10.503 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:32:10 compute-0 nova_compute[186544]: 2025-11-22 08:32:10.504 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:32:10 compute-0 nova_compute[186544]: 2025-11-22 08:32:10.504 186548 DEBUG nova.network.neutron [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:32:10 compute-0 nova_compute[186544]: 2025-11-22 08:32:10.566 186548 DEBUG nova.compute.manager [req-cda4554e-4a31-4479-ae70-f365f6f6bdb4 req-7c59bfbf-f69e-42fb-9dcf-c49ace433ae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-changed-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:10 compute-0 nova_compute[186544]: 2025-11-22 08:32:10.567 186548 DEBUG nova.compute.manager [req-cda4554e-4a31-4479-ae70-f365f6f6bdb4 req-7c59bfbf-f69e-42fb-9dcf-c49ace433ae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Refreshing instance network info cache due to event network-changed-6e6336f7-e62a-4e83-9e9a-e28f87900f3d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:32:10 compute-0 nova_compute[186544]: 2025-11-22 08:32:10.567 186548 DEBUG oslo_concurrency.lockutils [req-cda4554e-4a31-4479-ae70-f365f6f6bdb4 req-7c59bfbf-f69e-42fb-9dcf-c49ace433ae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:32:10 compute-0 nova_compute[186544]: 2025-11-22 08:32:10.626 186548 DEBUG nova.network.neutron [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.418 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.492 186548 DEBUG nova.network.neutron [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updating instance_info_cache with network_info: [{"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.519 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.519 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Instance network_info: |[{"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.519 186548 DEBUG oslo_concurrency.lockutils [req-cda4554e-4a31-4479-ae70-f365f6f6bdb4 req-7c59bfbf-f69e-42fb-9dcf-c49ace433ae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.520 186548 DEBUG nova.network.neutron [req-cda4554e-4a31-4479-ae70-f365f6f6bdb4 req-7c59bfbf-f69e-42fb-9dcf-c49ace433ae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Refreshing network info cache for port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.522 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Start _get_guest_xml network_info=[{"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.526 186548 WARNING nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.531 186548 DEBUG nova.virt.libvirt.host [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.532 186548 DEBUG nova.virt.libvirt.host [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.537 186548 DEBUG nova.virt.libvirt.host [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.537 186548 DEBUG nova.virt.libvirt.host [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.538 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.538 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.539 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.539 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.539 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.539 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.539 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.540 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.540 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.540 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.540 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.540 186548 DEBUG nova.virt.hardware [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.544 186548 DEBUG nova.virt.libvirt.vif [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:32:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1642746423',display_name='tempest-TestNetworkAdvancedServerOps-server-1642746423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1642746423',id=164,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyx3yYTiDGnUmJAh28YsGstH3NrOqX9/QLThE+zKYWB8ANza2qAtrRyZ9J/a5Zb1hHRnNBLhqoKqPZqB0xCn1Lq/9A30rHqEDT5Xmrt1JkqaQbjCW4fioS0E+pv7EJGMg==',key_name='tempest-TestNetworkAdvancedServerOps-1187774737',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-q3510gar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:32:08Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1b601a4d-995a-4ca0-8fbb-622b7d9b174a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.544 186548 DEBUG nova.network.os_vif_util [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.545 186548 DEBUG nova.network.os_vif_util [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.546 186548 DEBUG nova.objects.instance [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b601a4d-995a-4ca0-8fbb-622b7d9b174a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.562 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <uuid>1b601a4d-995a-4ca0-8fbb-622b7d9b174a</uuid>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <name>instance-000000a4</name>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1642746423</nova:name>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:32:11</nova:creationTime>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:32:11 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:32:11 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:32:11 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:32:11 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:32:11 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:32:11 compute-0 nova_compute[186544]:         <nova:user uuid="d8853d84c1e84f6baaf01635ef1d0f7c">tempest-TestNetworkAdvancedServerOps-1221065053-project-member</nova:user>
Nov 22 08:32:11 compute-0 nova_compute[186544]:         <nova:project uuid="042f6d127720471aaedb8a1fb7535416">tempest-TestNetworkAdvancedServerOps-1221065053</nova:project>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:32:11 compute-0 nova_compute[186544]:         <nova:port uuid="6e6336f7-e62a-4e83-9e9a-e28f87900f3d">
Nov 22 08:32:11 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <system>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <entry name="serial">1b601a4d-995a-4ca0-8fbb-622b7d9b174a</entry>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <entry name="uuid">1b601a4d-995a-4ca0-8fbb-622b7d9b174a</entry>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </system>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <os>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   </os>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <features>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   </features>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.config"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:8f:3c:48"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <target dev="tap6e6336f7-e6"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/console.log" append="off"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <video>
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </video>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:32:11 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:32:11 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:32:11 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:32:11 compute-0 nova_compute[186544]: </domain>
Nov 22 08:32:11 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.563 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Preparing to wait for external event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.563 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.564 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.564 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.565 186548 DEBUG nova.virt.libvirt.vif [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:32:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1642746423',display_name='tempest-TestNetworkAdvancedServerOps-server-1642746423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1642746423',id=164,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyx3yYTiDGnUmJAh28YsGstH3NrOqX9/QLThE+zKYWB8ANza2qAtrRyZ9J/a5Zb1hHRnNBLhqoKqPZqB0xCn1Lq/9A30rHqEDT5Xmrt1JkqaQbjCW4fioS0E+pv7EJGMg==',key_name='tempest-TestNetworkAdvancedServerOps-1187774737',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-q3510gar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:32:08Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1b601a4d-995a-4ca0-8fbb-622b7d9b174a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.565 186548 DEBUG nova.network.os_vif_util [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.566 186548 DEBUG nova.network.os_vif_util [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.566 186548 DEBUG os_vif [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.567 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.567 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.567 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.573 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.573 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e6336f7-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.573 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e6336f7-e6, col_values=(('external_ids', {'iface-id': '6e6336f7-e62a-4e83-9e9a-e28f87900f3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:3c:48', 'vm-uuid': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.575 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:11 compute-0 NetworkManager[55036]: <info>  [1763800331.5759] manager: (tap6e6336f7-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.576 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.581 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.582 186548 INFO os_vif [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6')
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.769 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.770 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.771 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] No VIF found with MAC fa:16:3e:8f:3c:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:32:11 compute-0 nova_compute[186544]: 2025-11-22 08:32:11.771 186548 INFO nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Using config drive
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.018 186548 INFO nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Creating config drive at /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.config
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.023 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuw637wqa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.152 186548 DEBUG oslo_concurrency.processutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuw637wqa" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:32:12 compute-0 kernel: tap6e6336f7-e6: entered promiscuous mode
Nov 22 08:32:12 compute-0 NetworkManager[55036]: <info>  [1763800332.2255] manager: (tap6e6336f7-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/358)
Nov 22 08:32:12 compute-0 ovn_controller[94843]: 2025-11-22T08:32:12Z|00773|binding|INFO|Claiming lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d for this chassis.
Nov 22 08:32:12 compute-0 ovn_controller[94843]: 2025-11-22T08:32:12Z|00774|binding|INFO|6e6336f7-e62a-4e83-9e9a-e28f87900f3d: Claiming fa:16:3e:8f:3c:48 10.100.0.3
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.226 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.235 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.242 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:3c:48 10.100.0.3'], port_security=['fa:16:3e:8f:3c:48 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '2', 'neutron:security_group_ids': '91619875-7dbc-4d4a-a182-904b86081a80', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fe0d12b-4c0f-4e1e-8acf-f7614351545c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=6e6336f7-e62a-4e83-9e9a-e28f87900f3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.244 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d in datapath ced73a45-5f14-41bc-abc6-63a41c14d9ff bound to our chassis
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.245 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ced73a45-5f14-41bc-abc6-63a41c14d9ff
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.258 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ecaa79ec-edfe-42ee-b653-89ae9ca2ec30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.259 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapced73a45-51 in ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.263 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapced73a45-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.263 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ebfa7369-7e3b-4b3a-b5b4-56fe026e793a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.264 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ed666864-22bf-4d84-8c96-f79fa054ccd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 systemd-udevd[246919]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:32:12 compute-0 systemd-machined[152872]: New machine qemu-88-instance-000000a4.
Nov 22 08:32:12 compute-0 NetworkManager[55036]: <info>  [1763800332.2801] device (tap6e6336f7-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:32:12 compute-0 NetworkManager[55036]: <info>  [1763800332.2810] device (tap6e6336f7-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.284 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf0049d-33f3-4d74-ac1d-943d598625ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.291 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:12 compute-0 ovn_controller[94843]: 2025-11-22T08:32:12Z|00775|binding|INFO|Setting lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d ovn-installed in OVS
Nov 22 08:32:12 compute-0 ovn_controller[94843]: 2025-11-22T08:32:12Z|00776|binding|INFO|Setting lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d up in Southbound
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.295 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:12 compute-0 systemd[1]: Started Virtual Machine qemu-88-instance-000000a4.
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.297 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2a45d1ce-c51c-41c0-9f8d-8284f8ed89b6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.330 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[86768a33-70e1-44bf-9c91-ed347beb37d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 NetworkManager[55036]: <info>  [1763800332.3360] manager: (tapced73a45-50): new Veth device (/org/freedesktop/NetworkManager/Devices/359)
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.335 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5170243-1dda-4307-9cac-3cfd894b250b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 systemd-udevd[246923]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.369 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ffa987-6c8e-43c9-942c-798616eac65f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.373 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7d689b-c91e-4f1c-8f07-3970d587e858]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 NetworkManager[55036]: <info>  [1763800332.3998] device (tapced73a45-50): carrier: link connected
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.402 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[731e1bef-eff4-412c-b3dd-534214bf04a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.420 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[054e6253-0daa-4c45-90fe-55cddbf916f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapced73a45-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:dc:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 707303, 'reachable_time': 41864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246952, 'error': None, 'target': 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.436 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[15a1bd01-1236-4d96-b805-407c40c80bed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe07:dcaf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 707303, 'tstamp': 707303}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246953, 'error': None, 'target': 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.453 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8f2b71fd-1b63-4fd9-90d6-6cb5a765702d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapced73a45-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:dc:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 707303, 'reachable_time': 41864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246954, 'error': None, 'target': 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.482 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[79ba14c7-240f-4939-917e-ae1a8ce61867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.511 186548 DEBUG nova.compute.manager [req-52c0cccc-930b-4ffb-8ae7-ec1912556b85 req-dd78cf66-9d80-4f72-ae44-737e9704d6e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.512 186548 DEBUG oslo_concurrency.lockutils [req-52c0cccc-930b-4ffb-8ae7-ec1912556b85 req-dd78cf66-9d80-4f72-ae44-737e9704d6e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.512 186548 DEBUG oslo_concurrency.lockutils [req-52c0cccc-930b-4ffb-8ae7-ec1912556b85 req-dd78cf66-9d80-4f72-ae44-737e9704d6e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.512 186548 DEBUG oslo_concurrency.lockutils [req-52c0cccc-930b-4ffb-8ae7-ec1912556b85 req-dd78cf66-9d80-4f72-ae44-737e9704d6e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.512 186548 DEBUG nova.compute.manager [req-52c0cccc-930b-4ffb-8ae7-ec1912556b85 req-dd78cf66-9d80-4f72-ae44-737e9704d6e4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Processing event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.551 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[31ffcb4a-520b-47d4-a483-dec3bdb439e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.553 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapced73a45-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.554 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.554 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapced73a45-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:12 compute-0 kernel: tapced73a45-50: entered promiscuous mode
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.557 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:12 compute-0 NetworkManager[55036]: <info>  [1763800332.5578] manager: (tapced73a45-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.560 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.562 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapced73a45-50, col_values=(('external_ids', {'iface-id': '56656750-356c-4b77-b2d5-486506eff60e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.564 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:12 compute-0 ovn_controller[94843]: 2025-11-22T08:32:12Z|00777|binding|INFO|Releasing lport 56656750-356c-4b77-b2d5-486506eff60e from this chassis (sb_readonly=0)
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.577 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.579 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ced73a45-5f14-41bc-abc6-63a41c14d9ff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ced73a45-5f14-41bc-abc6-63a41c14d9ff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.580 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[28e98dcd-caab-4017-9c05-6d43ce625b55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.582 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-ced73a45-5f14-41bc-abc6-63a41c14d9ff
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/ced73a45-5f14-41bc-abc6-63a41c14d9ff.pid.haproxy
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID ced73a45-5f14-41bc-abc6-63a41c14d9ff
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:32:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:12.584 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'env', 'PROCESS_TAG=haproxy-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ced73a45-5f14-41bc-abc6-63a41c14d9ff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.647 186548 DEBUG nova.network.neutron [req-cda4554e-4a31-4479-ae70-f365f6f6bdb4 req-7c59bfbf-f69e-42fb-9dcf-c49ace433ae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updated VIF entry in instance network info cache for port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.648 186548 DEBUG nova.network.neutron [req-cda4554e-4a31-4479-ae70-f365f6f6bdb4 req-7c59bfbf-f69e-42fb-9dcf-c49ace433ae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updating instance_info_cache with network_info: [{"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.661 186548 DEBUG oslo_concurrency.lockutils [req-cda4554e-4a31-4479-ae70-f365f6f6bdb4 req-7c59bfbf-f69e-42fb-9dcf-c49ace433ae1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.963 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.965 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800332.962932, 1b601a4d-995a-4ca0-8fbb-622b7d9b174a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.965 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] VM Started (Lifecycle Event)
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.971 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.975 186548 INFO nova.virt.libvirt.driver [-] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Instance spawned successfully.
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.975 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:32:12 compute-0 nova_compute[186544]: 2025-11-22 08:32:12.994 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.000 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.004 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.004 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.004 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.005 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.005 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.005 186548 DEBUG nova.virt.libvirt.driver [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.036 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.036 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800332.9679196, 1b601a4d-995a-4ca0-8fbb-622b7d9b174a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.037 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] VM Paused (Lifecycle Event)
Nov 22 08:32:13 compute-0 podman[246990]: 2025-11-22 08:32:12.955943639 +0000 UTC m=+0.025448328 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.063 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.067 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800332.9697335, 1b601a4d-995a-4ca0-8fbb-622b7d9b174a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.067 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] VM Resumed (Lifecycle Event)
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.082 186548 INFO nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Took 4.75 seconds to spawn the instance on the hypervisor.
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.083 186548 DEBUG nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.089 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.091 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.111 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.165 186548 INFO nova.compute.manager [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Took 5.63 seconds to build instance.
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.183 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.183 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.184 186548 DEBUG oslo_concurrency.lockutils [None req-46ec5463-bd01-4e72-931e-565498e1c016 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.247 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.308 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.309 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.364 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.517 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.518 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5615MB free_disk=73.12911987304688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.518 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.518 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.578 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 1b601a4d-995a-4ca0-8fbb-622b7d9b174a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.578 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.579 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.618 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.629 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.647 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:32:13 compute-0 nova_compute[186544]: 2025-11-22 08:32:13.648 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:13 compute-0 podman[246990]: 2025-11-22 08:32:13.822729521 +0000 UTC m=+0.892234180 container create 01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:32:14 compute-0 systemd[1]: Started libpod-conmon-01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9.scope.
Nov 22 08:32:14 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:32:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b58e2bf7c76b40ed3e36ac887cc221e39b8394acc727569555b5ed550f6d0300/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:32:14 compute-0 podman[246990]: 2025-11-22 08:32:14.419377515 +0000 UTC m=+1.488882204 container init 01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 08:32:14 compute-0 podman[246990]: 2025-11-22 08:32:14.427572177 +0000 UTC m=+1.497076836 container start 01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:32:14 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247013]: [NOTICE]   (247017) : New worker (247019) forked
Nov 22 08:32:14 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247013]: [NOTICE]   (247017) : Loading success.
Nov 22 08:32:14 compute-0 nova_compute[186544]: 2025-11-22 08:32:14.606 186548 DEBUG nova.compute.manager [req-5d22886a-7b3a-4931-94ac-88f2cfb0ecab req-8f938169-157c-42f6-8a5d-921a6f4e1190 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:14 compute-0 nova_compute[186544]: 2025-11-22 08:32:14.607 186548 DEBUG oslo_concurrency.lockutils [req-5d22886a-7b3a-4931-94ac-88f2cfb0ecab req-8f938169-157c-42f6-8a5d-921a6f4e1190 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:14 compute-0 nova_compute[186544]: 2025-11-22 08:32:14.607 186548 DEBUG oslo_concurrency.lockutils [req-5d22886a-7b3a-4931-94ac-88f2cfb0ecab req-8f938169-157c-42f6-8a5d-921a6f4e1190 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:14 compute-0 nova_compute[186544]: 2025-11-22 08:32:14.607 186548 DEBUG oslo_concurrency.lockutils [req-5d22886a-7b3a-4931-94ac-88f2cfb0ecab req-8f938169-157c-42f6-8a5d-921a6f4e1190 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:14 compute-0 nova_compute[186544]: 2025-11-22 08:32:14.607 186548 DEBUG nova.compute.manager [req-5d22886a-7b3a-4931-94ac-88f2cfb0ecab req-8f938169-157c-42f6-8a5d-921a6f4e1190 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] No waiting events found dispatching network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:32:14 compute-0 nova_compute[186544]: 2025-11-22 08:32:14.608 186548 WARNING nova.compute.manager [req-5d22886a-7b3a-4931-94ac-88f2cfb0ecab req-8f938169-157c-42f6-8a5d-921a6f4e1190 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received unexpected event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d for instance with vm_state active and task_state None.
Nov 22 08:32:16 compute-0 nova_compute[186544]: 2025-11-22 08:32:16.420 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:16 compute-0 nova_compute[186544]: 2025-11-22 08:32:16.575 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:16 compute-0 nova_compute[186544]: 2025-11-22 08:32:16.648 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:16 compute-0 nova_compute[186544]: 2025-11-22 08:32:16.648 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:32:17 compute-0 NetworkManager[55036]: <info>  [1763800337.0765] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Nov 22 08:32:17 compute-0 NetworkManager[55036]: <info>  [1763800337.0774] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Nov 22 08:32:17 compute-0 ovn_controller[94843]: 2025-11-22T08:32:17Z|00778|binding|INFO|Releasing lport 56656750-356c-4b77-b2d5-486506eff60e from this chassis (sb_readonly=0)
Nov 22 08:32:17 compute-0 nova_compute[186544]: 2025-11-22 08:32:17.081 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:17 compute-0 ovn_controller[94843]: 2025-11-22T08:32:17Z|00779|binding|INFO|Releasing lport 56656750-356c-4b77-b2d5-486506eff60e from this chassis (sb_readonly=0)
Nov 22 08:32:17 compute-0 nova_compute[186544]: 2025-11-22 08:32:17.103 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:17 compute-0 nova_compute[186544]: 2025-11-22 08:32:17.109 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:17 compute-0 nova_compute[186544]: 2025-11-22 08:32:17.508 186548 DEBUG nova.compute.manager [req-e4b570c6-b3db-4977-a622-6417a79a82e3 req-a7f43c6b-a38f-48b8-8d3a-ad76fd1de58f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-changed-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:17 compute-0 nova_compute[186544]: 2025-11-22 08:32:17.508 186548 DEBUG nova.compute.manager [req-e4b570c6-b3db-4977-a622-6417a79a82e3 req-a7f43c6b-a38f-48b8-8d3a-ad76fd1de58f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Refreshing instance network info cache due to event network-changed-6e6336f7-e62a-4e83-9e9a-e28f87900f3d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:32:17 compute-0 nova_compute[186544]: 2025-11-22 08:32:17.509 186548 DEBUG oslo_concurrency.lockutils [req-e4b570c6-b3db-4977-a622-6417a79a82e3 req-a7f43c6b-a38f-48b8-8d3a-ad76fd1de58f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:32:17 compute-0 nova_compute[186544]: 2025-11-22 08:32:17.509 186548 DEBUG oslo_concurrency.lockutils [req-e4b570c6-b3db-4977-a622-6417a79a82e3 req-a7f43c6b-a38f-48b8-8d3a-ad76fd1de58f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:32:17 compute-0 nova_compute[186544]: 2025-11-22 08:32:17.509 186548 DEBUG nova.network.neutron [req-e4b570c6-b3db-4977-a622-6417a79a82e3 req-a7f43c6b-a38f-48b8-8d3a-ad76fd1de58f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Refreshing network info cache for port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.360 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.634 186548 DEBUG nova.network.neutron [req-e4b570c6-b3db-4977-a622-6417a79a82e3 req-a7f43c6b-a38f-48b8-8d3a-ad76fd1de58f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updated VIF entry in instance network info cache for port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.635 186548 DEBUG nova.network.neutron [req-e4b570c6-b3db-4977-a622-6417a79a82e3 req-a7f43c6b-a38f-48b8-8d3a-ad76fd1de58f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updating instance_info_cache with network_info: [{"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.651 186548 DEBUG oslo_concurrency.lockutils [req-e4b570c6-b3db-4977-a622-6417a79a82e3 req-a7f43c6b-a38f-48b8-8d3a-ad76fd1de58f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.652 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.652 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:32:18 compute-0 nova_compute[186544]: 2025-11-22 08:32:18.652 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1b601a4d-995a-4ca0-8fbb-622b7d9b174a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:32:19 compute-0 podman[247029]: 2025-11-22 08:32:19.424174867 +0000 UTC m=+0.068988758 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:32:19 compute-0 podman[247031]: 2025-11-22 08:32:19.448132767 +0000 UTC m=+0.087076674 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:32:19 compute-0 podman[247030]: 2025-11-22 08:32:19.454085743 +0000 UTC m=+0.097777377 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 22 08:32:19 compute-0 podman[247037]: 2025-11-22 08:32:19.46205741 +0000 UTC m=+0.093197976 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:32:19 compute-0 nova_compute[186544]: 2025-11-22 08:32:19.724 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updating instance_info_cache with network_info: [{"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:32:19 compute-0 nova_compute[186544]: 2025-11-22 08:32:19.736 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:32:19 compute-0 nova_compute[186544]: 2025-11-22 08:32:19.737 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:32:19 compute-0 nova_compute[186544]: 2025-11-22 08:32:19.737 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:21 compute-0 nova_compute[186544]: 2025-11-22 08:32:21.421 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:21 compute-0 nova_compute[186544]: 2025-11-22 08:32:21.577 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:22 compute-0 nova_compute[186544]: 2025-11-22 08:32:22.732 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:23 compute-0 nova_compute[186544]: 2025-11-22 08:32:23.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:25 compute-0 nova_compute[186544]: 2025-11-22 08:32:25.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:26 compute-0 nova_compute[186544]: 2025-11-22 08:32:26.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:26 compute-0 nova_compute[186544]: 2025-11-22 08:32:26.423 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:26 compute-0 nova_compute[186544]: 2025-11-22 08:32:26.579 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:30 compute-0 nova_compute[186544]: 2025-11-22 08:32:30.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:30 compute-0 podman[247138]: 2025-11-22 08:32:30.408482451 +0000 UTC m=+0.052096213 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 08:32:31 compute-0 nova_compute[186544]: 2025-11-22 08:32:31.427 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:31 compute-0 nova_compute[186544]: 2025-11-22 08:32:31.581 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:32 compute-0 podman[247158]: 2025-11-22 08:32:32.408837001 +0000 UTC m=+0.052779880 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:32:32 compute-0 podman[247159]: 2025-11-22 08:32:32.42304264 +0000 UTC m=+0.058858809 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public)
Nov 22 08:32:32 compute-0 ovn_controller[94843]: 2025-11-22T08:32:32Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8f:3c:48 10.100.0.3
Nov 22 08:32:32 compute-0 ovn_controller[94843]: 2025-11-22T08:32:32Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8f:3c:48 10.100.0.3
Nov 22 08:32:36 compute-0 nova_compute[186544]: 2025-11-22 08:32:36.429 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:36 compute-0 nova_compute[186544]: 2025-11-22 08:32:36.583 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.604 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a4', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '042f6d127720471aaedb8a1fb7535416', 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'hostId': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.604 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.608 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1b601a4d-995a-4ca0-8fbb-622b7d9b174a / tap6e6336f7-e6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.608 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e24f2813-5ff2-462c-b502-49394b88bf5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.604883', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'cc9bd8c8-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': '58508a830a58bef38d8c00a783862ce7c43d54795d9ec7d0051b806de9c3dd0e'}]}, 'timestamp': '2025-11-22 08:32:36.609263', '_unique_id': '63fc7ddf25d243edb6b872871a35cb64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.610 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.611 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.647 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.write.requests volume: 319 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.648 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0602eb63-3d5e-4dd1-8b61-2bc433548390', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 319, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.611811', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cca1c774-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': '96cf66cd335bfdbc1d12a4cbf64ab4ba3a62f07bd4f2ddd74a279f8e4b86b852'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.611811', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cca1d458-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': '4f8653c9e78ad7d0600fbe9c4f2604180fa74cb83947f9f8fbcbab75574ce9bf'}]}, 'timestamp': '2025-11-22 08:32:36.648422', '_unique_id': '46621f63066f439496180ad29a89f87f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.649 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.650 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.read.requests volume: 1098 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.650 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f9a9bc3-5f23-4e2f-abbb-15c708387924', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1098, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.650627', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cca2389e-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': 'e425e1e2f7409cbd1022fdf35e2d51f04aea45e1ce6265291bbb212a0392e5cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 
'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.650627', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cca2437a-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': 'da62568d3bf58a00cdd8f1998f89888f5117839c1766148a51c1ddc517658d42'}]}, 'timestamp': '2025-11-22 08:32:36.651199', '_unique_id': '5b2bddfe88c94c8db72b9ce63ae31f0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.651 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.652 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.668 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/memory.usage volume: 42.55078125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14705adf-e600-4f8b-98e7-b55ae4eced5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.55078125, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'timestamp': '2025-11-22T08:32:36.653011', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cca5016e-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.360268968, 'message_signature': '8735ce11824ef1d303ef46a39a37cc745099f200d38e69a7dd4d1ac29a8e4165'}]}, 'timestamp': '2025-11-22 08:32:36.669321', '_unique_id': 'b0727af4bab94628821149baacceead0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.670 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.671 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.684 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.686 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5985974d-c3ff-4757-90f4-df4227987ef5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.672222', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cca79e1a-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.364106563, 'message_signature': '1e12afb3d909af07d20e5c8028f88b2fd18dfac319fbf2445c6f8572c2017676'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 
'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.672222', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cca7c048-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.364106563, 'message_signature': 'f23012ee3097366333bfc10dc828e1e9e3bc9411901248865b65e2f259ea9c79'}]}, 'timestamp': '2025-11-22 08:32:36.687238', '_unique_id': 'db4c184c19dd4d898bdca229ce5f1e99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.690 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d97b3ba-373f-4b63-b064-68dc346124bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.690380', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'cca84c48-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': '84eb4ec42ca5ecc69014a7801993f61557e88dfc272c5d317dc279136827c8e0'}]}, 'timestamp': '2025-11-22 08:32:36.690847', '_unique_id': '0455acce091e4d1984cc071e0b01de96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.693 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b63ebb95-edf6-4d2a-9963-5d39703b7b24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.693067', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'cca8b52a-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': '3680f31dbbc37bb0093bc745cdb6135a31b146f3ae35fb84759e7150ce9e5d82'}]}, 'timestamp': '2025-11-22 08:32:36.693497', '_unique_id': 'ea325a5ed4c44b91bddeadab0b2aa22b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.695 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.695 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f03256c9-622a-499a-ad8b-6b5a0f316da5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.695666', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'cca91aec-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': 'a41cf9961850a2827c6f7faa2b8b99fc892b13646f1ed94deed683581287348b'}]}, 'timestamp': '2025-11-22 08:32:36.696102', '_unique_id': 'b29966037cda4c829e08dc5d0c648aae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.696 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.698 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.698 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '639e3f86-84d8-4813-8130-6ce1d0bb2ee6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.698545', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'cca98ae0-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': '3986844987d79ac71bef8253db51bd8ff2253402b9d69b9280e3039225f687d3'}]}, 'timestamp': '2025-11-22 08:32:36.698970', '_unique_id': '0973a8ef538949b78acc7dd6096bbe9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.699 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.701 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.701 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1642746423>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1642746423>]
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.701 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.usage volume: 29818880 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.702 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23f5ec5e-b6c7-405b-aef5-60c5b9b0235d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29818880, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.701814', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ccaa09b6-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.364106563, 'message_signature': 'f9343eaa563a155ea2f1ec8bfdb4dcdcacf49a81dfa6efdda9d60cd8832cb803'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.701814', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ccaa18a2-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.364106563, 'message_signature': '0068ee86722e9f8736aeb52d32d3efd50854e5a414bef0e2ee097b57d503b86c'}]}, 'timestamp': '2025-11-22 08:32:36.702575', '_unique_id': '7742b1729bde43babc177d63d3f20802'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.703 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.704 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.704 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.705 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1642746423>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1642746423>]
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.705 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.705 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/cpu volume: 13870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa29541b-03e5-4fc3-b177-769b17835127', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13870000000, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'timestamp': '2025-11-22T08:32:36.705415', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ccaa962e-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.360268968, 'message_signature': 'd13ae52d414720d907c784a10aeb9fcd34bb40d44c9c66726fd2b06de1041511'}]}, 'timestamp': '2025-11-22 08:32:36.705797', '_unique_id': '79a1d317c83b44e7bd8ef2a8e7f32dc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.707 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.708 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.incoming.packets volume: 18 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c89a456e-db0a-4c1e-b742-dbc92c87dbf3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 18, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.708021', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'ccaafc4a-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': '12bec9642de2cd4ebe060fa6fb5a821dedd4e331148c1d89a6f2581454c13715'}]}, 'timestamp': '2025-11-22 08:32:36.708447', '_unique_id': '94a003dd3fbf475a8b411b84cc9bc66a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.710 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.710 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.write.bytes volume: 72773632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.711 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18f35f85-70fb-4473-90bc-3caaede7b6a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72773632, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.710667', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ccab636a-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': '7b55bb2790f793cea2c07521d6bce688a8fa0c320154cdf49cf0eef6fcf3f84b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.710667', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ccab715c-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': 'ddf1fb486f1de37213906a7c05e567f430342d2173f4a06ad4bc9a77a79a8822'}]}, 'timestamp': '2025-11-22 08:32:36.711421', '_unique_id': '232f86310d1241fab2f5b3fe102df70f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.712 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.713 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.714 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.715 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df9d6dd4-6d10-472f-85dc-e7a2fd2fa236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.714252', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ccabfcee-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.364106563, 'message_signature': 'a3aea9d99be90de21933c1642f257b4cdc142e732ddcb5ea15ea7d32438780cb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.714252', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ccac1ca6-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.364106563, 'message_signature': '24529082c7c045cca7b36f757db79f235cc170801e2779d10a102ad8ad68b41e'}]}, 'timestamp': '2025-11-22 08:32:36.716037', '_unique_id': '5746fd5db3f64e66994fe16e90fd522c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.718 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.720 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.720 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.read.bytes volume: 30775808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.720 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '932e2d94-da6d-457a-8a90-4e67fe300a80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30775808, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.720386', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ccace3b6-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': 'c597f4d73356a781fd036ddc2b3a795a644a9e1b73d1aed5960c072bc53b62ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.720386', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ccacfa86-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': '0af9bfa8b10b563af32511382eb3232dbc11213a90a668a0905b9feabab0bef8'}]}, 'timestamp': '2025-11-22 08:32:36.721560', '_unique_id': '27615313cb1d4aa0bd51eb71808ff15c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.722 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.724 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef65b4e6-9f1d-46bd-b3fb-2403b16cae47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.724537', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'ccad874e-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': '8ae1310b6e8ed2a6656f91fb437c90394cfbcdfb783a9f37e010da17bbeda507'}]}, 'timestamp': '2025-11-22 08:32:36.725338', '_unique_id': '0142eb3d7f28423bb964d6ee6a91b45e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.727 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.727 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.728 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1642746423>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1642746423>]
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.728 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.728 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5be275c3-d8ff-4a5a-a45a-22ad047a2b48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.728484', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'ccae1b5a-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': 'e972642ff3b932177ae3a3a42c1cfaf8eb6df7201a475c0b05075dec759d0961'}]}, 'timestamp': '2025-11-22 08:32:36.728936', '_unique_id': 'a05cc792dfa84f32a2c5f8945da4264a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.729 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.730 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.730 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.731 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1642746423>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1642746423>]
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.731 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.731 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.read.latency volume: 2972133657 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.731 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.read.latency volume: 289889166 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b99ca28d-a85b-4d23-93b1-1223b9c4d052', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2972133657, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.731483', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ccae91d4-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': 'b66a7b96b2e5d6915355f80f20ff446bcb4f3efff6829334b112e22138082d2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 289889166, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.731483', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ccaea264-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': '3e7db0dbb76b35c133bcb4de143043fb31b532d089b1c90deea529bfb6bc0839'}]}, 'timestamp': '2025-11-22 08:32:36.732405', '_unique_id': '3f9033401435486982bf2ebee7916e49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.733 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.734 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.734 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.write.latency volume: 119270700666 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.735 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '873f9c91-04ce-4301-af6b-379a91e374c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 119270700666, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-vda', 'timestamp': '2025-11-22T08:32:36.734583', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ccaf0970-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': 'e1ea9bf25462d02b7cb78bb57fd78123d82639260dab4b05258684605d2d8c41'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a-sda', 'timestamp': '2025-11-22T08:32:36.734583', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'instance-000000a4', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ccaf1960-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.303650415, 'message_signature': 'dfbe578839f4c3b326aa6384fbde7ebf46fad751b61d58ecd35a3cb7017ad3f1'}]}, 'timestamp': '2025-11-22 08:32:36.735445', '_unique_id': '390f5cafae804fa0b71dfbaa59c192aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.736 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.737 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.737 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.incoming.bytes volume: 2234 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3f44f8a-566e-4670-9b63-9d39061036be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2234, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.737668', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'ccaf8274-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': 'c4612920da167d59cbc57acb43cd4a483400d4508a3464cefdadbaaf9febd083'}]}, 'timestamp': '2025-11-22 08:32:36.738141', '_unique_id': 'fadc6ac4e9d04f28aa6bec387d1cd557'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.739 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.740 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.740 12 DEBUG ceilometer.compute.pollsters [-] 1b601a4d-995a-4ca0-8fbb-622b7d9b174a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b807ebe1-663d-4ce2-a172-c91671c7d6d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd8853d84c1e84f6baaf01635ef1d0f7c', 'user_name': None, 'project_id': '042f6d127720471aaedb8a1fb7535416', 'project_name': None, 'resource_id': 'instance-000000a4-1b601a4d-995a-4ca0-8fbb-622b7d9b174a-tap6e6336f7-e6', 'timestamp': '2025-11-22T08:32:36.740417', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1642746423', 'name': 'tap6e6336f7-e6', 'instance_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'instance_type': 'm1.nano', 'host': '1689fb11c4d3be6a891c645c983600c5712478cde43d95c234632842', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8f:3c:48', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6e6336f7-e6'}, 'message_id': 'ccafee1c-c77d-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7097.296678304, 'message_signature': '59b19cddddcc8c49994a9372668604966c02f520c10d30a4622c6bfa1fce1017'}]}, 'timestamp': '2025-11-22 08:32:36.740899', '_unique_id': '1abf7098285d4b0886473a29f168faba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:32:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:32:36.741 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:32:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:37.362 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:37.363 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:37.364 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.116 186548 INFO nova.compute.manager [None req-a73c9689-545f-44e0-be90-2d806b9609b5 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Get console output
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.122 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.443 186548 DEBUG nova.objects.instance [None req-7e8b35ea-7cfc-46b4-b8d0-1e2b730a609e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b601a4d-995a-4ca0-8fbb-622b7d9b174a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.465 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800359.464941, 1b601a4d-995a-4ca0-8fbb-622b7d9b174a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.465 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] VM Paused (Lifecycle Event)
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.480 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.486 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:32:39 compute-0 nova_compute[186544]: 2025-11-22 08:32:39.500 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 22 08:32:41 compute-0 kernel: tap6e6336f7-e6 (unregistering): left promiscuous mode
Nov 22 08:32:41 compute-0 NetworkManager[55036]: <info>  [1763800361.3741] device (tap6e6336f7-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:32:41 compute-0 ovn_controller[94843]: 2025-11-22T08:32:41Z|00780|binding|INFO|Releasing lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d from this chassis (sb_readonly=0)
Nov 22 08:32:41 compute-0 nova_compute[186544]: 2025-11-22 08:32:41.382 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:41 compute-0 ovn_controller[94843]: 2025-11-22T08:32:41Z|00781|binding|INFO|Setting lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d down in Southbound
Nov 22 08:32:41 compute-0 ovn_controller[94843]: 2025-11-22T08:32:41Z|00782|binding|INFO|Removing iface tap6e6336f7-e6 ovn-installed in OVS
Nov 22 08:32:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:41.393 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:3c:48 10.100.0.3'], port_security=['fa:16:3e:8f:3c:48 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '4', 'neutron:security_group_ids': '91619875-7dbc-4d4a-a182-904b86081a80', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fe0d12b-4c0f-4e1e-8acf-f7614351545c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=6e6336f7-e62a-4e83-9e9a-e28f87900f3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:32:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:41.395 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d in datapath ced73a45-5f14-41bc-abc6-63a41c14d9ff unbound from our chassis
Nov 22 08:32:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:41.396 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ced73a45-5f14-41bc-abc6-63a41c14d9ff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:32:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:41.398 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a7900b13-79b6-4c20-b244-5d92ea7a5265]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:41 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:41.399 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff namespace which is not needed anymore
Nov 22 08:32:41 compute-0 nova_compute[186544]: 2025-11-22 08:32:41.401 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:41 compute-0 nova_compute[186544]: 2025-11-22 08:32:41.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:41 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000a4.scope: Deactivated successfully.
Nov 22 08:32:41 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000a4.scope: Consumed 16.073s CPU time.
Nov 22 08:32:41 compute-0 systemd-machined[152872]: Machine qemu-88-instance-000000a4 terminated.
Nov 22 08:32:41 compute-0 nova_compute[186544]: 2025-11-22 08:32:41.585 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:41 compute-0 nova_compute[186544]: 2025-11-22 08:32:41.620 186548 DEBUG nova.compute.manager [None req-7e8b35ea-7cfc-46b4-b8d0-1e2b730a609e d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:41 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247013]: [NOTICE]   (247017) : haproxy version is 2.8.14-c23fe91
Nov 22 08:32:41 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247013]: [NOTICE]   (247017) : path to executable is /usr/sbin/haproxy
Nov 22 08:32:41 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247013]: [WARNING]  (247017) : Exiting Master process...
Nov 22 08:32:41 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247013]: [ALERT]    (247017) : Current worker (247019) exited with code 143 (Terminated)
Nov 22 08:32:41 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247013]: [WARNING]  (247017) : All workers exited. Exiting... (0)
Nov 22 08:32:41 compute-0 systemd[1]: libpod-01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9.scope: Deactivated successfully.
Nov 22 08:32:41 compute-0 podman[247231]: 2025-11-22 08:32:41.76762775 +0000 UTC m=+0.281454927 container died 01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:32:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9-userdata-shm.mount: Deactivated successfully.
Nov 22 08:32:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-b58e2bf7c76b40ed3e36ac887cc221e39b8394acc727569555b5ed550f6d0300-merged.mount: Deactivated successfully.
Nov 22 08:32:42 compute-0 podman[247231]: 2025-11-22 08:32:42.415913766 +0000 UTC m=+0.929740943 container cleanup 01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:32:42 compute-0 systemd[1]: libpod-conmon-01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9.scope: Deactivated successfully.
Nov 22 08:32:42 compute-0 podman[247279]: 2025-11-22 08:32:42.900230285 +0000 UTC m=+0.458187137 container remove 01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:32:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:42.907 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[22aada30-a631-461a-89a7-8b143b039c68]: (4, ('Sat Nov 22 08:32:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff (01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9)\n01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9\nSat Nov 22 08:32:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff (01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9)\n01962280eda33dc04412b1ea0e6ea91e155f94eada2fdb11ae169d54963e19e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:42.910 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d003df16-12b7-40a4-bc20-fd2cce5453b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:42.911 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapced73a45-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:42 compute-0 nova_compute[186544]: 2025-11-22 08:32:42.914 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:42 compute-0 kernel: tapced73a45-50: left promiscuous mode
Nov 22 08:32:42 compute-0 nova_compute[186544]: 2025-11-22 08:32:42.932 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:42 compute-0 nova_compute[186544]: 2025-11-22 08:32:42.933 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:43.037 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6a4fee-2e9e-4743-9f5a-3b1e3ac48ee4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:43.053 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4a241ee6-74de-4170-9084-f8e1d8512086]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:43.055 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[46c91b7f-9e04-4cbd-94c3-72759e110e14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:43 compute-0 nova_compute[186544]: 2025-11-22 08:32:43.067 186548 DEBUG nova.compute.manager [req-dd842036-d28a-4e35-9dd9-8d716ef22a44 req-0a5c5533-8be8-4749-8dea-404a3f840a86 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-unplugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:43 compute-0 nova_compute[186544]: 2025-11-22 08:32:43.068 186548 DEBUG oslo_concurrency.lockutils [req-dd842036-d28a-4e35-9dd9-8d716ef22a44 req-0a5c5533-8be8-4749-8dea-404a3f840a86 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:43 compute-0 nova_compute[186544]: 2025-11-22 08:32:43.068 186548 DEBUG oslo_concurrency.lockutils [req-dd842036-d28a-4e35-9dd9-8d716ef22a44 req-0a5c5533-8be8-4749-8dea-404a3f840a86 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:43 compute-0 nova_compute[186544]: 2025-11-22 08:32:43.068 186548 DEBUG oslo_concurrency.lockutils [req-dd842036-d28a-4e35-9dd9-8d716ef22a44 req-0a5c5533-8be8-4749-8dea-404a3f840a86 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:43 compute-0 nova_compute[186544]: 2025-11-22 08:32:43.069 186548 DEBUG nova.compute.manager [req-dd842036-d28a-4e35-9dd9-8d716ef22a44 req-0a5c5533-8be8-4749-8dea-404a3f840a86 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] No waiting events found dispatching network-vif-unplugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:32:43 compute-0 nova_compute[186544]: 2025-11-22 08:32:43.069 186548 WARNING nova.compute.manager [req-dd842036-d28a-4e35-9dd9-8d716ef22a44 req-0a5c5533-8be8-4749-8dea-404a3f840a86 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received unexpected event network-vif-unplugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d for instance with vm_state suspended and task_state None.
Nov 22 08:32:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:43.068 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[76cd6aef-ff47-4a75-ae12-3f4d0639a332]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 707296, 'reachable_time': 38273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247300, 'error': None, 'target': 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:43.071 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:32:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:43.071 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[6865f0c6-9db8-4326-b0d1-6d2b2c5ecd97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:43 compute-0 systemd[1]: run-netns-ovnmeta\x2dced73a45\x2d5f14\x2d41bc\x2dabc6\x2d63a41c14d9ff.mount: Deactivated successfully.
Nov 22 08:32:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:44.909 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:32:44 compute-0 nova_compute[186544]: 2025-11-22 08:32:44.909 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:44 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:44.910 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:32:45 compute-0 nova_compute[186544]: 2025-11-22 08:32:45.161 186548 DEBUG nova.compute.manager [req-d4ac22f7-6cfd-45f2-b81b-2e8511744f20 req-53478466-03f3-4ef8-a608-31546bb2c30d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:45 compute-0 nova_compute[186544]: 2025-11-22 08:32:45.162 186548 DEBUG oslo_concurrency.lockutils [req-d4ac22f7-6cfd-45f2-b81b-2e8511744f20 req-53478466-03f3-4ef8-a608-31546bb2c30d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:45 compute-0 nova_compute[186544]: 2025-11-22 08:32:45.163 186548 DEBUG oslo_concurrency.lockutils [req-d4ac22f7-6cfd-45f2-b81b-2e8511744f20 req-53478466-03f3-4ef8-a608-31546bb2c30d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:45 compute-0 nova_compute[186544]: 2025-11-22 08:32:45.163 186548 DEBUG oslo_concurrency.lockutils [req-d4ac22f7-6cfd-45f2-b81b-2e8511744f20 req-53478466-03f3-4ef8-a608-31546bb2c30d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:45 compute-0 nova_compute[186544]: 2025-11-22 08:32:45.163 186548 DEBUG nova.compute.manager [req-d4ac22f7-6cfd-45f2-b81b-2e8511744f20 req-53478466-03f3-4ef8-a608-31546bb2c30d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] No waiting events found dispatching network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:32:45 compute-0 nova_compute[186544]: 2025-11-22 08:32:45.164 186548 WARNING nova.compute.manager [req-d4ac22f7-6cfd-45f2-b81b-2e8511744f20 req-53478466-03f3-4ef8-a608-31546bb2c30d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received unexpected event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d for instance with vm_state suspended and task_state None.
Nov 22 08:32:46 compute-0 nova_compute[186544]: 2025-11-22 08:32:46.432 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:46 compute-0 nova_compute[186544]: 2025-11-22 08:32:46.587 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:46 compute-0 nova_compute[186544]: 2025-11-22 08:32:46.769 186548 INFO nova.compute.manager [None req-b156b407-34d8-4269-85f6-feeea6b1c7cf d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Get console output
Nov 22 08:32:47 compute-0 nova_compute[186544]: 2025-11-22 08:32:47.033 186548 INFO nova.compute.manager [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Resuming
Nov 22 08:32:47 compute-0 nova_compute[186544]: 2025-11-22 08:32:47.035 186548 DEBUG nova.objects.instance [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'flavor' on Instance uuid 1b601a4d-995a-4ca0-8fbb-622b7d9b174a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:32:47 compute-0 nova_compute[186544]: 2025-11-22 08:32:47.075 186548 DEBUG oslo_concurrency.lockutils [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:32:47 compute-0 nova_compute[186544]: 2025-11-22 08:32:47.076 186548 DEBUG oslo_concurrency.lockutils [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquired lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:32:47 compute-0 nova_compute[186544]: 2025-11-22 08:32:47.076 186548 DEBUG nova.network.neutron [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:32:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:48.913 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.139 186548 DEBUG nova.network.neutron [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updating instance_info_cache with network_info: [{"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.180 186548 DEBUG oslo_concurrency.lockutils [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Releasing lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.189 186548 DEBUG nova.virt.libvirt.vif [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:32:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1642746423',display_name='tempest-TestNetworkAdvancedServerOps-server-1642746423',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1642746423',id=164,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyx3yYTiDGnUmJAh28YsGstH3NrOqX9/QLThE+zKYWB8ANza2qAtrRyZ9J/a5Zb1hHRnNBLhqoKqPZqB0xCn1Lq/9A30rHqEDT5Xmrt1JkqaQbjCW4fioS0E+pv7EJGMg==',key_name='tempest-TestNetworkAdvancedServerOps-1187774737',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:32:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-q3510gar',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:32:41Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1b601a4d-995a-4ca0-8fbb-622b7d9b174a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.190 186548 DEBUG nova.network.os_vif_util [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.190 186548 DEBUG nova.network.os_vif_util [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.191 186548 DEBUG os_vif [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.192 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.193 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.196 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.196 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6e6336f7-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.196 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6e6336f7-e6, col_values=(('external_ids', {'iface-id': '6e6336f7-e62a-4e83-9e9a-e28f87900f3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:3c:48', 'vm-uuid': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.197 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.198 186548 INFO os_vif [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6')
Nov 22 08:32:50 compute-0 podman[247304]: 2025-11-22 08:32:50.474845613 +0000 UTC m=+0.234196164 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:32:50 compute-0 podman[247303]: 2025-11-22 08:32:50.477414477 +0000 UTC m=+0.239858545 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.485 186548 DEBUG nova.objects.instance [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1b601a4d-995a-4ca0-8fbb-622b7d9b174a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:32:50 compute-0 podman[247301]: 2025-11-22 08:32:50.502265647 +0000 UTC m=+0.265203747 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:32:50 compute-0 podman[247305]: 2025-11-22 08:32:50.531127948 +0000 UTC m=+0.281986200 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 08:32:50 compute-0 kernel: tap6e6336f7-e6: entered promiscuous mode
Nov 22 08:32:50 compute-0 NetworkManager[55036]: <info>  [1763800370.5540] manager: (tap6e6336f7-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/363)
Nov 22 08:32:50 compute-0 ovn_controller[94843]: 2025-11-22T08:32:50Z|00783|binding|INFO|Claiming lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d for this chassis.
Nov 22 08:32:50 compute-0 ovn_controller[94843]: 2025-11-22T08:32:50Z|00784|binding|INFO|6e6336f7-e62a-4e83-9e9a-e28f87900f3d: Claiming fa:16:3e:8f:3c:48 10.100.0.3
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.554 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.565 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:3c:48 10.100.0.3'], port_security=['fa:16:3e:8f:3c:48 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '5', 'neutron:security_group_ids': '91619875-7dbc-4d4a-a182-904b86081a80', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fe0d12b-4c0f-4e1e-8acf-f7614351545c, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=6e6336f7-e62a-4e83-9e9a-e28f87900f3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.566 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d in datapath ced73a45-5f14-41bc-abc6-63a41c14d9ff bound to our chassis
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.567 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ced73a45-5f14-41bc-abc6-63a41c14d9ff
Nov 22 08:32:50 compute-0 ovn_controller[94843]: 2025-11-22T08:32:50Z|00785|binding|INFO|Setting lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d ovn-installed in OVS
Nov 22 08:32:50 compute-0 ovn_controller[94843]: 2025-11-22T08:32:50Z|00786|binding|INFO|Setting lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d up in Southbound
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.569 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.572 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.577 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[49954184-a6f6-438d-b3e6-c8fc0fdc03d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.578 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapced73a45-51 in ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.579 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapced73a45-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.580 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[45d090a3-6c70-47da-a76e-3dfb7a88ac44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.580 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a066cfd8-c1a5-4f79-86f6-5180f2d78ea6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 systemd-udevd[247400]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.592 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d1773f-fdd0-4ed7-8ccb-ba8dc8f77c41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 systemd-machined[152872]: New machine qemu-89-instance-000000a4.
Nov 22 08:32:50 compute-0 NetworkManager[55036]: <info>  [1763800370.6001] device (tap6e6336f7-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:32:50 compute-0 NetworkManager[55036]: <info>  [1763800370.6008] device (tap6e6336f7-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:32:50 compute-0 systemd[1]: Started Virtual Machine qemu-89-instance-000000a4.
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.617 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dad225b4-6dc9-4598-843d-06fc7f7052f3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.647 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[066e7b86-6772-47f2-8335-740e2abcb75e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 NetworkManager[55036]: <info>  [1763800370.6547] manager: (tapced73a45-50): new Veth device (/org/freedesktop/NetworkManager/Devices/364)
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.653 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[04459cc4-d34a-4391-9aad-aaef103cdfb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.692 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[decc9926-31f4-4c59-a0be-ab493e9be2bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.697 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd35205-7103-4baa-ab16-e06e406672b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 NetworkManager[55036]: <info>  [1763800370.7205] device (tapced73a45-50): carrier: link connected
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.723 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4f0d18-f50b-4192-bbe1-d8967c2c0c31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.742 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[337ed54a-101e-4278-bbbc-1e381eb4f683]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapced73a45-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:dc:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 711135, 'reachable_time': 19877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247433, 'error': None, 'target': 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.760 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bce1085b-0472-4a27-a33b-e8528b127471]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe07:dcaf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 711135, 'tstamp': 711135}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247434, 'error': None, 'target': 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.784 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[067bd2b4-5c30-4b05-b41e-6b2248c4da28]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapced73a45-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:dc:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 711135, 'reachable_time': 19877, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247435, 'error': None, 'target': 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.824 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[19dfa6ab-43f1-433f-9955-e0c3a42b5f0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.892 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[983faf8f-2139-429a-92bc-390d6281610c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.894 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapced73a45-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.894 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.894 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapced73a45-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:50 compute-0 kernel: tapced73a45-50: entered promiscuous mode
Nov 22 08:32:50 compute-0 NetworkManager[55036]: <info>  [1763800370.8969] manager: (tapced73a45-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.896 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.898 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.901 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapced73a45-50, col_values=(('external_ids', {'iface-id': '56656750-356c-4b77-b2d5-486506eff60e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:50 compute-0 ovn_controller[94843]: 2025-11-22T08:32:50Z|00787|binding|INFO|Releasing lport 56656750-356c-4b77-b2d5-486506eff60e from this chassis (sb_readonly=0)
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.902 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.904 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ced73a45-5f14-41bc-abc6-63a41c14d9ff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ced73a45-5f14-41bc-abc6-63a41c14d9ff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.905 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8b8de508-a398-4fcc-b629-12c423288583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.906 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-ced73a45-5f14-41bc-abc6-63a41c14d9ff
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/ced73a45-5f14-41bc-abc6-63a41c14d9ff.pid.haproxy
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID ced73a45-5f14-41bc-abc6-63a41c14d9ff
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:32:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:50.907 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'env', 'PROCESS_TAG=haproxy-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ced73a45-5f14-41bc-abc6-63a41c14d9ff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:32:50 compute-0 nova_compute[186544]: 2025-11-22 08:32:50.916 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.081 186548 DEBUG nova.compute.manager [req-f48881c3-459c-446e-8d7a-a7b8591a7f33 req-8754b633-e007-4966-b746-05565aa6bd72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.081 186548 DEBUG oslo_concurrency.lockutils [req-f48881c3-459c-446e-8d7a-a7b8591a7f33 req-8754b633-e007-4966-b746-05565aa6bd72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.082 186548 DEBUG oslo_concurrency.lockutils [req-f48881c3-459c-446e-8d7a-a7b8591a7f33 req-8754b633-e007-4966-b746-05565aa6bd72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.082 186548 DEBUG oslo_concurrency.lockutils [req-f48881c3-459c-446e-8d7a-a7b8591a7f33 req-8754b633-e007-4966-b746-05565aa6bd72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.082 186548 DEBUG nova.compute.manager [req-f48881c3-459c-446e-8d7a-a7b8591a7f33 req-8754b633-e007-4966-b746-05565aa6bd72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] No waiting events found dispatching network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.082 186548 WARNING nova.compute.manager [req-f48881c3-459c-446e-8d7a-a7b8591a7f33 req-8754b633-e007-4966-b746-05565aa6bd72 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received unexpected event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d for instance with vm_state suspended and task_state resuming.
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.188 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for 1b601a4d-995a-4ca0-8fbb-622b7d9b174a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.188 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800371.187783, 1b601a4d-995a-4ca0-8fbb-622b7d9b174a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.189 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] VM Started (Lifecycle Event)
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.211 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.223 186548 DEBUG nova.compute.manager [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.224 186548 DEBUG nova.objects.instance [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b601a4d-995a-4ca0-8fbb-622b7d9b174a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.226 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.246 186548 INFO nova.virt.libvirt.driver [-] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Instance running successfully.
Nov 22 08:32:51 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.249 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.249 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800371.2010388, 1b601a4d-995a-4ca0-8fbb-622b7d9b174a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.249 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] VM Resumed (Lifecycle Event)
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.251 186548 DEBUG nova.virt.libvirt.guest [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.251 186548 DEBUG nova.compute.manager [None req-1e2e667f-2979-4398-96bc-ea6cca288a90 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.275 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.277 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.297 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 22 08:32:51 compute-0 podman[247474]: 2025-11-22 08:32:51.287929383 +0000 UTC m=+0.024193556 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:32:51 compute-0 podman[247474]: 2025-11-22 08:32:51.417894552 +0000 UTC m=+0.154158705 container create ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.434 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:51 compute-0 systemd[1]: Started libpod-conmon-ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba.scope.
Nov 22 08:32:51 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:32:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ecd52d24e40b6734604f45c15094444233e0028a8b4d829ecdcd7551e29e813/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:32:51 compute-0 podman[247474]: 2025-11-22 08:32:51.588732846 +0000 UTC m=+0.324997039 container init ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 08:32:51 compute-0 nova_compute[186544]: 2025-11-22 08:32:51.590 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:51 compute-0 podman[247474]: 2025-11-22 08:32:51.593955125 +0000 UTC m=+0.330219278 container start ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 08:32:51 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247490]: [NOTICE]   (247494) : New worker (247496) forked
Nov 22 08:32:51 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247490]: [NOTICE]   (247494) : Loading success.
Nov 22 08:32:53 compute-0 nova_compute[186544]: 2025-11-22 08:32:53.172 186548 DEBUG nova.compute.manager [req-7251c894-befa-4cf1-910d-4c6a232d1452 req-2cfb8d76-5433-4f0f-a619-5eb2631080be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:53 compute-0 nova_compute[186544]: 2025-11-22 08:32:53.172 186548 DEBUG oslo_concurrency.lockutils [req-7251c894-befa-4cf1-910d-4c6a232d1452 req-2cfb8d76-5433-4f0f-a619-5eb2631080be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:53 compute-0 nova_compute[186544]: 2025-11-22 08:32:53.172 186548 DEBUG oslo_concurrency.lockutils [req-7251c894-befa-4cf1-910d-4c6a232d1452 req-2cfb8d76-5433-4f0f-a619-5eb2631080be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:53 compute-0 nova_compute[186544]: 2025-11-22 08:32:53.173 186548 DEBUG oslo_concurrency.lockutils [req-7251c894-befa-4cf1-910d-4c6a232d1452 req-2cfb8d76-5433-4f0f-a619-5eb2631080be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:53 compute-0 nova_compute[186544]: 2025-11-22 08:32:53.173 186548 DEBUG nova.compute.manager [req-7251c894-befa-4cf1-910d-4c6a232d1452 req-2cfb8d76-5433-4f0f-a619-5eb2631080be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] No waiting events found dispatching network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:32:53 compute-0 nova_compute[186544]: 2025-11-22 08:32:53.173 186548 WARNING nova.compute.manager [req-7251c894-befa-4cf1-910d-4c6a232d1452 req-2cfb8d76-5433-4f0f-a619-5eb2631080be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received unexpected event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d for instance with vm_state active and task_state None.
Nov 22 08:32:55 compute-0 nova_compute[186544]: 2025-11-22 08:32:55.085 186548 INFO nova.compute.manager [None req-2f059b67-6753-43c5-a15f-eb057cb36a53 d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Get console output
Nov 22 08:32:55 compute-0 nova_compute[186544]: 2025-11-22 08:32:55.093 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.031 186548 DEBUG nova.compute.manager [req-5560b66e-f078-47b1-bf3b-b973a1b3ed75 req-303f27fc-8fad-45b9-a4f3-a20f73484a4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-changed-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.032 186548 DEBUG nova.compute.manager [req-5560b66e-f078-47b1-bf3b-b973a1b3ed75 req-303f27fc-8fad-45b9-a4f3-a20f73484a4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Refreshing instance network info cache due to event network-changed-6e6336f7-e62a-4e83-9e9a-e28f87900f3d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.032 186548 DEBUG oslo_concurrency.lockutils [req-5560b66e-f078-47b1-bf3b-b973a1b3ed75 req-303f27fc-8fad-45b9-a4f3-a20f73484a4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.032 186548 DEBUG oslo_concurrency.lockutils [req-5560b66e-f078-47b1-bf3b-b973a1b3ed75 req-303f27fc-8fad-45b9-a4f3-a20f73484a4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.032 186548 DEBUG nova.network.neutron [req-5560b66e-f078-47b1-bf3b-b973a1b3ed75 req-303f27fc-8fad-45b9-a4f3-a20f73484a4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Refreshing network info cache for port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.095 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.096 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.096 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.096 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.097 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.105 186548 INFO nova.compute.manager [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Terminating instance
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.112 186548 DEBUG nova.compute.manager [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:32:56 compute-0 kernel: tap6e6336f7-e6 (unregistering): left promiscuous mode
Nov 22 08:32:56 compute-0 NetworkManager[55036]: <info>  [1763800376.1333] device (tap6e6336f7-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 ovn_controller[94843]: 2025-11-22T08:32:56Z|00788|binding|INFO|Releasing lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d from this chassis (sb_readonly=0)
Nov 22 08:32:56 compute-0 ovn_controller[94843]: 2025-11-22T08:32:56Z|00789|binding|INFO|Setting lport 6e6336f7-e62a-4e83-9e9a-e28f87900f3d down in Southbound
Nov 22 08:32:56 compute-0 ovn_controller[94843]: 2025-11-22T08:32:56Z|00790|binding|INFO|Removing iface tap6e6336f7-e6 ovn-installed in OVS
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.145 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.158 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:3c:48 10.100.0.3'], port_security=['fa:16:3e:8f:3c:48 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1b601a4d-995a-4ca0-8fbb-622b7d9b174a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '042f6d127720471aaedb8a1fb7535416', 'neutron:revision_number': '6', 'neutron:security_group_ids': '91619875-7dbc-4d4a-a182-904b86081a80', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fe0d12b-4c0f-4e1e-8acf-f7614351545c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=6e6336f7-e62a-4e83-9e9a-e28f87900f3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.159 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d in datapath ced73a45-5f14-41bc-abc6-63a41c14d9ff unbound from our chassis
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.160 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ced73a45-5f14-41bc-abc6-63a41c14d9ff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.161 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[55e9f380-ff94-44a3-a0a7-0e6e40a42e51]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.161 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff namespace which is not needed anymore
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000a4.scope: Deactivated successfully.
Nov 22 08:32:56 compute-0 systemd-machined[152872]: Machine qemu-89-instance-000000a4 terminated.
Nov 22 08:32:56 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247490]: [NOTICE]   (247494) : haproxy version is 2.8.14-c23fe91
Nov 22 08:32:56 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247490]: [NOTICE]   (247494) : path to executable is /usr/sbin/haproxy
Nov 22 08:32:56 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247490]: [WARNING]  (247494) : Exiting Master process...
Nov 22 08:32:56 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247490]: [ALERT]    (247494) : Current worker (247496) exited with code 143 (Terminated)
Nov 22 08:32:56 compute-0 neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff[247490]: [WARNING]  (247494) : All workers exited. Exiting... (0)
Nov 22 08:32:56 compute-0 systemd[1]: libpod-ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba.scope: Deactivated successfully.
Nov 22 08:32:56 compute-0 podman[247531]: 2025-11-22 08:32:56.300292289 +0000 UTC m=+0.047782046 container died ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:32:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba-userdata-shm.mount: Deactivated successfully.
Nov 22 08:32:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-7ecd52d24e40b6734604f45c15094444233e0028a8b4d829ecdcd7551e29e813-merged.mount: Deactivated successfully.
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.335 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 podman[247531]: 2025-11-22 08:32:56.344630901 +0000 UTC m=+0.092120648 container cleanup ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:32:56 compute-0 systemd[1]: libpod-conmon-ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba.scope: Deactivated successfully.
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.371 186548 DEBUG nova.compute.manager [req-08d44a3f-e0de-404b-afd2-a368f88ef5d0 req-c907ac6e-b6b2-42a6-bb43-91c2d7b58bef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-unplugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.373 186548 DEBUG oslo_concurrency.lockutils [req-08d44a3f-e0de-404b-afd2-a368f88ef5d0 req-c907ac6e-b6b2-42a6-bb43-91c2d7b58bef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.373 186548 DEBUG oslo_concurrency.lockutils [req-08d44a3f-e0de-404b-afd2-a368f88ef5d0 req-c907ac6e-b6b2-42a6-bb43-91c2d7b58bef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.373 186548 DEBUG oslo_concurrency.lockutils [req-08d44a3f-e0de-404b-afd2-a368f88ef5d0 req-c907ac6e-b6b2-42a6-bb43-91c2d7b58bef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.374 186548 DEBUG nova.compute.manager [req-08d44a3f-e0de-404b-afd2-a368f88ef5d0 req-c907ac6e-b6b2-42a6-bb43-91c2d7b58bef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] No waiting events found dispatching network-vif-unplugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.374 186548 DEBUG nova.compute.manager [req-08d44a3f-e0de-404b-afd2-a368f88ef5d0 req-c907ac6e-b6b2-42a6-bb43-91c2d7b58bef 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-unplugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.391 186548 INFO nova.virt.libvirt.driver [-] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Instance destroyed successfully.
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.392 186548 DEBUG nova.objects.instance [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lazy-loading 'resources' on Instance uuid 1b601a4d-995a-4ca0-8fbb-622b7d9b174a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:32:56 compute-0 podman[247570]: 2025-11-22 08:32:56.414572382 +0000 UTC m=+0.045697266 container remove ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.415 186548 DEBUG nova.virt.libvirt.vif [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:32:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1642746423',display_name='tempest-TestNetworkAdvancedServerOps-server-1642746423',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1642746423',id=164,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyx3yYTiDGnUmJAh28YsGstH3NrOqX9/QLThE+zKYWB8ANza2qAtrRyZ9J/a5Zb1hHRnNBLhqoKqPZqB0xCn1Lq/9A30rHqEDT5Xmrt1JkqaQbjCW4fioS0E+pv7EJGMg==',key_name='tempest-TestNetworkAdvancedServerOps-1187774737',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:32:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='042f6d127720471aaedb8a1fb7535416',ramdisk_id='',reservation_id='r-q3510gar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1221065053',owner_user_name='tempest-TestNetworkAdvancedServerOps-1221065053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:32:51Z,user_data=None,user_id='d8853d84c1e84f6baaf01635ef1d0f7c',uuid=1b601a4d-995a-4ca0-8fbb-622b7d9b174a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.415 186548 DEBUG nova.network.os_vif_util [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converting VIF {"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.416 186548 DEBUG nova.network.os_vif_util [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.416 186548 DEBUG os_vif [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.418 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.418 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6e6336f7-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.419 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e58f8210-7fb4-439b-a96f-9b0102bc1aa4]: (4, ('Sat Nov 22 08:32:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff (ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba)\nebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba\nSat Nov 22 08:32:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff (ebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba)\nebc7952a7bdb503479b14996bd6d1c906dd39f8515ea0f259fff287591e197ba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.419 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.420 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[944ceac7-5ae7-4fae-9491-f3518b9408ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.421 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapced73a45-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.422 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.423 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 kernel: tapced73a45-50: left promiscuous mode
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.425 186548 INFO os_vif [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:3c:48,bridge_name='br-int',has_traffic_filtering=True,id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d,network=Network(ced73a45-5f14-41bc-abc6-63a41c14d9ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6e6336f7-e6')
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.426 186548 INFO nova.virt.libvirt.driver [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Deleting instance files /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a_del
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.427 186548 INFO nova.virt.libvirt.driver [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Deletion of /var/lib/nova/instances/1b601a4d-995a-4ca0-8fbb-622b7d9b174a_del complete
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.435 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.439 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c8d5613-b466-4f6d-ac97-e2b6f028df10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.453 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[22098098-179b-4994-b215-d3f11c24d5ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.454 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fc8f1256-a2ea-48fd-98c5-f99062b8ca2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.469 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[47241d0a-4dcb-4a76-b6ab-0563a3631df8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 711128, 'reachable_time': 20364, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247591, 'error': None, 'target': 'ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.472 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ced73a45-5f14-41bc-abc6-63a41c14d9ff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:32:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:32:56.473 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[0653d272-1af6-4589-84a5-ef929778c529]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:32:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dced73a45\x2d5f14\x2d41bc\x2dabc6\x2d63a41c14d9ff.mount: Deactivated successfully.
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.514 186548 INFO nova.compute.manager [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.514 186548 DEBUG oslo.service.loopingcall [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.515 186548 DEBUG nova.compute.manager [-] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:32:56 compute-0 nova_compute[186544]: 2025-11-22 08:32:56.515 186548 DEBUG nova.network.neutron [-] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:32:57 compute-0 nova_compute[186544]: 2025-11-22 08:32:57.481 186548 DEBUG nova.network.neutron [-] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:32:57 compute-0 nova_compute[186544]: 2025-11-22 08:32:57.530 186548 INFO nova.compute.manager [-] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Took 1.01 seconds to deallocate network for instance.
Nov 22 08:32:57 compute-0 nova_compute[186544]: 2025-11-22 08:32:57.838 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:57 compute-0 nova_compute[186544]: 2025-11-22 08:32:57.839 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:57 compute-0 nova_compute[186544]: 2025-11-22 08:32:57.921 186548 DEBUG nova.compute.provider_tree [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:32:57 compute-0 nova_compute[186544]: 2025-11-22 08:32:57.931 186548 DEBUG nova.scheduler.client.report [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:32:57 compute-0 nova_compute[186544]: 2025-11-22 08:32:57.974 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.013 186548 DEBUG nova.network.neutron [req-5560b66e-f078-47b1-bf3b-b973a1b3ed75 req-303f27fc-8fad-45b9-a4f3-a20f73484a4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updated VIF entry in instance network info cache for port 6e6336f7-e62a-4e83-9e9a-e28f87900f3d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.014 186548 DEBUG nova.network.neutron [req-5560b66e-f078-47b1-bf3b-b973a1b3ed75 req-303f27fc-8fad-45b9-a4f3-a20f73484a4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Updating instance_info_cache with network_info: [{"id": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "address": "fa:16:3e:8f:3c:48", "network": {"id": "ced73a45-5f14-41bc-abc6-63a41c14d9ff", "bridge": "br-int", "label": "tempest-network-smoke--1663104876", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "042f6d127720471aaedb8a1fb7535416", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6e6336f7-e6", "ovs_interfaceid": "6e6336f7-e62a-4e83-9e9a-e28f87900f3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.036 186548 INFO nova.scheduler.client.report [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Deleted allocations for instance 1b601a4d-995a-4ca0-8fbb-622b7d9b174a
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.077 186548 DEBUG oslo_concurrency.lockutils [req-5560b66e-f078-47b1-bf3b-b973a1b3ed75 req-303f27fc-8fad-45b9-a4f3-a20f73484a4c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1b601a4d-995a-4ca0-8fbb-622b7d9b174a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.121 186548 DEBUG oslo_concurrency.lockutils [None req-a42c7270-f159-48fb-8f99-90021b34fafc d8853d84c1e84f6baaf01635ef1d0f7c 042f6d127720471aaedb8a1fb7535416 - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.450 186548 DEBUG nova.compute.manager [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.451 186548 DEBUG oslo_concurrency.lockutils [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.451 186548 DEBUG oslo_concurrency.lockutils [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.452 186548 DEBUG oslo_concurrency.lockutils [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1b601a4d-995a-4ca0-8fbb-622b7d9b174a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.452 186548 DEBUG nova.compute.manager [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] No waiting events found dispatching network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.452 186548 WARNING nova.compute.manager [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received unexpected event network-vif-plugged-6e6336f7-e62a-4e83-9e9a-e28f87900f3d for instance with vm_state deleted and task_state None.
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.452 186548 DEBUG nova.compute.manager [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Received event network-vif-deleted-6e6336f7-e62a-4e83-9e9a-e28f87900f3d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.453 186548 INFO nova.compute.manager [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Neutron deleted interface 6e6336f7-e62a-4e83-9e9a-e28f87900f3d; detaching it from the instance and deleting it from the info cache
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.453 186548 DEBUG nova.network.neutron [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 22 08:32:58 compute-0 nova_compute[186544]: 2025-11-22 08:32:58.455 186548 DEBUG nova.compute.manager [req-93d0fbb7-3ef6-482c-aa81-6b767006eecb req-d926cdc7-4591-4b12-b753-e08eb060fe35 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Detach interface failed, port_id=6e6336f7-e62a-4e83-9e9a-e28f87900f3d, reason: Instance 1b601a4d-995a-4ca0-8fbb-622b7d9b174a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 08:33:01 compute-0 podman[247592]: 2025-11-22 08:33:01.42118305 +0000 UTC m=+0.066207270 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:33:01 compute-0 nova_compute[186544]: 2025-11-22 08:33:01.420 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:01 compute-0 nova_compute[186544]: 2025-11-22 08:33:01.439 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:01 compute-0 nova_compute[186544]: 2025-11-22 08:33:01.563 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:01 compute-0 nova_compute[186544]: 2025-11-22 08:33:01.647 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:03 compute-0 podman[247613]: 2025-11-22 08:33:03.403166187 +0000 UTC m=+0.054537833 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:33:03 compute-0 podman[247614]: 2025-11-22 08:33:03.403555748 +0000 UTC m=+0.053626862 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Nov 22 08:33:06 compute-0 nova_compute[186544]: 2025-11-22 08:33:06.423 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:06 compute-0 nova_compute[186544]: 2025-11-22 08:33:06.441 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:11 compute-0 nova_compute[186544]: 2025-11-22 08:33:11.390 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800376.3890584, 1b601a4d-995a-4ca0-8fbb-622b7d9b174a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:33:11 compute-0 nova_compute[186544]: 2025-11-22 08:33:11.391 186548 INFO nova.compute.manager [-] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] VM Stopped (Lifecycle Event)
Nov 22 08:33:11 compute-0 nova_compute[186544]: 2025-11-22 08:33:11.419 186548 DEBUG nova.compute.manager [None req-6e67d8c7-5c94-4e5a-8ae2-3cde24967072 - - - - - -] [instance: 1b601a4d-995a-4ca0-8fbb-622b7d9b174a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:33:11 compute-0 nova_compute[186544]: 2025-11-22 08:33:11.426 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:11 compute-0 nova_compute[186544]: 2025-11-22 08:33:11.442 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.187 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.335 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.337 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5648MB free_disk=73.12992095947266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.337 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.337 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.409 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.411 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.454 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.472 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.499 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:33:14 compute-0 nova_compute[186544]: 2025-11-22 08:33:14.500 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:33:16 compute-0 nova_compute[186544]: 2025-11-22 08:33:16.427 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:16 compute-0 nova_compute[186544]: 2025-11-22 08:33:16.444 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:16 compute-0 nova_compute[186544]: 2025-11-22 08:33:16.500 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:16 compute-0 nova_compute[186544]: 2025-11-22 08:33:16.500 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:33:19 compute-0 nova_compute[186544]: 2025-11-22 08:33:19.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:19 compute-0 nova_compute[186544]: 2025-11-22 08:33:19.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:33:19 compute-0 nova_compute[186544]: 2025-11-22 08:33:19.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:33:19 compute-0 nova_compute[186544]: 2025-11-22 08:33:19.188 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:33:19 compute-0 nova_compute[186544]: 2025-11-22 08:33:19.189 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:21 compute-0 podman[247658]: 2025-11-22 08:33:21.422052948 +0000 UTC m=+0.065301849 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 22 08:33:21 compute-0 podman[247660]: 2025-11-22 08:33:21.422178051 +0000 UTC m=+0.056181214 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:33:21 compute-0 nova_compute[186544]: 2025-11-22 08:33:21.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:21 compute-0 podman[247659]: 2025-11-22 08:33:21.442032909 +0000 UTC m=+0.080466311 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:33:21 compute-0 nova_compute[186544]: 2025-11-22 08:33:21.446 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:21 compute-0 podman[247666]: 2025-11-22 08:33:21.476241111 +0000 UTC m=+0.107806444 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:33:23 compute-0 nova_compute[186544]: 2025-11-22 08:33:23.184 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:25 compute-0 nova_compute[186544]: 2025-11-22 08:33:25.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:26 compute-0 nova_compute[186544]: 2025-11-22 08:33:26.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:26 compute-0 nova_compute[186544]: 2025-11-22 08:33:26.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:26 compute-0 nova_compute[186544]: 2025-11-22 08:33:26.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:26 compute-0 nova_compute[186544]: 2025-11-22 08:33:26.447 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:28 compute-0 nova_compute[186544]: 2025-11-22 08:33:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:31 compute-0 nova_compute[186544]: 2025-11-22 08:33:31.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:33:31 compute-0 nova_compute[186544]: 2025-11-22 08:33:31.432 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:31 compute-0 nova_compute[186544]: 2025-11-22 08:33:31.449 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:32 compute-0 podman[247745]: 2025-11-22 08:33:32.42005144 +0000 UTC m=+0.062943380 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:33:34 compute-0 podman[247766]: 2025-11-22 08:33:34.435488681 +0000 UTC m=+0.075576361 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6)
Nov 22 08:33:34 compute-0 podman[247765]: 2025-11-22 08:33:34.439224383 +0000 UTC m=+0.083067466 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:33:36 compute-0 nova_compute[186544]: 2025-11-22 08:33:36.435 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:36 compute-0 nova_compute[186544]: 2025-11-22 08:33:36.450 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:33:37.363 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:33:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:33:37.364 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:33:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:33:37.364 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:33:41 compute-0 nova_compute[186544]: 2025-11-22 08:33:41.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:41 compute-0 nova_compute[186544]: 2025-11-22 08:33:41.452 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:45 compute-0 ovn_controller[94843]: 2025-11-22T08:33:45Z|00791|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 22 08:33:46 compute-0 nova_compute[186544]: 2025-11-22 08:33:46.439 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:46 compute-0 nova_compute[186544]: 2025-11-22 08:33:46.454 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:51 compute-0 nova_compute[186544]: 2025-11-22 08:33:51.441 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:51 compute-0 nova_compute[186544]: 2025-11-22 08:33:51.455 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:52 compute-0 podman[247809]: 2025-11-22 08:33:52.412111479 +0000 UTC m=+0.060400618 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:33:52 compute-0 podman[247808]: 2025-11-22 08:33:52.412229072 +0000 UTC m=+0.065791911 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:33:52 compute-0 podman[247810]: 2025-11-22 08:33:52.435158196 +0000 UTC m=+0.079568209 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:33:52 compute-0 podman[247816]: 2025-11-22 08:33:52.44910188 +0000 UTC m=+0.090792426 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:33:56 compute-0 nova_compute[186544]: 2025-11-22 08:33:56.442 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:56 compute-0 nova_compute[186544]: 2025-11-22 08:33:56.456 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:33:59.987 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:33:59 compute-0 nova_compute[186544]: 2025-11-22 08:33:59.987 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:33:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:33:59.988 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:34:01 compute-0 nova_compute[186544]: 2025-11-22 08:34:01.444 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:01 compute-0 nova_compute[186544]: 2025-11-22 08:34:01.458 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:01.989 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:03 compute-0 podman[247895]: 2025-11-22 08:34:03.405375053 +0000 UTC m=+0.057949578 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 08:34:05 compute-0 podman[247915]: 2025-11-22 08:34:05.408076821 +0000 UTC m=+0.057441244 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:34:05 compute-0 podman[247916]: 2025-11-22 08:34:05.420919647 +0000 UTC m=+0.066228640 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Nov 22 08:34:06 compute-0 nova_compute[186544]: 2025-11-22 08:34:06.445 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:06 compute-0 nova_compute[186544]: 2025-11-22 08:34:06.460 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.376 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.377 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.391 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.474 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.475 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.482 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.483 186548 INFO nova.compute.claims [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.592 186548 DEBUG nova.compute.provider_tree [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.603 186548 DEBUG nova.scheduler.client.report [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.621 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.622 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.669 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.670 186548 DEBUG nova.network.neutron [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.685 186548 INFO nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.703 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.791 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.792 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.793 186548 INFO nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Creating image(s)
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.793 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.794 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.794 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.810 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.868 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.869 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.870 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.880 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.919 186548 DEBUG nova.policy [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.936 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.937 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.977 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.978 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:10 compute-0 nova_compute[186544]: 2025-11-22 08:34:10.979 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.038 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.040 186548 DEBUG nova.virt.disk.api [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.040 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.093 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.094 186548 DEBUG nova.virt.disk.api [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.094 186548 DEBUG nova.objects.instance [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.277 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.278 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Ensure instance console log exists: /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.278 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.279 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.279 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.447 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:11 compute-0 nova_compute[186544]: 2025-11-22 08:34:11.461 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:12 compute-0 nova_compute[186544]: 2025-11-22 08:34:12.039 186548 DEBUG nova.network.neutron [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Successfully created port: a2290405-be1c-41fd-8df6-e9f6381864b5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:34:12 compute-0 nova_compute[186544]: 2025-11-22 08:34:12.985 186548 DEBUG nova.network.neutron [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Successfully updated port: a2290405-be1c-41fd-8df6-e9f6381864b5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.004 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.005 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.005 186548 DEBUG nova.network.neutron [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.065 186548 DEBUG nova.compute.manager [req-91d1fe6f-2519-4de2-bc3e-7addc2191a01 req-5a54cc92-6057-4efa-8c63-825f555c814b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received event network-changed-a2290405-be1c-41fd-8df6-e9f6381864b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.066 186548 DEBUG nova.compute.manager [req-91d1fe6f-2519-4de2-bc3e-7addc2191a01 req-5a54cc92-6057-4efa-8c63-825f555c814b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Refreshing instance network info cache due to event network-changed-a2290405-be1c-41fd-8df6-e9f6381864b5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.066 186548 DEBUG oslo_concurrency.lockutils [req-91d1fe6f-2519-4de2-bc3e-7addc2191a01 req-5a54cc92-6057-4efa-8c63-825f555c814b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.123 186548 DEBUG nova.network.neutron [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.821 186548 DEBUG nova.network.neutron [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Updating instance_info_cache with network_info: [{"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.840 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.841 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Instance network_info: |[{"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.841 186548 DEBUG oslo_concurrency.lockutils [req-91d1fe6f-2519-4de2-bc3e-7addc2191a01 req-5a54cc92-6057-4efa-8c63-825f555c814b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.841 186548 DEBUG nova.network.neutron [req-91d1fe6f-2519-4de2-bc3e-7addc2191a01 req-5a54cc92-6057-4efa-8c63-825f555c814b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Refreshing network info cache for port a2290405-be1c-41fd-8df6-e9f6381864b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.845 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Start _get_guest_xml network_info=[{"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.850 186548 WARNING nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.855 186548 DEBUG nova.virt.libvirt.host [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.855 186548 DEBUG nova.virt.libvirt.host [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.861 186548 DEBUG nova.virt.libvirt.host [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.862 186548 DEBUG nova.virt.libvirt.host [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.863 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.863 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.863 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.864 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.864 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.864 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.864 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.864 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.865 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.865 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.865 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.865 186548 DEBUG nova.virt.hardware [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.869 186548 DEBUG nova.virt.libvirt.vif [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-873112655',display_name='tempest-TestNetworkBasicOps-server-873112655',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-873112655',id=166,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP4xi6+j3qP8kpYDCRdJyeh1o5i9dQ6AbSZPHMqOAregRctXcy99guzsSmOcQrpCsRWmsxA2migBdzpbRcg/oHK6z/Bb9nK+Sw+a+wO3o0QK8RJKGetnwdOJRwMx9J1z0w==',key_name='tempest-TestNetworkBasicOps-1280477233',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-y0tu3s3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:34:10Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=1e8a6cfb-1b42-472e-99dd-5134bcebbe43,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.869 186548 DEBUG nova.network.os_vif_util [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.870 186548 DEBUG nova.network.os_vif_util [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:c7:2d,bridge_name='br-int',has_traffic_filtering=True,id=a2290405-be1c-41fd-8df6-e9f6381864b5,network=Network(123af741-0f16-4f2c-9391-bc20715b113c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2290405-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.871 186548 DEBUG nova.objects.instance [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.881 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <uuid>1e8a6cfb-1b42-472e-99dd-5134bcebbe43</uuid>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <name>instance-000000a6</name>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkBasicOps-server-873112655</nova:name>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:34:13</nova:creationTime>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:34:13 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:34:13 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:34:13 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:34:13 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:34:13 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:34:13 compute-0 nova_compute[186544]:         <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:34:13 compute-0 nova_compute[186544]:         <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:34:13 compute-0 nova_compute[186544]:         <nova:port uuid="a2290405-be1c-41fd-8df6-e9f6381864b5">
Nov 22 08:34:13 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <system>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <entry name="serial">1e8a6cfb-1b42-472e-99dd-5134bcebbe43</entry>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <entry name="uuid">1e8a6cfb-1b42-472e-99dd-5134bcebbe43</entry>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </system>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <os>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   </os>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <features>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   </features>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.config"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:1f:c7:2d"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <target dev="tapa2290405-be"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/console.log" append="off"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <video>
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </video>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:34:13 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:34:13 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:34:13 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:34:13 compute-0 nova_compute[186544]: </domain>
Nov 22 08:34:13 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.882 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Preparing to wait for external event network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.883 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.883 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.883 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.884 186548 DEBUG nova.virt.libvirt.vif [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-873112655',display_name='tempest-TestNetworkBasicOps-server-873112655',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-873112655',id=166,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP4xi6+j3qP8kpYDCRdJyeh1o5i9dQ6AbSZPHMqOAregRctXcy99guzsSmOcQrpCsRWmsxA2migBdzpbRcg/oHK6z/Bb9nK+Sw+a+wO3o0QK8RJKGetnwdOJRwMx9J1z0w==',key_name='tempest-TestNetworkBasicOps-1280477233',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-y0tu3s3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:34:10Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=1e8a6cfb-1b42-472e-99dd-5134bcebbe43,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.884 186548 DEBUG nova.network.os_vif_util [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.885 186548 DEBUG nova.network.os_vif_util [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:c7:2d,bridge_name='br-int',has_traffic_filtering=True,id=a2290405-be1c-41fd-8df6-e9f6381864b5,network=Network(123af741-0f16-4f2c-9391-bc20715b113c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2290405-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.885 186548 DEBUG os_vif [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:c7:2d,bridge_name='br-int',has_traffic_filtering=True,id=a2290405-be1c-41fd-8df6-e9f6381864b5,network=Network(123af741-0f16-4f2c-9391-bc20715b113c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2290405-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.886 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.886 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.886 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.889 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.889 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2290405-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.890 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2290405-be, col_values=(('external_ids', {'iface-id': 'a2290405-be1c-41fd-8df6-e9f6381864b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:c7:2d', 'vm-uuid': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.891 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:13 compute-0 NetworkManager[55036]: <info>  [1763800453.8925] manager: (tapa2290405-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.894 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.896 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.897 186548 INFO os_vif [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:c7:2d,bridge_name='br-int',has_traffic_filtering=True,id=a2290405-be1c-41fd-8df6-e9f6381864b5,network=Network(123af741-0f16-4f2c-9391-bc20715b113c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2290405-be')
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.934 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.935 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.935 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:1f:c7:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:34:13 compute-0 nova_compute[186544]: 2025-11-22 08:34:13.936 186548 INFO nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Using config drive
Nov 22 08:34:15 compute-0 nova_compute[186544]: 2025-11-22 08:34:15.468 186548 INFO nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Creating config drive at /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.config
Nov 22 08:34:15 compute-0 nova_compute[186544]: 2025-11-22 08:34:15.474 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzki5dk3g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:34:15 compute-0 nova_compute[186544]: 2025-11-22 08:34:15.601 186548 DEBUG oslo_concurrency.processutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzki5dk3g" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:34:15 compute-0 kernel: tapa2290405-be: entered promiscuous mode
Nov 22 08:34:15 compute-0 NetworkManager[55036]: <info>  [1763800455.6671] manager: (tapa2290405-be): new Tun device (/org/freedesktop/NetworkManager/Devices/367)
Nov 22 08:34:15 compute-0 nova_compute[186544]: 2025-11-22 08:34:15.667 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:15 compute-0 ovn_controller[94843]: 2025-11-22T08:34:15Z|00792|binding|INFO|Claiming lport a2290405-be1c-41fd-8df6-e9f6381864b5 for this chassis.
Nov 22 08:34:15 compute-0 ovn_controller[94843]: 2025-11-22T08:34:15Z|00793|binding|INFO|a2290405-be1c-41fd-8df6-e9f6381864b5: Claiming fa:16:3e:1f:c7:2d 10.100.0.19
Nov 22 08:34:15 compute-0 nova_compute[186544]: 2025-11-22 08:34:15.670 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.684 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:c7:2d 10.100.0.19'], port_security=['fa:16:3e:1f:c7:2d 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-123af741-0f16-4f2c-9391-bc20715b113c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '742aa02a-10aa-4ee5-b074-11d3303b7679', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=addef574-b6f3-4707-b107-88339448d51a, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a2290405-be1c-41fd-8df6-e9f6381864b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.685 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a2290405-be1c-41fd-8df6-e9f6381864b5 in datapath 123af741-0f16-4f2c-9391-bc20715b113c bound to our chassis
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.686 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 123af741-0f16-4f2c-9391-bc20715b113c
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.697 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb55f7c-f0a1-4c01-b9be-a00eea740422]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.698 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap123af741-01 in ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:34:15 compute-0 systemd-udevd[247991]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:34:15 compute-0 nova_compute[186544]: 2025-11-22 08:34:15.701 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.701 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap123af741-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.701 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6212d4e9-1977-4e62-a72e-ff43d3bab40e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.702 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[575b0ee7-2349-464d-84b0-4565c1503ad3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_controller[94843]: 2025-11-22T08:34:15Z|00794|binding|INFO|Setting lport a2290405-be1c-41fd-8df6-e9f6381864b5 ovn-installed in OVS
Nov 22 08:34:15 compute-0 ovn_controller[94843]: 2025-11-22T08:34:15Z|00795|binding|INFO|Setting lport a2290405-be1c-41fd-8df6-e9f6381864b5 up in Southbound
Nov 22 08:34:15 compute-0 nova_compute[186544]: 2025-11-22 08:34:15.707 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:15 compute-0 NetworkManager[55036]: <info>  [1763800455.7163] device (tapa2290405-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:34:15 compute-0 NetworkManager[55036]: <info>  [1763800455.7185] device (tapa2290405-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.717 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[4311ef62-d475-4f4f-b16a-8989c2bca431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 systemd-machined[152872]: New machine qemu-90-instance-000000a6.
Nov 22 08:34:15 compute-0 systemd[1]: Started Virtual Machine qemu-90-instance-000000a6.
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.734 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9544b8-3207-45c0-9198-f0f8777958eb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.760 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4a74f5f4-3a90-4096-9987-d49f51daef24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.765 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[01dae565-5b8a-451c-b03b-92eb9841957b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 NetworkManager[55036]: <info>  [1763800455.7666] manager: (tap123af741-00): new Veth device (/org/freedesktop/NetworkManager/Devices/368)
Nov 22 08:34:15 compute-0 systemd-udevd[247997]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.800 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8da624b4-648a-4cc8-adbf-757b0bfc35ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.805 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[90c6fb0c-c955-4e0b-aa01-c44093f87256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 NetworkManager[55036]: <info>  [1763800455.8338] device (tap123af741-00): carrier: link connected
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.841 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8826344d-ca14-4a6b-85fb-54eeaf90f12d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.860 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[66f0ec4d-f14f-4029-ac99-cacd3a8ad0b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap123af741-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:fa:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 719647, 'reachable_time': 21000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248026, 'error': None, 'target': 'ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.877 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa99536-f327-4795-9def-3e0af05242ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe05:faf6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 719647, 'tstamp': 719647}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248027, 'error': None, 'target': 'ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.896 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa96bd6-5dcd-475e-b164-422ec9b85672]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap123af741-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:fa:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 719647, 'reachable_time': 21000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248028, 'error': None, 'target': 'ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.930 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[68fda414-491a-4da7-835c-a16242ff75c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.992 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c0b99c-7759-464c-b3da-09fd9444feef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.994 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap123af741-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.994 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:34:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:15.994 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap123af741-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:15 compute-0 nova_compute[186544]: 2025-11-22 08:34:15.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:15 compute-0 NetworkManager[55036]: <info>  [1763800455.9986] manager: (tap123af741-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Nov 22 08:34:15 compute-0 kernel: tap123af741-00: entered promiscuous mode
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.000 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:16.002 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap123af741-00, col_values=(('external_ids', {'iface-id': 'ccf6ac67-9519-42b8-880e-1d754ab3a18c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.003 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:16 compute-0 ovn_controller[94843]: 2025-11-22T08:34:16Z|00796|binding|INFO|Releasing lport ccf6ac67-9519-42b8-880e-1d754ab3a18c from this chassis (sb_readonly=0)
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.004 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:16.005 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/123af741-0f16-4f2c-9391-bc20715b113c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/123af741-0f16-4f2c-9391-bc20715b113c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:16.006 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[67ccfa17-0bd9-4262-8860-f8e4aabb3ead]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:16.006 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-123af741-0f16-4f2c-9391-bc20715b113c
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/123af741-0f16-4f2c-9391-bc20715b113c.pid.haproxy
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 123af741-0f16-4f2c-9391-bc20715b113c
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:34:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:16.007 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c', 'env', 'PROCESS_TAG=haproxy-123af741-0f16-4f2c-9391-bc20715b113c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/123af741-0f16-4f2c-9391-bc20715b113c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.017 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.027 186548 DEBUG nova.compute.manager [req-a4afdc34-a575-4000-80ac-960694d631ad req-ad4aca9c-bab6-4e76-9069-17860837d2bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received event network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.027 186548 DEBUG oslo_concurrency.lockutils [req-a4afdc34-a575-4000-80ac-960694d631ad req-ad4aca9c-bab6-4e76-9069-17860837d2bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.027 186548 DEBUG oslo_concurrency.lockutils [req-a4afdc34-a575-4000-80ac-960694d631ad req-ad4aca9c-bab6-4e76-9069-17860837d2bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.028 186548 DEBUG oslo_concurrency.lockutils [req-a4afdc34-a575-4000-80ac-960694d631ad req-ad4aca9c-bab6-4e76-9069-17860837d2bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.028 186548 DEBUG nova.compute.manager [req-a4afdc34-a575-4000-80ac-960694d631ad req-ad4aca9c-bab6-4e76-9069-17860837d2bb 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Processing event network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.272 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.272 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.272 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.272 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.349 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.368 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800456.3568056, 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.368 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] VM Started (Lifecycle Event)
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.373 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.376 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.379 186548 INFO nova.virt.libvirt.driver [-] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Instance spawned successfully.
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.380 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.393 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.399 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.403 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.404 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.404 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.405 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.405 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.406 186548 DEBUG nova.virt.libvirt.driver [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.417 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.417 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.435 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.436 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800456.35733, 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:34:16 compute-0 podman[248064]: 2025-11-22 08:34:16.340332953 +0000 UTC m=+0.024527775 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.436 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] VM Paused (Lifecycle Event)
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.461 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.463 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.467 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800456.3758948, 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.467 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] VM Resumed (Lifecycle Event)
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.479 186548 INFO nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Took 5.69 seconds to spawn the instance on the hypervisor.
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.480 186548 DEBUG nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.482 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.486 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.490 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.518 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.565 186548 INFO nova.compute.manager [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Took 6.12 seconds to build instance.
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.586 186548 DEBUG oslo_concurrency.lockutils [None req-3dd1a858-9e78-46e7-aca8-f65e518f10d6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.700 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:34:16 compute-0 podman[248064]: 2025-11-22 08:34:16.700834919 +0000 UTC m=+0.385029721 container create f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.701 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5591MB free_disk=73.12980270385742GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.701 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.702 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:16 compute-0 systemd[1]: Started libpod-conmon-f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f.scope.
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.767 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.768 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.768 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:34:16 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:34:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8b2e559a43af8c6bf67a897eab1b554afee59f3785890417061eafd67ff3fe8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:34:16 compute-0 podman[248064]: 2025-11-22 08:34:16.813532379 +0000 UTC m=+0.497727201 container init f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 22 08:34:16 compute-0 podman[248064]: 2025-11-22 08:34:16.820779518 +0000 UTC m=+0.504974320 container start f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.834 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:34:16 compute-0 neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c[248089]: [NOTICE]   (248093) : New worker (248095) forked
Nov 22 08:34:16 compute-0 neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c[248089]: [NOTICE]   (248093) : Loading success.
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.854 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.875 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:34:16 compute-0 nova_compute[186544]: 2025-11-22 08:34:16.875 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:17 compute-0 nova_compute[186544]: 2025-11-22 08:34:17.139 186548 DEBUG nova.network.neutron [req-91d1fe6f-2519-4de2-bc3e-7addc2191a01 req-5a54cc92-6057-4efa-8c63-825f555c814b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Updated VIF entry in instance network info cache for port a2290405-be1c-41fd-8df6-e9f6381864b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:34:17 compute-0 nova_compute[186544]: 2025-11-22 08:34:17.139 186548 DEBUG nova.network.neutron [req-91d1fe6f-2519-4de2-bc3e-7addc2191a01 req-5a54cc92-6057-4efa-8c63-825f555c814b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Updating instance_info_cache with network_info: [{"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:34:17 compute-0 nova_compute[186544]: 2025-11-22 08:34:17.166 186548 DEBUG oslo_concurrency.lockutils [req-91d1fe6f-2519-4de2-bc3e-7addc2191a01 req-5a54cc92-6057-4efa-8c63-825f555c814b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:34:18 compute-0 nova_compute[186544]: 2025-11-22 08:34:18.117 186548 DEBUG nova.compute.manager [req-e70c9850-333e-41db-811a-67aa18193ae4 req-bafd7a5c-97f8-4366-a74e-746eb603acc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received event network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:34:18 compute-0 nova_compute[186544]: 2025-11-22 08:34:18.118 186548 DEBUG oslo_concurrency.lockutils [req-e70c9850-333e-41db-811a-67aa18193ae4 req-bafd7a5c-97f8-4366-a74e-746eb603acc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:18 compute-0 nova_compute[186544]: 2025-11-22 08:34:18.119 186548 DEBUG oslo_concurrency.lockutils [req-e70c9850-333e-41db-811a-67aa18193ae4 req-bafd7a5c-97f8-4366-a74e-746eb603acc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:18 compute-0 nova_compute[186544]: 2025-11-22 08:34:18.119 186548 DEBUG oslo_concurrency.lockutils [req-e70c9850-333e-41db-811a-67aa18193ae4 req-bafd7a5c-97f8-4366-a74e-746eb603acc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:18 compute-0 nova_compute[186544]: 2025-11-22 08:34:18.120 186548 DEBUG nova.compute.manager [req-e70c9850-333e-41db-811a-67aa18193ae4 req-bafd7a5c-97f8-4366-a74e-746eb603acc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] No waiting events found dispatching network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:34:18 compute-0 nova_compute[186544]: 2025-11-22 08:34:18.120 186548 WARNING nova.compute.manager [req-e70c9850-333e-41db-811a-67aa18193ae4 req-bafd7a5c-97f8-4366-a74e-746eb603acc6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received unexpected event network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 for instance with vm_state active and task_state None.
Nov 22 08:34:18 compute-0 nova_compute[186544]: 2025-11-22 08:34:18.893 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:20 compute-0 nova_compute[186544]: 2025-11-22 08:34:20.875 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:20 compute-0 nova_compute[186544]: 2025-11-22 08:34:20.875 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:34:20 compute-0 nova_compute[186544]: 2025-11-22 08:34:20.876 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:34:21 compute-0 nova_compute[186544]: 2025-11-22 08:34:21.019 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:34:21 compute-0 nova_compute[186544]: 2025-11-22 08:34:21.019 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:34:21 compute-0 nova_compute[186544]: 2025-11-22 08:34:21.019 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:34:21 compute-0 nova_compute[186544]: 2025-11-22 08:34:21.019 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:34:21 compute-0 nova_compute[186544]: 2025-11-22 08:34:21.465 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:21 compute-0 nova_compute[186544]: 2025-11-22 08:34:21.741 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:21 compute-0 NetworkManager[55036]: <info>  [1763800461.7525] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/370)
Nov 22 08:34:21 compute-0 NetworkManager[55036]: <info>  [1763800461.7533] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Nov 22 08:34:21 compute-0 nova_compute[186544]: 2025-11-22 08:34:21.805 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:21 compute-0 ovn_controller[94843]: 2025-11-22T08:34:21Z|00797|binding|INFO|Releasing lport ccf6ac67-9519-42b8-880e-1d754ab3a18c from this chassis (sb_readonly=0)
Nov 22 08:34:21 compute-0 nova_compute[186544]: 2025-11-22 08:34:21.814 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:23 compute-0 podman[248107]: 2025-11-22 08:34:23.422971616 +0000 UTC m=+0.060718570 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:34:23 compute-0 podman[248106]: 2025-11-22 08:34:23.42476544 +0000 UTC m=+0.062999646 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:34:23 compute-0 podman[248105]: 2025-11-22 08:34:23.439518123 +0000 UTC m=+0.070674014 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:34:23 compute-0 podman[248108]: 2025-11-22 08:34:23.458619255 +0000 UTC m=+0.094181005 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 22 08:34:23 compute-0 nova_compute[186544]: 2025-11-22 08:34:23.896 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:25 compute-0 nova_compute[186544]: 2025-11-22 08:34:25.649 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Updating instance_info_cache with network_info: [{"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:34:25 compute-0 nova_compute[186544]: 2025-11-22 08:34:25.677 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-1e8a6cfb-1b42-472e-99dd-5134bcebbe43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:34:25 compute-0 nova_compute[186544]: 2025-11-22 08:34:25.677 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:34:25 compute-0 nova_compute[186544]: 2025-11-22 08:34:25.678 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:25 compute-0 nova_compute[186544]: 2025-11-22 08:34:25.678 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:26 compute-0 nova_compute[186544]: 2025-11-22 08:34:26.467 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:28 compute-0 nova_compute[186544]: 2025-11-22 08:34:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:28 compute-0 nova_compute[186544]: 2025-11-22 08:34:28.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:28 compute-0 nova_compute[186544]: 2025-11-22 08:34:28.897 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:29 compute-0 nova_compute[186544]: 2025-11-22 08:34:29.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:30 compute-0 ovn_controller[94843]: 2025-11-22T08:34:30Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:c7:2d 10.100.0.19
Nov 22 08:34:30 compute-0 ovn_controller[94843]: 2025-11-22T08:34:30Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:c7:2d 10.100.0.19
Nov 22 08:34:31 compute-0 nova_compute[186544]: 2025-11-22 08:34:31.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:31 compute-0 nova_compute[186544]: 2025-11-22 08:34:31.469 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:33 compute-0 nova_compute[186544]: 2025-11-22 08:34:33.900 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:34 compute-0 podman[248211]: 2025-11-22 08:34:34.40239593 +0000 UTC m=+0.048917338 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:34:36 compute-0 podman[248233]: 2025-11-22 08:34:36.400111024 +0000 UTC m=+0.050191830 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:34:36 compute-0 podman[248234]: 2025-11-22 08:34:36.417343228 +0000 UTC m=+0.062927373 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Nov 22 08:34:36 compute-0 nova_compute[186544]: 2025-11-22 08:34:36.471 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.604 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'name': 'tempest-TestNetworkBasicOps-server-873112655', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a6', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '12f63a6d87a947758ab928c0d625ff06', 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'hostId': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.605 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.631 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.read.bytes volume: 30386688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.632 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee529970-99b1-46db-9682-adff3a2f1bdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30386688, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.605457', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1425de0a-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '54172b90c330044be6fe81097258b59463ad1830e71234edbdb045584e1d9cfe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.605457', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1425eb02-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '2f35ff75e82fe44cd0936620dfd08143207e4fb6060f2434d29883c45cd5e02d'}]}, 'timestamp': '2025-11-22 08:34:36.632310', '_unique_id': 'bf3446ed0d1f4c7294ac696dce662dda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.633 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.634 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.634 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.634 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-873112655>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-873112655>]
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.634 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.637 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 / tapa2290405-be inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.637 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac6fd4d8-89c1-47d3-8a21-0b8fe951d0e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.635005', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '1426d378-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': 'cfb05875aa3c52dd3008c4ec9788b5d397a873d6d012cf7e2ac67d12b81d2779'}]}, 'timestamp': '2025-11-22 08:34:36.638331', '_unique_id': 'cd3daa8ee9554e7d9faaa57ef80e72a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.639 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.640 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.640 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '030fb55c-83ed-475f-8829-73657c484aa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.640298', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '142730a2-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': 'a1cb4f5f889be2316bdaefab1c50490112b6eea23844340859bf7fbb79cc51bb'}]}, 'timestamp': '2025-11-22 08:34:36.640653', '_unique_id': '9487245b460148839e136217a2c20bb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.641 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.642 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.642 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a80b69ef-d2fb-401b-864d-2f1cac0a6975', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.642376', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '14278192-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': '977ae8b882b72c625e84557aedffee8b4f1ea6931335688786627d4223bcce54'}]}, 'timestamp': '2025-11-22 08:34:36.642713', '_unique_id': '38400b5f829c450a9d4354b13707713e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.643 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.644 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.656 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.656 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c57bcdf-844c-448b-b64d-705d37389393', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.644254', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1429a0f8-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.336082732, 'message_signature': '0043a911e17048df949db64dab890068f034802622c9d763c70969df24183562'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.644254', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1429af08-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.336082732, 'message_signature': '3dfedf692bd5873d2ce1c335302c402878e160c3f037543a9a757616708cef85'}]}, 'timestamp': '2025-11-22 08:34:36.657227', '_unique_id': 'b6aed304a446469fbbe824ae91867596'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.658 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.659 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.672 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8c5f3b2-5a0a-415c-a83c-1fc12e5997c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'timestamp': '2025-11-22T08:34:36.659132', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '142c1c52-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.364128515, 'message_signature': '65c4ac3d659513bd1af287834f35b74a04df0f75a4cef9618f7b7fad49b86f8e'}]}, 'timestamp': '2025-11-22 08:34:36.672884', '_unique_id': '1c789240289b4e2684be2b2c539c828f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.674 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.674 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-873112655>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-873112655>]
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.674 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3815eeb-9113-4e70-824a-07c444edd5fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.674720', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '142c6f68-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': 'd9c0033d32e99c66f4c59500544d7d1ffbc90c93196eeccc298cbeb54c8e1f93'}]}, 'timestamp': '2025-11-22 08:34:36.674967', '_unique_id': '6272e6727f774be5a1612cb986ff8e46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-873112655>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-873112655>]
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.incoming.bytes volume: 2178 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ea9f824-98da-41d8-8697-3188edc06eba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2178, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.676321', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '142cad48-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': '9b0e36bece01d7c54c53d6b99a304898538cee12e7848862d5bffc52b546b610'}]}, 'timestamp': '2025-11-22 08:34:36.676546', '_unique_id': '15cf3b2324f74a8ea12309d0d1f7ab12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.676 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.677 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.outgoing.bytes volume: 1578 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c594d9d0-4f8d-42a8-b0be-5ed101b467c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1578, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.677613', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '142cdfb6-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': '90cec32333cfbb48bd49efe01d160ec5434632dd23f40c231020e3287fdeab03'}]}, 'timestamp': '2025-11-22 08:34:36.677836', '_unique_id': '1dabf415c0674eae81a43bbd359a64f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.678 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46573d08-7777-4147-9991-8e17c3c4dfc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.678858', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '142d1062-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': 'ccee4b61631b8fd3f8a867705dd519cc6a1df2b7bbb3080d0254b63b531cc4e1'}]}, 'timestamp': '2025-11-22 08:34:36.679082', '_unique_id': '9ac14c337dfd4b2a92335f7430a4a0e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.679 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.680 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.680 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-873112655>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-873112655>]
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.680 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.680 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bdfdf8a-df9e-431b-8917-0c7b1b4d2694', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.680420', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '142d4d48-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '1bbe5dc2bf6ee2b0439b94074023ae451b17294c4be9068bb174fc9d52be5eca'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.680420', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '142d5536-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '919113576f8386edc1da256a1c549df5cdf78f664879a10526f0af35aaf00cec'}]}, 'timestamp': '2025-11-22 08:34:36.680830', '_unique_id': '807146d54f2b46c5ac7fbb9428814ae5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7eed3562-f85a-4342-88cb-7c76e1d08994', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.682111', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '142d8f6a-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.336082732, 'message_signature': '69422ba105d58a82b89b6023819086758d7d1c58a173e99c82f620e7b168a22f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.682111', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '142d97da-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.336082732, 'message_signature': '70fb235203c082ff7b8f7cd83205fe430e21cfdac60fac3c41db6d018ba34839'}]}, 'timestamp': '2025-11-22 08:34:36.682535', '_unique_id': '18b12e8024174c8692c768c582ca08ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.682 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.683 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.683 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.read.requests volume: 1096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.683 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63033d05-8211-43ae-a6a2-8e2cc2ac845d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1096, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.683741', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '142dcf52-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '6dd576f81d334906a2e3f0f11459d9feb2ca003b978c37ebae012a1e8dce1e74'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.683741', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '142dd6e6-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '4fd6f2d981a957b54bb7a04d3aec54b440d4e5078438f0822f5bd682645740ed'}]}, 'timestamp': '2025-11-22 08:34:36.684151', '_unique_id': '74a59780f617488c8753ad7bae5e6384'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.684 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.685 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.read.latency volume: 772454313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.685 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.read.latency volume: 40599483 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf43d0ea-6e41-4006-a86d-0c0ff55f9c03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 772454313, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.685235', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '142e0a6c-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '5da19e40d215ac2ea683fd3421b72407448948ac1bb6e6d3d5e1cdaa1a5a8d2f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40599483, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.685235', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '142e1200-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '3edf91365cb84adb58f4fe68f8da6e7cc889e733afd4060c2069b0f41ced0e20'}]}, 'timestamp': '2025-11-22 08:34:36.685661', '_unique_id': 'b206355ba7aa47bcb90ea3350f91fa3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.686 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fd9c0d5-9395-466c-b6fa-39c61f92cdbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.686705', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '142e42fc-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': 'a89098b37bdebd86c5b16c360f9f23d754425aae637be26115c93286fdb62826'}]}, 'timestamp': '2025-11-22 08:34:36.686928', '_unique_id': '51d1779895bd421bb0adbbc0c9f2b1ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.687 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34e40181-c4fa-4b00-9227-efca016aeef4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.687942', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '142e731c-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.336082732, 'message_signature': 'caff6957553a8c94475c441b9b54b862d3714bc0f9fd328adea3674a0c24c308'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.687942', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '142e7aa6-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.336082732, 'message_signature': '5dc85e7e902bdd5baa429b1997fd6ae68b2432a7f723fc80af5212acbb52b5ba'}]}, 'timestamp': '2025-11-22 08:34:36.688360', '_unique_id': '27ff2841e4ab424d8c04201b770b831d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.688 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.689 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35822880-efa0-40b7-8f3f-ad77029c55df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.689431', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '142ead78-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': '0f9198d81a2b172fee84a3e1e2a56b90a789be54bcd566da952b49f95fb2c5a5'}]}, 'timestamp': '2025-11-22 08:34:36.689656', '_unique_id': '968a69bf9dbb4d52bd8b03f930fc74c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.690 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efbf6068-f637-45e5-8e92-6669c6d0b51b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.690655', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '142edd02-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '6a4727149db599c3c89352be9058fb3edd3ce83544a7537ea2539660aef82deb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.690655', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '142ee4be-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '3a9a922225260900218bdb50f6af6f4dbf7f7491b66f3593a05e7931a856887f'}]}, 'timestamp': '2025-11-22 08:34:36.691057', '_unique_id': '64e7e053be6448b5a0285b23e9b7275f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.691 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/cpu volume: 12720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd943ae00-4eb6-4ead-9e6a-655b58c5720b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12720000000, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'timestamp': '2025-11-22T08:34:36.692079', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '142f14c0-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.364128515, 'message_signature': 'f3307c2cc00dccbf9877ac4e135c2a66d6a6bab13aad1e417ee62b273aa68c45'}]}, 'timestamp': '2025-11-22 08:34:36.692309', '_unique_id': '3029ebd0890a4730881c4900ac435f3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.692 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.693 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.693 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.write.latency volume: 8294344298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.693 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfa5ff89-8b77-4e65-b6f6-227310383775', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8294344298, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-vda', 'timestamp': '2025-11-22T08:34:36.693365', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '142f46fc-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '49af1672e7be222f825c421749fbb418d3101e0dbc31fe88f130acd280416ea8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 
'resource_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43-sda', 'timestamp': '2025-11-22T08:34:36.693365', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'instance-000000a6', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '142f4e72-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.297239354, 'message_signature': '614dd48805d0df3938502d74923c7270cfccc46da60f7e970724446fc2f43f1e'}]}, 'timestamp': '2025-11-22 08:34:36.693761', '_unique_id': '3de12b89697c437b96e0a0f161150343'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.694 12 DEBUG ceilometer.compute.pollsters [-] 1e8a6cfb-1b42-472e-99dd-5134bcebbe43/network.incoming.packets volume: 18 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b90ac303-f642-4251-a686-13d360fbf772', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 18, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_name': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_name': None, 'resource_id': 'instance-000000a6-1e8a6cfb-1b42-472e-99dd-5134bcebbe43-tapa2290405-be', 'timestamp': '2025-11-22T08:34:36.694785', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-873112655', 'name': 'tapa2290405-be', 'instance_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'instance_type': 'm1.nano', 'host': 'a43ff701f099c2fd3b66ee5dcb1e851db8f28e5390f9b01d394bbd00', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1f:c7:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2290405-be'}, 'message_id': '142f7ea6-c77e-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 7217.326808503, 'message_signature': 'f2a756fdac59d8f9f5c8dccf8ebd3701ea68a1836a3fc56ea31d08a9a01c24f2'}]}, 'timestamp': '2025-11-22 08:34:36.695011', '_unique_id': '5949126c1a8144499787823cf02870a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:34:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:34:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:34:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:37.365 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:37.365 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:37.366 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:37 compute-0 nova_compute[186544]: 2025-11-22 08:34:37.992 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:37 compute-0 nova_compute[186544]: 2025-11-22 08:34:37.993 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:37 compute-0 nova_compute[186544]: 2025-11-22 08:34:37.993 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:37 compute-0 nova_compute[186544]: 2025-11-22 08:34:37.993 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:37 compute-0 nova_compute[186544]: 2025-11-22 08:34:37.993 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.003 186548 INFO nova.compute.manager [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Terminating instance
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.011 186548 DEBUG nova.compute.manager [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:34:38 compute-0 kernel: tapa2290405-be (unregistering): left promiscuous mode
Nov 22 08:34:38 compute-0 NetworkManager[55036]: <info>  [1763800478.0359] device (tapa2290405-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:34:38 compute-0 ovn_controller[94843]: 2025-11-22T08:34:38Z|00798|binding|INFO|Releasing lport a2290405-be1c-41fd-8df6-e9f6381864b5 from this chassis (sb_readonly=0)
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.046 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:38 compute-0 ovn_controller[94843]: 2025-11-22T08:34:38Z|00799|binding|INFO|Setting lport a2290405-be1c-41fd-8df6-e9f6381864b5 down in Southbound
Nov 22 08:34:38 compute-0 ovn_controller[94843]: 2025-11-22T08:34:38Z|00800|binding|INFO|Removing iface tapa2290405-be ovn-installed in OVS
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.052 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:c7:2d 10.100.0.19'], port_security=['fa:16:3e:1f:c7:2d 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': '1e8a6cfb-1b42-472e-99dd-5134bcebbe43', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-123af741-0f16-4f2c-9391-bc20715b113c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '742aa02a-10aa-4ee5-b074-11d3303b7679', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=addef574-b6f3-4707-b107-88339448d51a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a2290405-be1c-41fd-8df6-e9f6381864b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.053 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a2290405-be1c-41fd-8df6-e9f6381864b5 in datapath 123af741-0f16-4f2c-9391-bc20715b113c unbound from our chassis
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.054 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 123af741-0f16-4f2c-9391-bc20715b113c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.056 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e2709d8e-03ca-449d-b69e-10bbc27376c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.056 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c namespace which is not needed anymore
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.064 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:38 compute-0 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Nov 22 08:34:38 compute-0 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000a6.scope: Consumed 14.434s CPU time.
Nov 22 08:34:38 compute-0 systemd-machined[152872]: Machine qemu-90-instance-000000a6 terminated.
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:38 compute-0 neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c[248089]: [NOTICE]   (248093) : haproxy version is 2.8.14-c23fe91
Nov 22 08:34:38 compute-0 neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c[248089]: [NOTICE]   (248093) : path to executable is /usr/sbin/haproxy
Nov 22 08:34:38 compute-0 neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c[248089]: [WARNING]  (248093) : Exiting Master process...
Nov 22 08:34:38 compute-0 neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c[248089]: [WARNING]  (248093) : Exiting Master process...
Nov 22 08:34:38 compute-0 neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c[248089]: [ALERT]    (248093) : Current worker (248095) exited with code 143 (Terminated)
Nov 22 08:34:38 compute-0 neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c[248089]: [WARNING]  (248093) : All workers exited. Exiting... (0)
Nov 22 08:34:38 compute-0 systemd[1]: libpod-f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f.scope: Deactivated successfully.
Nov 22 08:34:38 compute-0 podman[248305]: 2025-11-22 08:34:38.184619226 +0000 UTC m=+0.046117929 container died f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:34:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f-userdata-shm.mount: Deactivated successfully.
Nov 22 08:34:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-e8b2e559a43af8c6bf67a897eab1b554afee59f3785890417061eafd67ff3fe8-merged.mount: Deactivated successfully.
Nov 22 08:34:38 compute-0 podman[248305]: 2025-11-22 08:34:38.2297459 +0000 UTC m=+0.091244603 container cleanup f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:34:38 compute-0 systemd[1]: libpod-conmon-f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f.scope: Deactivated successfully.
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.274 186548 INFO nova.virt.libvirt.driver [-] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Instance destroyed successfully.
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.275 186548 DEBUG nova.objects.instance [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.288 186548 DEBUG nova.virt.libvirt.vif [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:34:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-873112655',display_name='tempest-TestNetworkBasicOps-server-873112655',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-873112655',id=166,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP4xi6+j3qP8kpYDCRdJyeh1o5i9dQ6AbSZPHMqOAregRctXcy99guzsSmOcQrpCsRWmsxA2migBdzpbRcg/oHK6z/Bb9nK+Sw+a+wO3o0QK8RJKGetnwdOJRwMx9J1z0w==',key_name='tempest-TestNetworkBasicOps-1280477233',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:34:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-y0tu3s3m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:34:16Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=1e8a6cfb-1b42-472e-99dd-5134bcebbe43,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.289 186548 DEBUG nova.network.os_vif_util [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "a2290405-be1c-41fd-8df6-e9f6381864b5", "address": "fa:16:3e:1f:c7:2d", "network": {"id": "123af741-0f16-4f2c-9391-bc20715b113c", "bridge": "br-int", "label": "tempest-network-smoke--91900949", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2290405-be", "ovs_interfaceid": "a2290405-be1c-41fd-8df6-e9f6381864b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.290 186548 DEBUG nova.network.os_vif_util [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:c7:2d,bridge_name='br-int',has_traffic_filtering=True,id=a2290405-be1c-41fd-8df6-e9f6381864b5,network=Network(123af741-0f16-4f2c-9391-bc20715b113c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2290405-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.291 186548 DEBUG os_vif [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:c7:2d,bridge_name='br-int',has_traffic_filtering=True,id=a2290405-be1c-41fd-8df6-e9f6381864b5,network=Network(123af741-0f16-4f2c-9391-bc20715b113c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2290405-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.292 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.292 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2290405-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.294 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.295 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.297 186548 INFO os_vif [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:c7:2d,bridge_name='br-int',has_traffic_filtering=True,id=a2290405-be1c-41fd-8df6-e9f6381864b5,network=Network(123af741-0f16-4f2c-9391-bc20715b113c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2290405-be')
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.298 186548 INFO nova.virt.libvirt.driver [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Deleting instance files /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43_del
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.299 186548 INFO nova.virt.libvirt.driver [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Deletion of /var/lib/nova/instances/1e8a6cfb-1b42-472e-99dd-5134bcebbe43_del complete
Nov 22 08:34:38 compute-0 podman[248337]: 2025-11-22 08:34:38.31405729 +0000 UTC m=+0.060583346 container remove f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.319 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[33d95604-560a-41cb-a739-1cd4e768dad9]: (4, ('Sat Nov 22 08:34:38 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c (f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f)\nf0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f\nSat Nov 22 08:34:38 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c (f0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f)\nf0edd2084274d396ae7219bb0d71c6125f53b824f3f5f198070600d42f17e52f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.320 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[cf8d8900-a507-4af8-8b4a-0dbeab025b54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.321 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap123af741-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.323 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:38 compute-0 kernel: tap123af741-00: left promiscuous mode
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.336 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b26692-8c91-4e81-8bd3-755171ff7c3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.353 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[24dab41d-c7f2-4e18-b09c-0254fed486d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.355 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f093e4fe-6802-4934-91db-b58f243d9e79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.372 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[adbc6482-5a2c-4a63-99c6-1f6f34d73f4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 719639, 'reachable_time': 44856, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248363, 'error': None, 'target': 'ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.375 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-123af741-0f16-4f2c-9391-bc20715b113c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:34:38 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:38.375 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[8860c462-ed0e-4211-8a34-32ee86132816]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:34:38 compute-0 systemd[1]: run-netns-ovnmeta\x2d123af741\x2d0f16\x2d4f2c\x2d9391\x2dbc20715b113c.mount: Deactivated successfully.
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.393 186548 INFO nova.compute.manager [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.394 186548 DEBUG oslo.service.loopingcall [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.394 186548 DEBUG nova.compute.manager [-] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:34:38 compute-0 nova_compute[186544]: 2025-11-22 08:34:38.394 186548 DEBUG nova.network.neutron [-] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:34:39 compute-0 nova_compute[186544]: 2025-11-22 08:34:39.213 186548 DEBUG nova.compute.manager [req-c4805a69-ad84-4e25-bafa-71ad7490b52e req-d7dfa2f1-71d3-4da8-9f22-76f1b4dfa11c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received event network-vif-unplugged-a2290405-be1c-41fd-8df6-e9f6381864b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:34:39 compute-0 nova_compute[186544]: 2025-11-22 08:34:39.214 186548 DEBUG oslo_concurrency.lockutils [req-c4805a69-ad84-4e25-bafa-71ad7490b52e req-d7dfa2f1-71d3-4da8-9f22-76f1b4dfa11c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:39 compute-0 nova_compute[186544]: 2025-11-22 08:34:39.214 186548 DEBUG oslo_concurrency.lockutils [req-c4805a69-ad84-4e25-bafa-71ad7490b52e req-d7dfa2f1-71d3-4da8-9f22-76f1b4dfa11c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:39 compute-0 nova_compute[186544]: 2025-11-22 08:34:39.214 186548 DEBUG oslo_concurrency.lockutils [req-c4805a69-ad84-4e25-bafa-71ad7490b52e req-d7dfa2f1-71d3-4da8-9f22-76f1b4dfa11c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:39 compute-0 nova_compute[186544]: 2025-11-22 08:34:39.214 186548 DEBUG nova.compute.manager [req-c4805a69-ad84-4e25-bafa-71ad7490b52e req-d7dfa2f1-71d3-4da8-9f22-76f1b4dfa11c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] No waiting events found dispatching network-vif-unplugged-a2290405-be1c-41fd-8df6-e9f6381864b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:34:39 compute-0 nova_compute[186544]: 2025-11-22 08:34:39.215 186548 DEBUG nova.compute.manager [req-c4805a69-ad84-4e25-bafa-71ad7490b52e req-d7dfa2f1-71d3-4da8-9f22-76f1b4dfa11c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received event network-vif-unplugged-a2290405-be1c-41fd-8df6-e9f6381864b5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:34:40 compute-0 nova_compute[186544]: 2025-11-22 08:34:40.234 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:40 compute-0 nova_compute[186544]: 2025-11-22 08:34:40.920 186548 DEBUG nova.network.neutron [-] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:34:40 compute-0 nova_compute[186544]: 2025-11-22 08:34:40.941 186548 INFO nova.compute.manager [-] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Took 2.55 seconds to deallocate network for instance.
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.005 186548 DEBUG nova.compute.manager [req-c34fc658-9534-4a66-be28-5d60070fc752 req-d6012889-79bc-460c-b6ef-38c51350dca0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received event network-vif-deleted-a2290405-be1c-41fd-8df6-e9f6381864b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.007 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.007 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.059 186548 DEBUG nova.compute.provider_tree [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.075 186548 DEBUG nova.scheduler.client.report [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.119 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.172 186548 INFO nova.scheduler.client.report [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance 1e8a6cfb-1b42-472e-99dd-5134bcebbe43
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.244 186548 DEBUG oslo_concurrency.lockutils [None req-d7084c2c-1f42-4d0f-9302-78c5459543ec 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.294 186548 DEBUG nova.compute.manager [req-ba127cb1-e516-46c9-bd96-1dc5cbeed257 req-d422e284-5536-4f0c-a860-71ed87404b9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received event network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.294 186548 DEBUG oslo_concurrency.lockutils [req-ba127cb1-e516-46c9-bd96-1dc5cbeed257 req-d422e284-5536-4f0c-a860-71ed87404b9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.294 186548 DEBUG oslo_concurrency.lockutils [req-ba127cb1-e516-46c9-bd96-1dc5cbeed257 req-d422e284-5536-4f0c-a860-71ed87404b9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.294 186548 DEBUG oslo_concurrency.lockutils [req-ba127cb1-e516-46c9-bd96-1dc5cbeed257 req-d422e284-5536-4f0c-a860-71ed87404b9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "1e8a6cfb-1b42-472e-99dd-5134bcebbe43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.295 186548 DEBUG nova.compute.manager [req-ba127cb1-e516-46c9-bd96-1dc5cbeed257 req-d422e284-5536-4f0c-a860-71ed87404b9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] No waiting events found dispatching network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.295 186548 WARNING nova.compute.manager [req-ba127cb1-e516-46c9-bd96-1dc5cbeed257 req-d422e284-5536-4f0c-a860-71ed87404b9e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Received unexpected event network-vif-plugged-a2290405-be1c-41fd-8df6-e9f6381864b5 for instance with vm_state deleted and task_state None.
Nov 22 08:34:41 compute-0 nova_compute[186544]: 2025-11-22 08:34:41.472 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:43 compute-0 nova_compute[186544]: 2025-11-22 08:34:43.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:43 compute-0 nova_compute[186544]: 2025-11-22 08:34:43.295 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:45 compute-0 nova_compute[186544]: 2025-11-22 08:34:45.926 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:46 compute-0 nova_compute[186544]: 2025-11-22 08:34:46.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:46 compute-0 nova_compute[186544]: 2025-11-22 08:34:46.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:47 compute-0 nova_compute[186544]: 2025-11-22 08:34:47.288 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:47.290 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:34:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:47.291 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:34:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:34:48.293 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:34:48 compute-0 nova_compute[186544]: 2025-11-22 08:34:48.298 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:49 compute-0 nova_compute[186544]: 2025-11-22 08:34:49.174 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:49 compute-0 nova_compute[186544]: 2025-11-22 08:34:49.174 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:34:49 compute-0 nova_compute[186544]: 2025-11-22 08:34:49.193 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:34:51 compute-0 nova_compute[186544]: 2025-11-22 08:34:51.475 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:53 compute-0 nova_compute[186544]: 2025-11-22 08:34:53.273 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800478.271901, 1e8a6cfb-1b42-472e-99dd-5134bcebbe43 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:34:53 compute-0 nova_compute[186544]: 2025-11-22 08:34:53.273 186548 INFO nova.compute.manager [-] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] VM Stopped (Lifecycle Event)
Nov 22 08:34:53 compute-0 nova_compute[186544]: 2025-11-22 08:34:53.299 186548 DEBUG nova.compute.manager [None req-04dd2947-ce3a-413a-b74e-949a288d0cca - - - - - -] [instance: 1e8a6cfb-1b42-472e-99dd-5134bcebbe43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:34:53 compute-0 nova_compute[186544]: 2025-11-22 08:34:53.301 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:54 compute-0 podman[248365]: 2025-11-22 08:34:54.459009743 +0000 UTC m=+0.087086720 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 08:34:54 compute-0 podman[248366]: 2025-11-22 08:34:54.458986952 +0000 UTC m=+0.094244226 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 08:34:54 compute-0 podman[248367]: 2025-11-22 08:34:54.471057081 +0000 UTC m=+0.092610117 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:34:54 compute-0 podman[248368]: 2025-11-22 08:34:54.486135252 +0000 UTC m=+0.100503680 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:34:56 compute-0 nova_compute[186544]: 2025-11-22 08:34:56.478 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:58 compute-0 nova_compute[186544]: 2025-11-22 08:34:58.303 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:34:59 compute-0 nova_compute[186544]: 2025-11-22 08:34:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:34:59 compute-0 nova_compute[186544]: 2025-11-22 08:34:59.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:35:01 compute-0 nova_compute[186544]: 2025-11-22 08:35:01.479 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:03 compute-0 nova_compute[186544]: 2025-11-22 08:35:03.305 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:05 compute-0 podman[248446]: 2025-11-22 08:35:05.405373963 +0000 UTC m=+0.058382591 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 08:35:06 compute-0 nova_compute[186544]: 2025-11-22 08:35:06.482 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:07 compute-0 podman[248467]: 2025-11-22 08:35:07.390985188 +0000 UTC m=+0.044144570 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:35:07 compute-0 podman[248468]: 2025-11-22 08:35:07.398287998 +0000 UTC m=+0.048084998 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Nov 22 08:35:08 compute-0 nova_compute[186544]: 2025-11-22 08:35:08.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:09 compute-0 podman[200750]: time="2025-11-22T08:35:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 22 08:35:09 compute-0 podman[200750]: @ - - [22/Nov/2025:08:35:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22841 "" "Go-http-client/1.1"
Nov 22 08:35:11 compute-0 nova_compute[186544]: 2025-11-22 08:35:11.483 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:13 compute-0 nova_compute[186544]: 2025-11-22 08:35:13.311 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.176 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.177 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.199 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.200 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.200 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.200 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.349 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.350 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5704MB free_disk=73.1306037902832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.350 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.350 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.430 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.431 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.457 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.469 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.485 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.492 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:35:16 compute-0 nova_compute[186544]: 2025-11-22 08:35:16.493 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:18 compute-0 nova_compute[186544]: 2025-11-22 08:35:18.315 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:21 compute-0 nova_compute[186544]: 2025-11-22 08:35:21.479 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:21 compute-0 nova_compute[186544]: 2025-11-22 08:35:21.479 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:35:21 compute-0 nova_compute[186544]: 2025-11-22 08:35:21.479 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:35:21 compute-0 nova_compute[186544]: 2025-11-22 08:35:21.487 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:21 compute-0 nova_compute[186544]: 2025-11-22 08:35:21.492 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:35:21 compute-0 nova_compute[186544]: 2025-11-22 08:35:21.492 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:23 compute-0 nova_compute[186544]: 2025-11-22 08:35:23.319 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.020 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.021 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.036 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.161 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.162 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.167 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.167 186548 INFO nova.compute.claims [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.262 186548 DEBUG nova.compute.provider_tree [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.277 186548 DEBUG nova.scheduler.client.report [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.293 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.294 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.348 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.348 186548 DEBUG nova.network.neutron [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.362 186548 INFO nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.379 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.502 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.503 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.504 186548 INFO nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Creating image(s)
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.504 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.505 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.505 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.517 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.581 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.582 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.582 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.595 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.675 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.676 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.804 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk 1073741824" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.805 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.806 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.867 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.868 186548 DEBUG nova.virt.disk.api [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.868 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.931 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.932 186548 DEBUG nova.virt.disk.api [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.933 186548 DEBUG nova.objects.instance [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid 3185c90c-d312-49b2-892e-133ba362117b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.949 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.949 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Ensure instance console log exists: /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.950 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.950 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:24 compute-0 nova_compute[186544]: 2025-11-22 08:35:24.950 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:25 compute-0 nova_compute[186544]: 2025-11-22 08:35:25.009 186548 DEBUG nova.policy [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:35:25 compute-0 nova_compute[186544]: 2025-11-22 08:35:25.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:25 compute-0 podman[248524]: 2025-11-22 08:35:25.407443349 +0000 UTC m=+0.058617548 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:35:25 compute-0 podman[248525]: 2025-11-22 08:35:25.430047386 +0000 UTC m=+0.077869472 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 08:35:25 compute-0 podman[248526]: 2025-11-22 08:35:25.433638935 +0000 UTC m=+0.075874253 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:35:25 compute-0 podman[248532]: 2025-11-22 08:35:25.43828345 +0000 UTC m=+0.077040992 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 22 08:35:26 compute-0 nova_compute[186544]: 2025-11-22 08:35:26.048 186548 DEBUG nova.network.neutron [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Successfully created port: 32e5de3f-3250-4a10-b086-ff60f6485d94 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:35:26 compute-0 nova_compute[186544]: 2025-11-22 08:35:26.489 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:27 compute-0 nova_compute[186544]: 2025-11-22 08:35:27.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:27 compute-0 nova_compute[186544]: 2025-11-22 08:35:27.597 186548 DEBUG nova.network.neutron [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Successfully updated port: 32e5de3f-3250-4a10-b086-ff60f6485d94 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:35:27 compute-0 nova_compute[186544]: 2025-11-22 08:35:27.614 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:35:27 compute-0 nova_compute[186544]: 2025-11-22 08:35:27.614 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:35:27 compute-0 nova_compute[186544]: 2025-11-22 08:35:27.614 186548 DEBUG nova.network.neutron [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:35:27 compute-0 nova_compute[186544]: 2025-11-22 08:35:27.831 186548 DEBUG nova.compute.manager [req-611785bc-7080-4ec8-bf1f-3e4cd2001775 req-61b41fb6-d0f7-465a-8f66-dbb5e3b3b9ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-changed-32e5de3f-3250-4a10-b086-ff60f6485d94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:35:27 compute-0 nova_compute[186544]: 2025-11-22 08:35:27.832 186548 DEBUG nova.compute.manager [req-611785bc-7080-4ec8-bf1f-3e4cd2001775 req-61b41fb6-d0f7-465a-8f66-dbb5e3b3b9ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Refreshing instance network info cache due to event network-changed-32e5de3f-3250-4a10-b086-ff60f6485d94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:35:27 compute-0 nova_compute[186544]: 2025-11-22 08:35:27.832 186548 DEBUG oslo_concurrency.lockutils [req-611785bc-7080-4ec8-bf1f-3e4cd2001775 req-61b41fb6-d0f7-465a-8f66-dbb5e3b3b9ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:35:28 compute-0 nova_compute[186544]: 2025-11-22 08:35:28.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:28 compute-0 nova_compute[186544]: 2025-11-22 08:35:28.323 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:29 compute-0 nova_compute[186544]: 2025-11-22 08:35:29.019 186548 DEBUG nova.network.neutron [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:35:31 compute-0 nova_compute[186544]: 2025-11-22 08:35:31.143 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:31 compute-0 nova_compute[186544]: 2025-11-22 08:35:31.173 186548 WARNING nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Nov 22 08:35:31 compute-0 nova_compute[186544]: 2025-11-22 08:35:31.173 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid 3185c90c-d312-49b2-892e-133ba362117b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 08:35:31 compute-0 nova_compute[186544]: 2025-11-22 08:35:31.173 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:31 compute-0 nova_compute[186544]: 2025-11-22 08:35:31.193 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:31 compute-0 nova_compute[186544]: 2025-11-22 08:35:31.490 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.263 186548 DEBUG nova.network.neutron [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.281 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.281 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Instance network_info: |[{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.282 186548 DEBUG oslo_concurrency.lockutils [req-611785bc-7080-4ec8-bf1f-3e4cd2001775 req-61b41fb6-d0f7-465a-8f66-dbb5e3b3b9ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.282 186548 DEBUG nova.network.neutron [req-611785bc-7080-4ec8-bf1f-3e4cd2001775 req-61b41fb6-d0f7-465a-8f66-dbb5e3b3b9ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Refreshing network info cache for port 32e5de3f-3250-4a10-b086-ff60f6485d94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.285 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Start _get_guest_xml network_info=[{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.289 186548 WARNING nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.297 186548 DEBUG nova.virt.libvirt.host [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.298 186548 DEBUG nova.virt.libvirt.host [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.303 186548 DEBUG nova.virt.libvirt.host [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.303 186548 DEBUG nova.virt.libvirt.host [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.305 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.305 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.306 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.306 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.306 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.306 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.307 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.307 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.307 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.307 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.308 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.308 186548 DEBUG nova.virt.hardware [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.312 186548 DEBUG nova.virt.libvirt.vif [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:35:24Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.313 186548 DEBUG nova.network.os_vif_util [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.313 186548 DEBUG nova.network.os_vif_util [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:8b:09,bridge_name='br-int',has_traffic_filtering=True,id=32e5de3f-3250-4a10-b086-ff60f6485d94,network=Network(31f800a8-047c-4bf6-a6e5-1eca76d76d66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32e5de3f-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.314 186548 DEBUG nova.objects.instance [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3185c90c-d312-49b2-892e-133ba362117b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.325 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <uuid>3185c90c-d312-49b2-892e-133ba362117b</uuid>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <name>instance-000000a7</name>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkBasicOps-server-710800317</nova:name>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:35:32</nova:creationTime>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:35:32 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:35:32 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:35:32 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:35:32 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:35:32 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:35:32 compute-0 nova_compute[186544]:         <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:35:32 compute-0 nova_compute[186544]:         <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:35:32 compute-0 nova_compute[186544]:         <nova:port uuid="32e5de3f-3250-4a10-b086-ff60f6485d94">
Nov 22 08:35:32 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <system>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <entry name="serial">3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <entry name="uuid">3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </system>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <os>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   </os>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <features>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   </features>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.config"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:f7:8b:09"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <target dev="tap32e5de3f-32"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log" append="off"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <video>
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </video>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:35:32 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:35:32 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:35:32 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:35:32 compute-0 nova_compute[186544]: </domain>
Nov 22 08:35:32 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.327 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Preparing to wait for external event network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.327 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.327 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.328 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.329 186548 DEBUG nova.virt.libvirt.vif [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:35:24Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.329 186548 DEBUG nova.network.os_vif_util [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.330 186548 DEBUG nova.network.os_vif_util [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:8b:09,bridge_name='br-int',has_traffic_filtering=True,id=32e5de3f-3250-4a10-b086-ff60f6485d94,network=Network(31f800a8-047c-4bf6-a6e5-1eca76d76d66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32e5de3f-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.330 186548 DEBUG os_vif [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:8b:09,bridge_name='br-int',has_traffic_filtering=True,id=32e5de3f-3250-4a10-b086-ff60f6485d94,network=Network(31f800a8-047c-4bf6-a6e5-1eca76d76d66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32e5de3f-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.331 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.331 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.332 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.335 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.335 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap32e5de3f-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.336 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap32e5de3f-32, col_values=(('external_ids', {'iface-id': '32e5de3f-3250-4a10-b086-ff60f6485d94', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:8b:09', 'vm-uuid': '3185c90c-d312-49b2-892e-133ba362117b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:35:32 compute-0 NetworkManager[55036]: <info>  [1763800532.3376] manager: (tap32e5de3f-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.344 186548 INFO os_vif [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:8b:09,bridge_name='br-int',has_traffic_filtering=True,id=32e5de3f-3250-4a10-b086-ff60f6485d94,network=Network(31f800a8-047c-4bf6-a6e5-1eca76d76d66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32e5de3f-32')
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.395 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.396 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.396 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:f7:8b:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:35:32 compute-0 nova_compute[186544]: 2025-11-22 08:35:32.396 186548 INFO nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Using config drive
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.194 186548 INFO nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Creating config drive at /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.config
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.199 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnrew9bdi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.323 186548 DEBUG oslo_concurrency.processutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnrew9bdi" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:35:33 compute-0 kernel: tap32e5de3f-32: entered promiscuous mode
Nov 22 08:35:33 compute-0 NetworkManager[55036]: <info>  [1763800533.3785] manager: (tap32e5de3f-32): new Tun device (/org/freedesktop/NetworkManager/Devices/373)
Nov 22 08:35:33 compute-0 ovn_controller[94843]: 2025-11-22T08:35:33Z|00801|binding|INFO|Claiming lport 32e5de3f-3250-4a10-b086-ff60f6485d94 for this chassis.
Nov 22 08:35:33 compute-0 ovn_controller[94843]: 2025-11-22T08:35:33Z|00802|binding|INFO|32e5de3f-3250-4a10-b086-ff60f6485d94: Claiming fa:16:3e:f7:8b:09 10.100.0.3
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.379 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.382 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 systemd-udevd[248630]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:35:33 compute-0 NetworkManager[55036]: <info>  [1763800533.4107] device (tap32e5de3f-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:35:33 compute-0 NetworkManager[55036]: <info>  [1763800533.4116] device (tap32e5de3f-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:35:33 compute-0 systemd-machined[152872]: New machine qemu-91-instance-000000a7.
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.433 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.436 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:8b:09 10.100.0.3'], port_security=['fa:16:3e:f7:8b:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3185c90c-d312-49b2-892e-133ba362117b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31f800a8-047c-4bf6-a6e5-1eca76d76d66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '80b48e61-9c0d-4cfa-b8d5-1ae4ab3d8cdf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=65a04f5c-0ceb-4f4b-889a-0e4cec0d6191, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=32e5de3f-3250-4a10-b086-ff60f6485d94) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:35:33 compute-0 systemd[1]: Started Virtual Machine qemu-91-instance-000000a7.
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.438 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 32e5de3f-3250-4a10-b086-ff60f6485d94 in datapath 31f800a8-047c-4bf6-a6e5-1eca76d76d66 bound to our chassis
Nov 22 08:35:33 compute-0 ovn_controller[94843]: 2025-11-22T08:35:33Z|00803|binding|INFO|Setting lport 32e5de3f-3250-4a10-b086-ff60f6485d94 ovn-installed in OVS
Nov 22 08:35:33 compute-0 ovn_controller[94843]: 2025-11-22T08:35:33Z|00804|binding|INFO|Setting lport 32e5de3f-3250-4a10-b086-ff60f6485d94 up in Southbound
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.439 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 31f800a8-047c-4bf6-a6e5-1eca76d76d66
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.440 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.449 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0a066c37-3827-466c-bdf4-0f61808cfc4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.450 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap31f800a8-01 in ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.452 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap31f800a8-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.452 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[83400c9b-c57d-4921-9b23-8f9bb7cf1b51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.453 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[71ef9888-1aab-4e08-8885-bc65c41081c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.464 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ebcac0e0-cd94-4b4c-ab17-63951ba4846c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.476 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[05ce373b-277e-4985-bfdc-bab8435d1042]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.501 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[261da765-8288-4618-b623-2db76ae1c7f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.505 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6f152394-7ed1-405c-976e-5d5504e711ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 NetworkManager[55036]: <info>  [1763800533.5063] manager: (tap31f800a8-00): new Veth device (/org/freedesktop/NetworkManager/Devices/374)
Nov 22 08:35:33 compute-0 systemd-udevd[248633]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.534 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[15f4fba7-7822-44ef-ab05-c5e66401246c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.537 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e322bbbe-78f7-4b75-9c76-fb6ee9f6c3fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 NetworkManager[55036]: <info>  [1763800533.5572] device (tap31f800a8-00): carrier: link connected
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.562 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[390698c0-a6be-43d8-a073-cff49a899ae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.578 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[49bbc7ae-e89b-443d-bb33-bddb1ecf5ded]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31f800a8-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ce:3a:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 727419, 'reachable_time': 15390, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248664, 'error': None, 'target': 'ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.594 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[23b44b06-8cae-447a-b0cd-806730bfe4d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fece:3a28'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 727419, 'tstamp': 727419}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248665, 'error': None, 'target': 'ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.608 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0e4925ea-93b8-4c2e-b93b-39bd820475c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap31f800a8-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ce:3a:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 727419, 'reachable_time': 15390, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248666, 'error': None, 'target': 'ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.636 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1f75b040-1f30-4f03-8e4f-35a17cc8b95a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.688 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6905a9c3-fc6c-42f5-a442-53768f7aa973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.689 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31f800a8-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.689 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.690 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31f800a8-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.692 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 kernel: tap31f800a8-00: entered promiscuous mode
Nov 22 08:35:33 compute-0 NetworkManager[55036]: <info>  [1763800533.6925] manager: (tap31f800a8-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.694 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.694 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap31f800a8-00, col_values=(('external_ids', {'iface-id': 'eafded98-90b4-44e0-b8ef-306128acf727'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.695 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 ovn_controller[94843]: 2025-11-22T08:35:33Z|00805|binding|INFO|Releasing lport eafded98-90b4-44e0-b8ef-306128acf727 from this chassis (sb_readonly=0)
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.696 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.697 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/31f800a8-047c-4bf6-a6e5-1eca76d76d66.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/31f800a8-047c-4bf6-a6e5-1eca76d76d66.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.698 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5212b9-7114-493b-8305-294d21ddad2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.699 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-31f800a8-047c-4bf6-a6e5-1eca76d76d66
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/31f800a8-047c-4bf6-a6e5-1eca76d76d66.pid.haproxy
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 31f800a8-047c-4bf6-a6e5-1eca76d76d66
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:35:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:33.700 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66', 'env', 'PROCESS_TAG=haproxy-31f800a8-047c-4bf6-a6e5-1eca76d76d66', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/31f800a8-047c-4bf6-a6e5-1eca76d76d66.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.708 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.731 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800533.731413, 3185c90c-d312-49b2-892e-133ba362117b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.732 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] VM Started (Lifecycle Event)
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.752 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.756 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800533.73235, 3185c90c-d312-49b2-892e-133ba362117b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.756 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] VM Paused (Lifecycle Event)
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.777 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.780 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:35:33 compute-0 nova_compute[186544]: 2025-11-22 08:35:33.797 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:35:34 compute-0 podman[248705]: 2025-11-22 08:35:34.124260504 +0000 UTC m=+0.091404376 container create 62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:35:34 compute-0 podman[248705]: 2025-11-22 08:35:34.066144831 +0000 UTC m=+0.033288753 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:35:34 compute-0 systemd[1]: Started libpod-conmon-62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd.scope.
Nov 22 08:35:34 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:35:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f620f0057bbd6aefab8b4a9003640b70ce92f3798faa74bd505f2865266c87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:35:34 compute-0 podman[248705]: 2025-11-22 08:35:34.213998849 +0000 UTC m=+0.181142721 container init 62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 08:35:34 compute-0 podman[248705]: 2025-11-22 08:35:34.219182346 +0000 UTC m=+0.186326218 container start 62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 22 08:35:34 compute-0 neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66[248720]: [NOTICE]   (248724) : New worker (248726) forked
Nov 22 08:35:34 compute-0 neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66[248720]: [NOTICE]   (248724) : Loading success.
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.265 186548 DEBUG nova.compute.manager [req-e7fbfd0e-b977-48d6-98cb-1c7e6208daf0 req-c5a67814-193d-42e6-b1ad-d0313eb6b1a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.265 186548 DEBUG oslo_concurrency.lockutils [req-e7fbfd0e-b977-48d6-98cb-1c7e6208daf0 req-c5a67814-193d-42e6-b1ad-d0313eb6b1a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.265 186548 DEBUG oslo_concurrency.lockutils [req-e7fbfd0e-b977-48d6-98cb-1c7e6208daf0 req-c5a67814-193d-42e6-b1ad-d0313eb6b1a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.266 186548 DEBUG oslo_concurrency.lockutils [req-e7fbfd0e-b977-48d6-98cb-1c7e6208daf0 req-c5a67814-193d-42e6-b1ad-d0313eb6b1a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.266 186548 DEBUG nova.compute.manager [req-e7fbfd0e-b977-48d6-98cb-1c7e6208daf0 req-c5a67814-193d-42e6-b1ad-d0313eb6b1a5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Processing event network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.267 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.272 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800534.2725787, 3185c90c-d312-49b2-892e-133ba362117b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.273 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] VM Resumed (Lifecycle Event)
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.277 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.282 186548 INFO nova.virt.libvirt.driver [-] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Instance spawned successfully.
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.283 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.314 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.318 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.452 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.452 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.453 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.453 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.454 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.455 186548 DEBUG nova.virt.libvirt.driver [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.461 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.545 186548 INFO nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Took 10.04 seconds to spawn the instance on the hypervisor.
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.545 186548 DEBUG nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.660 186548 INFO nova.compute.manager [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Took 10.53 seconds to build instance.
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.682 186548 DEBUG oslo_concurrency.lockutils [None req-92689504-f15c-4c45-a622-92e740365984 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.683 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "3185c90c-d312-49b2-892e-133ba362117b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.683 186548 INFO nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:35:34 compute-0 nova_compute[186544]: 2025-11-22 08:35:34.683 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "3185c90c-d312-49b2-892e-133ba362117b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:35 compute-0 nova_compute[186544]: 2025-11-22 08:35:35.215 186548 DEBUG nova.network.neutron [req-611785bc-7080-4ec8-bf1f-3e4cd2001775 req-61b41fb6-d0f7-465a-8f66-dbb5e3b3b9ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updated VIF entry in instance network info cache for port 32e5de3f-3250-4a10-b086-ff60f6485d94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:35:35 compute-0 nova_compute[186544]: 2025-11-22 08:35:35.216 186548 DEBUG nova.network.neutron [req-611785bc-7080-4ec8-bf1f-3e4cd2001775 req-61b41fb6-d0f7-465a-8f66-dbb5e3b3b9ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:35:35 compute-0 nova_compute[186544]: 2025-11-22 08:35:35.228 186548 DEBUG oslo_concurrency.lockutils [req-611785bc-7080-4ec8-bf1f-3e4cd2001775 req-61b41fb6-d0f7-465a-8f66-dbb5e3b3b9ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:35:36 compute-0 nova_compute[186544]: 2025-11-22 08:35:36.375 186548 DEBUG nova.compute.manager [req-45715178-68b3-44d7-aa72-5eff81109677 req-015ac464-fecb-40aa-a92a-f53d6aeb56f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:35:36 compute-0 nova_compute[186544]: 2025-11-22 08:35:36.375 186548 DEBUG oslo_concurrency.lockutils [req-45715178-68b3-44d7-aa72-5eff81109677 req-015ac464-fecb-40aa-a92a-f53d6aeb56f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:36 compute-0 nova_compute[186544]: 2025-11-22 08:35:36.376 186548 DEBUG oslo_concurrency.lockutils [req-45715178-68b3-44d7-aa72-5eff81109677 req-015ac464-fecb-40aa-a92a-f53d6aeb56f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:36 compute-0 nova_compute[186544]: 2025-11-22 08:35:36.376 186548 DEBUG oslo_concurrency.lockutils [req-45715178-68b3-44d7-aa72-5eff81109677 req-015ac464-fecb-40aa-a92a-f53d6aeb56f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:36 compute-0 nova_compute[186544]: 2025-11-22 08:35:36.376 186548 DEBUG nova.compute.manager [req-45715178-68b3-44d7-aa72-5eff81109677 req-015ac464-fecb-40aa-a92a-f53d6aeb56f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] No waiting events found dispatching network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:35:36 compute-0 nova_compute[186544]: 2025-11-22 08:35:36.376 186548 WARNING nova.compute.manager [req-45715178-68b3-44d7-aa72-5eff81109677 req-015ac464-fecb-40aa-a92a-f53d6aeb56f0 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received unexpected event network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 for instance with vm_state active and task_state None.
Nov 22 08:35:36 compute-0 podman[248735]: 2025-11-22 08:35:36.457706372 +0000 UTC m=+0.098387649 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251118)
Nov 22 08:35:36 compute-0 nova_compute[186544]: 2025-11-22 08:35:36.493 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:37 compute-0 nova_compute[186544]: 2025-11-22 08:35:37.337 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:37.366 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:37.367 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:35:37.367 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:35:38 compute-0 podman[248756]: 2025-11-22 08:35:38.409607834 +0000 UTC m=+0.052329032 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:35:38 compute-0 podman[248755]: 2025-11-22 08:35:38.410679291 +0000 UTC m=+0.056121256 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:35:40 compute-0 NetworkManager[55036]: <info>  [1763800540.3605] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/376)
Nov 22 08:35:40 compute-0 nova_compute[186544]: 2025-11-22 08:35:40.359 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:40 compute-0 NetworkManager[55036]: <info>  [1763800540.3615] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/377)
Nov 22 08:35:40 compute-0 ovn_controller[94843]: 2025-11-22T08:35:40Z|00806|binding|INFO|Releasing lport eafded98-90b4-44e0-b8ef-306128acf727 from this chassis (sb_readonly=0)
Nov 22 08:35:40 compute-0 nova_compute[186544]: 2025-11-22 08:35:40.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:40 compute-0 ovn_controller[94843]: 2025-11-22T08:35:40Z|00807|binding|INFO|Releasing lport eafded98-90b4-44e0-b8ef-306128acf727 from this chassis (sb_readonly=0)
Nov 22 08:35:40 compute-0 nova_compute[186544]: 2025-11-22 08:35:40.395 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:40 compute-0 nova_compute[186544]: 2025-11-22 08:35:40.713 186548 DEBUG nova.compute.manager [req-abfcc5ea-4e6e-48f8-9721-9b8cecfb247b req-aa23d600-9e06-4117-b38e-10cdaf34a2ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-changed-32e5de3f-3250-4a10-b086-ff60f6485d94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:35:40 compute-0 nova_compute[186544]: 2025-11-22 08:35:40.713 186548 DEBUG nova.compute.manager [req-abfcc5ea-4e6e-48f8-9721-9b8cecfb247b req-aa23d600-9e06-4117-b38e-10cdaf34a2ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Refreshing instance network info cache due to event network-changed-32e5de3f-3250-4a10-b086-ff60f6485d94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:35:40 compute-0 nova_compute[186544]: 2025-11-22 08:35:40.714 186548 DEBUG oslo_concurrency.lockutils [req-abfcc5ea-4e6e-48f8-9721-9b8cecfb247b req-aa23d600-9e06-4117-b38e-10cdaf34a2ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:35:40 compute-0 nova_compute[186544]: 2025-11-22 08:35:40.714 186548 DEBUG oslo_concurrency.lockutils [req-abfcc5ea-4e6e-48f8-9721-9b8cecfb247b req-aa23d600-9e06-4117-b38e-10cdaf34a2ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:35:40 compute-0 nova_compute[186544]: 2025-11-22 08:35:40.714 186548 DEBUG nova.network.neutron [req-abfcc5ea-4e6e-48f8-9721-9b8cecfb247b req-aa23d600-9e06-4117-b38e-10cdaf34a2ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Refreshing network info cache for port 32e5de3f-3250-4a10-b086-ff60f6485d94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:35:41 compute-0 nova_compute[186544]: 2025-11-22 08:35:41.495 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:41 compute-0 nova_compute[186544]: 2025-11-22 08:35:41.718 186548 DEBUG nova.network.neutron [req-abfcc5ea-4e6e-48f8-9721-9b8cecfb247b req-aa23d600-9e06-4117-b38e-10cdaf34a2ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updated VIF entry in instance network info cache for port 32e5de3f-3250-4a10-b086-ff60f6485d94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:35:41 compute-0 nova_compute[186544]: 2025-11-22 08:35:41.719 186548 DEBUG nova.network.neutron [req-abfcc5ea-4e6e-48f8-9721-9b8cecfb247b req-aa23d600-9e06-4117-b38e-10cdaf34a2ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:35:41 compute-0 nova_compute[186544]: 2025-11-22 08:35:41.746 186548 DEBUG oslo_concurrency.lockutils [req-abfcc5ea-4e6e-48f8-9721-9b8cecfb247b req-aa23d600-9e06-4117-b38e-10cdaf34a2ad 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:35:42 compute-0 nova_compute[186544]: 2025-11-22 08:35:42.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:46 compute-0 nova_compute[186544]: 2025-11-22 08:35:46.496 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:47 compute-0 nova_compute[186544]: 2025-11-22 08:35:47.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:50 compute-0 ovn_controller[94843]: 2025-11-22T08:35:50Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:8b:09 10.100.0.3
Nov 22 08:35:50 compute-0 ovn_controller[94843]: 2025-11-22T08:35:50Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:8b:09 10.100.0.3
Nov 22 08:35:51 compute-0 nova_compute[186544]: 2025-11-22 08:35:51.498 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:52 compute-0 nova_compute[186544]: 2025-11-22 08:35:52.345 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:55 compute-0 nova_compute[186544]: 2025-11-22 08:35:55.521 186548 INFO nova.compute.manager [None req-1ee3f809-92c0-413d-918a-d05a120c801c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Get console output
Nov 22 08:35:55 compute-0 nova_compute[186544]: 2025-11-22 08:35:55.527 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:35:56 compute-0 podman[248822]: 2025-11-22 08:35:56.414307506 +0000 UTC m=+0.049118363 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:35:56 compute-0 podman[248820]: 2025-11-22 08:35:56.43024842 +0000 UTC m=+0.071173908 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 22 08:35:56 compute-0 podman[248821]: 2025-11-22 08:35:56.453355519 +0000 UTC m=+0.091013817 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:35:56 compute-0 podman[248823]: 2025-11-22 08:35:56.46718104 +0000 UTC m=+0.096685196 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:35:56 compute-0 nova_compute[186544]: 2025-11-22 08:35:56.500 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:57 compute-0 nova_compute[186544]: 2025-11-22 08:35:57.347 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:35:58 compute-0 nova_compute[186544]: 2025-11-22 08:35:58.871 186548 DEBUG oslo_concurrency.lockutils [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "interface-3185c90c-d312-49b2-892e-133ba362117b-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:35:58 compute-0 nova_compute[186544]: 2025-11-22 08:35:58.871 186548 DEBUG oslo_concurrency.lockutils [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "interface-3185c90c-d312-49b2-892e-133ba362117b-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:35:58 compute-0 nova_compute[186544]: 2025-11-22 08:35:58.872 186548 DEBUG nova.objects.instance [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'flavor' on Instance uuid 3185c90c-d312-49b2-892e-133ba362117b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:36:00 compute-0 nova_compute[186544]: 2025-11-22 08:36:00.954 186548 DEBUG nova.objects.instance [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_requests' on Instance uuid 3185c90c-d312-49b2-892e-133ba362117b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:36:00 compute-0 nova_compute[186544]: 2025-11-22 08:36:00.970 186548 DEBUG nova.network.neutron [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:36:01 compute-0 nova_compute[186544]: 2025-11-22 08:36:01.141 186548 DEBUG nova.policy [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:36:01 compute-0 nova_compute[186544]: 2025-11-22 08:36:01.502 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:02 compute-0 nova_compute[186544]: 2025-11-22 08:36:02.349 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:03 compute-0 nova_compute[186544]: 2025-11-22 08:36:03.127 186548 DEBUG nova.network.neutron [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Successfully created port: 195c5347-a907-42ac-983c-1579be63a9b3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:36:04 compute-0 nova_compute[186544]: 2025-11-22 08:36:04.255 186548 DEBUG nova.network.neutron [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Successfully updated port: 195c5347-a907-42ac-983c-1579be63a9b3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:36:04 compute-0 nova_compute[186544]: 2025-11-22 08:36:04.273 186548 DEBUG oslo_concurrency.lockutils [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:36:04 compute-0 nova_compute[186544]: 2025-11-22 08:36:04.274 186548 DEBUG oslo_concurrency.lockutils [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:36:04 compute-0 nova_compute[186544]: 2025-11-22 08:36:04.274 186548 DEBUG nova.network.neutron [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:36:04 compute-0 nova_compute[186544]: 2025-11-22 08:36:04.353 186548 DEBUG nova.compute.manager [req-04658c2e-3728-4573-9bc2-25b7585c1203 req-c7b68fda-6dbb-4107-89ce-cf4f57531ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-changed-195c5347-a907-42ac-983c-1579be63a9b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:04 compute-0 nova_compute[186544]: 2025-11-22 08:36:04.354 186548 DEBUG nova.compute.manager [req-04658c2e-3728-4573-9bc2-25b7585c1203 req-c7b68fda-6dbb-4107-89ce-cf4f57531ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Refreshing instance network info cache due to event network-changed-195c5347-a907-42ac-983c-1579be63a9b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:36:04 compute-0 nova_compute[186544]: 2025-11-22 08:36:04.354 186548 DEBUG oslo_concurrency.lockutils [req-04658c2e-3728-4573-9bc2-25b7585c1203 req-c7b68fda-6dbb-4107-89ce-cf4f57531ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.247 186548 DEBUG nova.network.neutron [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.264 186548 DEBUG oslo_concurrency.lockutils [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.265 186548 DEBUG oslo_concurrency.lockutils [req-04658c2e-3728-4573-9bc2-25b7585c1203 req-c7b68fda-6dbb-4107-89ce-cf4f57531ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.265 186548 DEBUG nova.network.neutron [req-04658c2e-3728-4573-9bc2-25b7585c1203 req-c7b68fda-6dbb-4107-89ce-cf4f57531ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Refreshing network info cache for port 195c5347-a907-42ac-983c-1579be63a9b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.270 186548 DEBUG nova.virt.libvirt.vif [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:35:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:35:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.270 186548 DEBUG nova.network.os_vif_util [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.271 186548 DEBUG nova.network.os_vif_util [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.272 186548 DEBUG os_vif [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.274 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.274 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.275 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.281 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.281 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap195c5347-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.281 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap195c5347-a9, col_values=(('external_ids', {'iface-id': '195c5347-a907-42ac-983c-1579be63a9b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:bc:90', 'vm-uuid': '3185c90c-d312-49b2-892e-133ba362117b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.283 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 NetworkManager[55036]: <info>  [1763800566.2845] manager: (tap195c5347-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.285 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.291 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.292 186548 INFO os_vif [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9')
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.293 186548 DEBUG nova.virt.libvirt.vif [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:35:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:35:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.293 186548 DEBUG nova.network.os_vif_util [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.293 186548 DEBUG nova.network.os_vif_util [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.295 186548 DEBUG nova.virt.libvirt.guest [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] attach device xml: <interface type="ethernet">
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <mac address="fa:16:3e:f3:bc:90"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <model type="virtio"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <mtu size="1442"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <target dev="tap195c5347-a9"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]: </interface>
Nov 22 08:36:06 compute-0 nova_compute[186544]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 22 08:36:06 compute-0 kernel: tap195c5347-a9: entered promiscuous mode
Nov 22 08:36:06 compute-0 NetworkManager[55036]: <info>  [1763800566.3087] manager: (tap195c5347-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/379)
Nov 22 08:36:06 compute-0 ovn_controller[94843]: 2025-11-22T08:36:06Z|00808|binding|INFO|Claiming lport 195c5347-a907-42ac-983c-1579be63a9b3 for this chassis.
Nov 22 08:36:06 compute-0 ovn_controller[94843]: 2025-11-22T08:36:06Z|00809|binding|INFO|195c5347-a907-42ac-983c-1579be63a9b3: Claiming fa:16:3e:f3:bc:90 10.100.0.22
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.311 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.327 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:bc:90 10.100.0.22'], port_security=['fa:16:3e:f3:bc:90 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '3185c90c-d312-49b2-892e-133ba362117b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1130d42c-f40b-4a39-88f2-637246715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ae80faa-0c82-4a3b-aea0-e2368e0ae615, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=195c5347-a907-42ac-983c-1579be63a9b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.329 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 195c5347-a907-42ac-983c-1579be63a9b3 in datapath 81a2a04b-7bbe-4694-ab21-1b66ecc56f3f bound to our chassis
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.331 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 81a2a04b-7bbe-4694-ab21-1b66ecc56f3f
Nov 22 08:36:06 compute-0 systemd-udevd[248908]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.343 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[548529cc-3541-4265-8d56-9b6315fccf32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.344 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap81a2a04b-71 in ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.347 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap81a2a04b-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.347 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[25eb2a1b-4bb2-4669-976e-1edbb9fd2705]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.349 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.348 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ee6d55-ee5a-4319-9c1f-f7c4525ccf3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_controller[94843]: 2025-11-22T08:36:06Z|00810|binding|INFO|Setting lport 195c5347-a907-42ac-983c-1579be63a9b3 ovn-installed in OVS
Nov 22 08:36:06 compute-0 ovn_controller[94843]: 2025-11-22T08:36:06Z|00811|binding|INFO|Setting lport 195c5347-a907-42ac-983c-1579be63a9b3 up in Southbound
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.352 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 NetworkManager[55036]: <info>  [1763800566.3581] device (tap195c5347-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:36:06 compute-0 NetworkManager[55036]: <info>  [1763800566.3604] device (tap195c5347-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.361 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[a3919bf4-ba11-4c0d-b7dd-b72543d052bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.390 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfbaa26-251e-4e53-9796-69b2e5d50c9d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.421 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c7576427-392f-4d74-9aa9-61952677be05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 systemd-udevd[248911]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.426 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c6f607-b8de-4588-94e6-17cc3e0e58bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 NetworkManager[55036]: <info>  [1763800566.4270] manager: (tap81a2a04b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/380)
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.445 186548 DEBUG nova.virt.libvirt.driver [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.445 186548 DEBUG nova.virt.libvirt.driver [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.445 186548 DEBUG nova.virt.libvirt.driver [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:f7:8b:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.446 186548 DEBUG nova.virt.libvirt.driver [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:f3:bc:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.453 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4dc735-1ee4-4050-82de-e21a8ed2fe71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.457 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[35f6e897-e520-437d-a39b-935c8ace1b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.467 186548 DEBUG nova.virt.libvirt.guest [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <nova:name>tempest-TestNetworkBasicOps-server-710800317</nova:name>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <nova:creationTime>2025-11-22 08:36:06</nova:creationTime>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <nova:flavor name="m1.nano">
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:memory>128</nova:memory>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:disk>1</nova:disk>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:swap>0</nova:swap>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:vcpus>1</nova:vcpus>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   </nova:flavor>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <nova:owner>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   </nova:owner>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   <nova:ports>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:port uuid="32e5de3f-3250-4a10-b086-ff60f6485d94">
Nov 22 08:36:06 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     <nova:port uuid="195c5347-a907-42ac-983c-1579be63a9b3">
Nov 22 08:36:06 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Nov 22 08:36:06 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:06 compute-0 nova_compute[186544]:   </nova:ports>
Nov 22 08:36:06 compute-0 nova_compute[186544]: </nova:instance>
Nov 22 08:36:06 compute-0 nova_compute[186544]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 08:36:06 compute-0 NetworkManager[55036]: <info>  [1763800566.4774] device (tap81a2a04b-70): carrier: link connected
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.482 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d973a522-2395-4dc3-83e4-309d9bb2c54e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.490 186548 DEBUG oslo_concurrency.lockutils [None req-bc4dc91b-9654-410c-a7f6-f0208ead094c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "interface-3185c90c-d312-49b2-892e-133ba362117b-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.501 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[019b5d2d-2569-4fe9-80d0-42319e3b2eea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81a2a04b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:3f:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 243], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730711, 'reachable_time': 18996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248934, 'error': None, 'target': 'ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.503 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.517 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f625060e-b83a-41d4-b742-18651e9024f8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:3f23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 730711, 'tstamp': 730711}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248935, 'error': None, 'target': 'ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.535 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[273adcd4-e230-4fca-a826-4e23093f9413]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap81a2a04b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:3f:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 243], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730711, 'reachable_time': 18996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248936, 'error': None, 'target': 'ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.566 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ded9c4-3698-4c11-8438-aa43c85bb63c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.630 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3f098d2b-3213-4887-9093-b5dad26eda3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.632 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81a2a04b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.632 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.632 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81a2a04b-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:06 compute-0 NetworkManager[55036]: <info>  [1763800566.6348] manager: (tap81a2a04b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/381)
Nov 22 08:36:06 compute-0 kernel: tap81a2a04b-70: entered promiscuous mode
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.634 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.638 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap81a2a04b-70, col_values=(('external_ids', {'iface-id': 'cf39a85c-bdb1-4c0f-b758-808fae40203c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:06 compute-0 ovn_controller[94843]: 2025-11-22T08:36:06Z|00812|binding|INFO|Releasing lport cf39a85c-bdb1-4c0f-b758-808fae40203c from this chassis (sb_readonly=0)
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.639 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.640 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/81a2a04b-7bbe-4694-ab21-1b66ecc56f3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/81a2a04b-7bbe-4694-ab21-1b66ecc56f3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.641 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b1787a0a-13a4-4ba4-a967-8b57dea3c13f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.642 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/81a2a04b-7bbe-4694-ab21-1b66ecc56f3f.pid.haproxy
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 81a2a04b-7bbe-4694-ab21-1b66ecc56f3f
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:36:06 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:06.642 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f', 'env', 'PROCESS_TAG=haproxy-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/81a2a04b-7bbe-4694-ab21-1b66ecc56f3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.651 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.688 186548 DEBUG nova.compute.manager [req-66c78452-de28-410e-8e5c-2513d336461c req-7bc532b3-5628-458c-8bb1-58394809eaf4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.689 186548 DEBUG oslo_concurrency.lockutils [req-66c78452-de28-410e-8e5c-2513d336461c req-7bc532b3-5628-458c-8bb1-58394809eaf4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.689 186548 DEBUG oslo_concurrency.lockutils [req-66c78452-de28-410e-8e5c-2513d336461c req-7bc532b3-5628-458c-8bb1-58394809eaf4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.689 186548 DEBUG oslo_concurrency.lockutils [req-66c78452-de28-410e-8e5c-2513d336461c req-7bc532b3-5628-458c-8bb1-58394809eaf4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.689 186548 DEBUG nova.compute.manager [req-66c78452-de28-410e-8e5c-2513d336461c req-7bc532b3-5628-458c-8bb1-58394809eaf4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] No waiting events found dispatching network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:36:06 compute-0 nova_compute[186544]: 2025-11-22 08:36:06.689 186548 WARNING nova.compute.manager [req-66c78452-de28-410e-8e5c-2513d336461c req-7bc532b3-5628-458c-8bb1-58394809eaf4 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received unexpected event network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 for instance with vm_state active and task_state None.
Nov 22 08:36:07 compute-0 podman[248965]: 2025-11-22 08:36:06.986060632 +0000 UTC m=+0.022798505 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:36:07 compute-0 podman[248965]: 2025-11-22 08:36:07.199739144 +0000 UTC m=+0.236476997 container create 80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 22 08:36:07 compute-0 systemd[1]: Started libpod-conmon-80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4.scope.
Nov 22 08:36:07 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:36:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af3a8389bece759c6e37e54a3553dd3067a25e653795c1c01a58d1be591d6a75/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:36:07 compute-0 podman[248979]: 2025-11-22 08:36:07.380783652 +0000 UTC m=+0.147804539 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:36:07 compute-0 podman[248965]: 2025-11-22 08:36:07.380905084 +0000 UTC m=+0.417642967 container init 80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 08:36:07 compute-0 podman[248965]: 2025-11-22 08:36:07.387083766 +0000 UTC m=+0.423821619 container start 80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:36:07 compute-0 neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f[248993]: [NOTICE]   (249004) : New worker (249006) forked
Nov 22 08:36:07 compute-0 neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f[248993]: [NOTICE]   (249004) : Loading success.
Nov 22 08:36:07 compute-0 nova_compute[186544]: 2025-11-22 08:36:07.663 186548 DEBUG nova.network.neutron [req-04658c2e-3728-4573-9bc2-25b7585c1203 req-c7b68fda-6dbb-4107-89ce-cf4f57531ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updated VIF entry in instance network info cache for port 195c5347-a907-42ac-983c-1579be63a9b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:36:07 compute-0 nova_compute[186544]: 2025-11-22 08:36:07.665 186548 DEBUG nova.network.neutron [req-04658c2e-3728-4573-9bc2-25b7585c1203 req-c7b68fda-6dbb-4107-89ce-cf4f57531ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:36:07 compute-0 nova_compute[186544]: 2025-11-22 08:36:07.687 186548 DEBUG oslo_concurrency.lockutils [req-04658c2e-3728-4573-9bc2-25b7585c1203 req-c7b68fda-6dbb-4107-89ce-cf4f57531ade 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:36:07 compute-0 nova_compute[186544]: 2025-11-22 08:36:07.990 186548 DEBUG oslo_concurrency.lockutils [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "interface-3185c90c-d312-49b2-892e-133ba362117b-195c5347-a907-42ac-983c-1579be63a9b3" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:07 compute-0 nova_compute[186544]: 2025-11-22 08:36:07.991 186548 DEBUG oslo_concurrency.lockutils [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "interface-3185c90c-d312-49b2-892e-133ba362117b-195c5347-a907-42ac-983c-1579be63a9b3" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.015 186548 DEBUG nova.objects.instance [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'flavor' on Instance uuid 3185c90c-d312-49b2-892e-133ba362117b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.042 186548 DEBUG nova.virt.libvirt.vif [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:35:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:35:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.042 186548 DEBUG nova.network.os_vif_util [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.043 186548 DEBUG nova.network.os_vif_util [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.047 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.049 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.051 186548 DEBUG nova.virt.libvirt.driver [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Attempting to detach device tap195c5347-a9 from instance 3185c90c-d312-49b2-892e-133ba362117b from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.052 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] detach device xml: <interface type="ethernet">
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <mac address="fa:16:3e:f3:bc:90"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <model type="virtio"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <mtu size="1442"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <target dev="tap195c5347-a9"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]: </interface>
Nov 22 08:36:08 compute-0 nova_compute[186544]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.057 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.060 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface>not found in domain: <domain type='kvm' id='91'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <name>instance-000000a7</name>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <uuid>3185c90c-d312-49b2-892e-133ba362117b</uuid>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:name>tempest-TestNetworkBasicOps-server-710800317</nova:name>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:creationTime>2025-11-22 08:36:06</nova:creationTime>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:flavor name="m1.nano">
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:memory>128</nova:memory>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:disk>1</nova:disk>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:swap>0</nova:swap>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:vcpus>1</nova:vcpus>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:flavor>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:owner>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:owner>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:ports>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:port uuid="32e5de3f-3250-4a10-b086-ff60f6485d94">
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:port uuid="195c5347-a907-42ac-983c-1579be63a9b3">
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:ports>
Nov 22 08:36:08 compute-0 nova_compute[186544]: </nova:instance>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <memory unit='KiB'>131072</memory>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <vcpu placement='static'>1</vcpu>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <resource>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <partition>/machine</partition>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </resource>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <sysinfo type='smbios'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <system>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='manufacturer'>RDO</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='serial'>3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='uuid'>3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='family'>Virtual Machine</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </system>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <os>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <boot dev='hd'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <smbios mode='sysinfo'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </os>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <features>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <vmcoreinfo state='on'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </features>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <model fallback='forbid'>Nehalem</model>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <feature policy='require' name='x2apic'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <feature policy='require' name='hypervisor'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <feature policy='require' name='vme'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <clock offset='utc'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <timer name='hpet' present='no'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <on_poweroff>destroy</on_poweroff>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <on_reboot>restart</on_reboot>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <on_crash>destroy</on_crash>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <disk type='file' device='disk'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <source file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk' index='2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <backingStore type='file' index='3'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:         <format type='raw'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:         <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:         <backingStore/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       </backingStore>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target dev='vda' bus='virtio'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='virtio-disk0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <disk type='file' device='cdrom'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <source file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.config' index='1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <backingStore/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target dev='sda' bus='sata'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <readonly/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='sata0-0-0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pcie.0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='1' port='0x10'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='2' port='0x11'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='3' port='0x12'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.3'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='4' port='0x13'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.4'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='5' port='0x14'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.5'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='6' port='0x15'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.6'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='7' port='0x16'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.7'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='8' port='0x17'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.8'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='9' port='0x18'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.9'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='10' port='0x19'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.10'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='11' port='0x1a'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.11'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='12' port='0x1b'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.12'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='13' port='0x1c'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.13'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='14' port='0x1d'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.14'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='15' port='0x1e'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.15'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='16' port='0x1f'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.16'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='17' port='0x20'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.17'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='18' port='0x21'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.18'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='19' port='0x22'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.19'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='20' port='0x23'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.20'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='21' port='0x24'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.21'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='22' port='0x25'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.22'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='23' port='0x26'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.23'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='24' port='0x27'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.24'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='25' port='0x28'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.25'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-pci-bridge'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.26'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='usb'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='sata' index='0'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='ide'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <interface type='ethernet'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <mac address='fa:16:3e:f7:8b:09'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target dev='tap32e5de3f-32'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model type='virtio'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <mtu size='1442'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='net0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <interface type='ethernet'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <mac address='fa:16:3e:f3:bc:90'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target dev='tap195c5347-a9'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model type='virtio'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <mtu size='1442'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='net1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <serial type='pty'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <source path='/dev/pts/0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <log file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log' append='off'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target type='isa-serial' port='0'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:         <model name='isa-serial'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       </target>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='serial0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <source path='/dev/pts/0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <log file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log' append='off'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target type='serial' port='0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='serial0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </console>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <input type='tablet' bus='usb'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='input0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='usb' bus='0' port='1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <input type='mouse' bus='ps2'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='input1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <input type='keyboard' bus='ps2'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='input2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <listen type='address' address='::0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </graphics>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <audio id='1' type='none'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <video>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='video0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </video>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <watchdog model='itco' action='reset'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='watchdog0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </watchdog>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <memballoon model='virtio'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <stats period='10'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='balloon0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <rng model='virtio'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <backend model='random'>/dev/urandom</backend>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='rng0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <label>system_u:system_r:svirt_t:s0:c842,c918</label>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c842,c918</imagelabel>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </seclabel>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <label>+107:+107</label>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <imagelabel>+107:+107</imagelabel>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </seclabel>
Nov 22 08:36:08 compute-0 nova_compute[186544]: </domain>
Nov 22 08:36:08 compute-0 nova_compute[186544]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.062 186548 INFO nova.virt.libvirt.driver [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully detached device tap195c5347-a9 from instance 3185c90c-d312-49b2-892e-133ba362117b from the persistent domain config.
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.063 186548 DEBUG nova.virt.libvirt.driver [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] (1/8): Attempting to detach device tap195c5347-a9 with device alias net1 from instance 3185c90c-d312-49b2-892e-133ba362117b from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.063 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] detach device xml: <interface type="ethernet">
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <mac address="fa:16:3e:f3:bc:90"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <model type="virtio"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <mtu size="1442"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <target dev="tap195c5347-a9"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]: </interface>
Nov 22 08:36:08 compute-0 nova_compute[186544]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 22 08:36:08 compute-0 kernel: tap195c5347-a9 (unregistering): left promiscuous mode
Nov 22 08:36:08 compute-0 NetworkManager[55036]: <info>  [1763800568.1623] device (tap195c5347-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:36:08 compute-0 ovn_controller[94843]: 2025-11-22T08:36:08Z|00813|binding|INFO|Releasing lport 195c5347-a907-42ac-983c-1579be63a9b3 from this chassis (sb_readonly=0)
Nov 22 08:36:08 compute-0 ovn_controller[94843]: 2025-11-22T08:36:08Z|00814|binding|INFO|Setting lport 195c5347-a907-42ac-983c-1579be63a9b3 down in Southbound
Nov 22 08:36:08 compute-0 ovn_controller[94843]: 2025-11-22T08:36:08Z|00815|binding|INFO|Removing iface tap195c5347-a9 ovn-installed in OVS
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.172 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.175 186548 DEBUG nova.virt.libvirt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Received event <DeviceRemovedEvent: 1763800568.1755803, 3185c90c-d312-49b2-892e-133ba362117b => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.177 186548 DEBUG nova.virt.libvirt.driver [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Start waiting for the detach event from libvirt for device tap195c5347-a9 with device alias net1 for instance 3185c90c-d312-49b2-892e-133ba362117b _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.178 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.178 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:bc:90 10.100.0.22'], port_security=['fa:16:3e:f3:bc:90 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '3185c90c-d312-49b2-892e-133ba362117b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1130d42c-f40b-4a39-88f2-637246715885', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ae80faa-0c82-4a3b-aea0-e2368e0ae615, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=195c5347-a907-42ac-983c-1579be63a9b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.179 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 195c5347-a907-42ac-983c-1579be63a9b3 in datapath 81a2a04b-7bbe-4694-ab21-1b66ecc56f3f unbound from our chassis
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.180 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81a2a04b-7bbe-4694-ab21-1b66ecc56f3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.181 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c8c813e-c15a-454e-b911-519053eeb475]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.182 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface>not found in domain: <domain type='kvm' id='91'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <name>instance-000000a7</name>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <uuid>3185c90c-d312-49b2-892e-133ba362117b</uuid>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:name>tempest-TestNetworkBasicOps-server-710800317</nova:name>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:creationTime>2025-11-22 08:36:06</nova:creationTime>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:flavor name="m1.nano">
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:memory>128</nova:memory>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:disk>1</nova:disk>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:swap>0</nova:swap>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:vcpus>1</nova:vcpus>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:flavor>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:owner>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:owner>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:ports>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:port uuid="32e5de3f-3250-4a10-b086-ff60f6485d94">
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:port uuid="195c5347-a907-42ac-983c-1579be63a9b3">
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:ports>
Nov 22 08:36:08 compute-0 nova_compute[186544]: </nova:instance>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <memory unit='KiB'>131072</memory>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <vcpu placement='static'>1</vcpu>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <resource>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <partition>/machine</partition>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </resource>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <sysinfo type='smbios'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <system>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='manufacturer'>RDO</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='serial'>3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='uuid'>3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <entry name='family'>Virtual Machine</entry>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </system>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <os>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <boot dev='hd'/>
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.182 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f namespace which is not needed anymore
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <smbios mode='sysinfo'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </os>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <features>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <vmcoreinfo state='on'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </features>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <model fallback='forbid'>Nehalem</model>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <feature policy='require' name='x2apic'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <feature policy='require' name='hypervisor'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <feature policy='require' name='vme'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <clock offset='utc'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <timer name='hpet' present='no'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <on_poweroff>destroy</on_poweroff>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <on_reboot>restart</on_reboot>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <on_crash>destroy</on_crash>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <disk type='file' device='disk'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <source file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk' index='2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <backingStore type='file' index='3'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:         <format type='raw'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:         <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:         <backingStore/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       </backingStore>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target dev='vda' bus='virtio'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='virtio-disk0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <disk type='file' device='cdrom'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <source file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.config' index='1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <backingStore/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target dev='sda' bus='sata'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <readonly/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='sata0-0-0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pcie.0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='1' port='0x10'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='2' port='0x11'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='3' port='0x12'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.3'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='4' port='0x13'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.4'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='5' port='0x14'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.5'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='6' port='0x15'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.6'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='7' port='0x16'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.7'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='8' port='0x17'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.8'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='9' port='0x18'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.9'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='10' port='0x19'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.10'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='11' port='0x1a'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.11'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='12' port='0x1b'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.12'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='13' port='0x1c'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.13'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='14' port='0x1d'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.14'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='15' port='0x1e'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.15'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='16' port='0x1f'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.16'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='17' port='0x20'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.17'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='18' port='0x21'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.18'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='19' port='0x22'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.19'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='20' port='0x23'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.20'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='21' port='0x24'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.21'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='22' port='0x25'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.22'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='23' port='0x26'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.23'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='24' port='0x27'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.24'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target chassis='25' port='0x28'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.25'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model name='pcie-pci-bridge'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='pci.26'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='usb'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <controller type='sata' index='0'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='ide'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <interface type='ethernet'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <mac address='fa:16:3e:f7:8b:09'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target dev='tap32e5de3f-32'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model type='virtio'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <mtu size='1442'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='net0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <serial type='pty'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <source path='/dev/pts/0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <log file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log' append='off'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target type='isa-serial' port='0'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:         <model name='isa-serial'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       </target>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='serial0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <source path='/dev/pts/0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <log file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log' append='off'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <target type='serial' port='0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='serial0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </console>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <input type='tablet' bus='usb'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='input0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='usb' bus='0' port='1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <input type='mouse' bus='ps2'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='input1'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <input type='keyboard' bus='ps2'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='input2'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <listen type='address' address='::0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </graphics>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <audio id='1' type='none'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <video>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='video0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </video>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <watchdog model='itco' action='reset'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='watchdog0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </watchdog>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <memballoon model='virtio'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <stats period='10'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='balloon0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <rng model='virtio'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <backend model='random'>/dev/urandom</backend>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <alias name='rng0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <label>system_u:system_r:svirt_t:s0:c842,c918</label>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c842,c918</imagelabel>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </seclabel>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <label>+107:+107</label>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <imagelabel>+107:+107</imagelabel>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </seclabel>
Nov 22 08:36:08 compute-0 nova_compute[186544]: </domain>
Nov 22 08:36:08 compute-0 nova_compute[186544]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.182 186548 INFO nova.virt.libvirt.driver [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully detached device tap195c5347-a9 from instance 3185c90c-d312-49b2-892e-133ba362117b from the live domain config.
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.183 186548 DEBUG nova.virt.libvirt.vif [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:35:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:35:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.183 186548 DEBUG nova.network.os_vif_util [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.184 186548 DEBUG nova.network.os_vif_util [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.184 186548 DEBUG os_vif [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.186 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.186 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap195c5347-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.188 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.188 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.190 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.192 186548 INFO os_vif [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9')
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.193 186548 DEBUG nova.virt.libvirt.guest [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:name>tempest-TestNetworkBasicOps-server-710800317</nova:name>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:creationTime>2025-11-22 08:36:08</nova:creationTime>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:flavor name="m1.nano">
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:memory>128</nova:memory>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:disk>1</nova:disk>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:swap>0</nova:swap>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:vcpus>1</nova:vcpus>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:flavor>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:owner>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:owner>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   <nova:ports>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     <nova:port uuid="32e5de3f-3250-4a10-b086-ff60f6485d94">
Nov 22 08:36:08 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:36:08 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:08 compute-0 nova_compute[186544]:   </nova:ports>
Nov 22 08:36:08 compute-0 nova_compute[186544]: </nova:instance>
Nov 22 08:36:08 compute-0 nova_compute[186544]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 08:36:08 compute-0 neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f[248993]: [NOTICE]   (249004) : haproxy version is 2.8.14-c23fe91
Nov 22 08:36:08 compute-0 neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f[248993]: [NOTICE]   (249004) : path to executable is /usr/sbin/haproxy
Nov 22 08:36:08 compute-0 neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f[248993]: [WARNING]  (249004) : Exiting Master process...
Nov 22 08:36:08 compute-0 neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f[248993]: [ALERT]    (249004) : Current worker (249006) exited with code 143 (Terminated)
Nov 22 08:36:08 compute-0 neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f[248993]: [WARNING]  (249004) : All workers exited. Exiting... (0)
Nov 22 08:36:08 compute-0 systemd[1]: libpod-80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4.scope: Deactivated successfully.
Nov 22 08:36:08 compute-0 podman[249036]: 2025-11-22 08:36:08.306593295 +0000 UTC m=+0.044537380 container died 80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:36:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4-userdata-shm.mount: Deactivated successfully.
Nov 22 08:36:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-af3a8389bece759c6e37e54a3553dd3067a25e653795c1c01a58d1be591d6a75-merged.mount: Deactivated successfully.
Nov 22 08:36:08 compute-0 podman[249036]: 2025-11-22 08:36:08.358037045 +0000 UTC m=+0.095981110 container cleanup 80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:36:08 compute-0 systemd[1]: libpod-conmon-80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4.scope: Deactivated successfully.
Nov 22 08:36:08 compute-0 podman[249065]: 2025-11-22 08:36:08.414690813 +0000 UTC m=+0.037405965 container remove 80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.419 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[06bec27f-00a1-4a49-ba94-caa80676897e]: (4, ('Sat Nov 22 08:36:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f (80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4)\n80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4\nSat Nov 22 08:36:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f (80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4)\n80a1d76f123849ef53001bad68bd76064b7398a26b3ed9d6d8667315cc9210f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.421 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[96489755-15f2-420b-a5f7-96181717fbee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.422 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81a2a04b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.424 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:08 compute-0 kernel: tap81a2a04b-70: left promiscuous mode
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.436 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.440 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b512a4-2d72-4553-9f47-bcec08b2a5cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.456 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[918102aa-5a1f-42f8-a1e1-3fef32881746]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.458 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[03c9d655-8c3c-4040-baf8-8ce1ba8614e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.472 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[beac7cae-4a7b-4674-8cd3-e9d442a4834a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730705, 'reachable_time': 27591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249087, 'error': None, 'target': 'ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.474 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-81a2a04b-7bbe-4694-ab21-1b66ecc56f3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:36:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:08.474 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[270e6c4b-ceef-4482-a6d8-336d99c64d4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d81a2a04b\x2d7bbe\x2d4694\x2dab21\x2d1b66ecc56f3f.mount: Deactivated successfully.
Nov 22 08:36:08 compute-0 podman[249078]: 2025-11-22 08:36:08.50940842 +0000 UTC m=+0.049625456 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:36:08 compute-0 podman[249080]: 2025-11-22 08:36:08.544761592 +0000 UTC m=+0.081322187 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7)
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.762 186548 DEBUG nova.compute.manager [req-d46f6702-2eaf-4415-afa4-07a1cec649ec req-615cebd0-45d1-43b0-9c85-e7ce4d1f25c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.762 186548 DEBUG oslo_concurrency.lockutils [req-d46f6702-2eaf-4415-afa4-07a1cec649ec req-615cebd0-45d1-43b0-9c85-e7ce4d1f25c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.762 186548 DEBUG oslo_concurrency.lockutils [req-d46f6702-2eaf-4415-afa4-07a1cec649ec req-615cebd0-45d1-43b0-9c85-e7ce4d1f25c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.763 186548 DEBUG oslo_concurrency.lockutils [req-d46f6702-2eaf-4415-afa4-07a1cec649ec req-615cebd0-45d1-43b0-9c85-e7ce4d1f25c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.763 186548 DEBUG nova.compute.manager [req-d46f6702-2eaf-4415-afa4-07a1cec649ec req-615cebd0-45d1-43b0-9c85-e7ce4d1f25c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] No waiting events found dispatching network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:36:08 compute-0 nova_compute[186544]: 2025-11-22 08:36:08.763 186548 WARNING nova.compute.manager [req-d46f6702-2eaf-4415-afa4-07a1cec649ec req-615cebd0-45d1-43b0-9c85-e7ce4d1f25c1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received unexpected event network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 for instance with vm_state active and task_state None.
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.322 186548 DEBUG oslo_concurrency.lockutils [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.323 186548 DEBUG oslo_concurrency.lockutils [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.323 186548 DEBUG nova.network.neutron [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.844 186548 DEBUG nova.compute.manager [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-unplugged-195c5347-a907-42ac-983c-1579be63a9b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.844 186548 DEBUG oslo_concurrency.lockutils [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.845 186548 DEBUG oslo_concurrency.lockutils [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.845 186548 DEBUG oslo_concurrency.lockutils [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.845 186548 DEBUG nova.compute.manager [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] No waiting events found dispatching network-vif-unplugged-195c5347-a907-42ac-983c-1579be63a9b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.845 186548 WARNING nova.compute.manager [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received unexpected event network-vif-unplugged-195c5347-a907-42ac-983c-1579be63a9b3 for instance with vm_state active and task_state None.
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.846 186548 DEBUG nova.compute.manager [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.846 186548 DEBUG oslo_concurrency.lockutils [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.846 186548 DEBUG oslo_concurrency.lockutils [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.846 186548 DEBUG oslo_concurrency.lockutils [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.846 186548 DEBUG nova.compute.manager [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] No waiting events found dispatching network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.847 186548 WARNING nova.compute.manager [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received unexpected event network-vif-plugged-195c5347-a907-42ac-983c-1579be63a9b3 for instance with vm_state active and task_state None.
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.847 186548 DEBUG nova.compute.manager [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-deleted-195c5347-a907-42ac-983c-1579be63a9b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.847 186548 INFO nova.compute.manager [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Neutron deleted interface 195c5347-a907-42ac-983c-1579be63a9b3; detaching it from the instance and deleting it from the info cache
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.847 186548 DEBUG nova.network.neutron [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.869 186548 DEBUG nova.objects.instance [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lazy-loading 'system_metadata' on Instance uuid 3185c90c-d312-49b2-892e-133ba362117b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.888 186548 DEBUG nova.objects.instance [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lazy-loading 'flavor' on Instance uuid 3185c90c-d312-49b2-892e-133ba362117b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.908 186548 DEBUG nova.virt.libvirt.vif [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:35:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:35:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.909 186548 DEBUG nova.network.os_vif_util [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converting VIF {"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.910 186548 DEBUG nova.network.os_vif_util [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.918 186548 DEBUG nova.virt.libvirt.guest [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.922 186548 DEBUG nova.virt.libvirt.guest [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface>not found in domain: <domain type='kvm' id='91'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <name>instance-000000a7</name>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <uuid>3185c90c-d312-49b2-892e-133ba362117b</uuid>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:name>tempest-TestNetworkBasicOps-server-710800317</nova:name>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:creationTime>2025-11-22 08:36:08</nova:creationTime>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:flavor name="m1.nano">
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:memory>128</nova:memory>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:disk>1</nova:disk>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:swap>0</nova:swap>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:vcpus>1</nova:vcpus>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:flavor>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:owner>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:owner>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:ports>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:port uuid="32e5de3f-3250-4a10-b086-ff60f6485d94">
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:ports>
Nov 22 08:36:10 compute-0 nova_compute[186544]: </nova:instance>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <memory unit='KiB'>131072</memory>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <vcpu placement='static'>1</vcpu>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <resource>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <partition>/machine</partition>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </resource>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <sysinfo type='smbios'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <system>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='manufacturer'>RDO</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='serial'>3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='uuid'>3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='family'>Virtual Machine</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </system>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <os>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <boot dev='hd'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <smbios mode='sysinfo'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </os>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <features>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <vmcoreinfo state='on'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </features>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <model fallback='forbid'>Nehalem</model>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <feature policy='require' name='x2apic'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <feature policy='require' name='hypervisor'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <feature policy='require' name='vme'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <clock offset='utc'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <timer name='hpet' present='no'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <on_poweroff>destroy</on_poweroff>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <on_reboot>restart</on_reboot>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <on_crash>destroy</on_crash>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <disk type='file' device='disk'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <source file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk' index='2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <backingStore type='file' index='3'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:         <format type='raw'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:         <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:         <backingStore/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       </backingStore>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target dev='vda' bus='virtio'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='virtio-disk0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <disk type='file' device='cdrom'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <source file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.config' index='1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <backingStore/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target dev='sda' bus='sata'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <readonly/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='sata0-0-0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pcie.0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='1' port='0x10'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='2' port='0x11'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='3' port='0x12'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.3'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='4' port='0x13'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.4'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='5' port='0x14'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.5'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='6' port='0x15'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.6'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='7' port='0x16'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.7'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='8' port='0x17'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.8'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='9' port='0x18'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.9'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='10' port='0x19'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.10'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='11' port='0x1a'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.11'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='12' port='0x1b'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.12'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='13' port='0x1c'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.13'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='14' port='0x1d'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.14'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='15' port='0x1e'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.15'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='16' port='0x1f'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.16'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='17' port='0x20'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.17'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='18' port='0x21'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.18'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='19' port='0x22'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.19'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='20' port='0x23'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.20'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='21' port='0x24'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.21'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='22' port='0x25'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.22'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='23' port='0x26'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.23'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='24' port='0x27'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.24'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='25' port='0x28'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.25'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-pci-bridge'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.26'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='usb'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='sata' index='0'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='ide'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <interface type='ethernet'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <mac address='fa:16:3e:f7:8b:09'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target dev='tap32e5de3f-32'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model type='virtio'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <mtu size='1442'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='net0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <serial type='pty'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <source path='/dev/pts/0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <log file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log' append='off'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target type='isa-serial' port='0'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:         <model name='isa-serial'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       </target>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='serial0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <source path='/dev/pts/0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <log file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log' append='off'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target type='serial' port='0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='serial0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </console>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <input type='tablet' bus='usb'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='input0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='usb' bus='0' port='1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <input type='mouse' bus='ps2'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='input1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <input type='keyboard' bus='ps2'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='input2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <listen type='address' address='::0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </graphics>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <audio id='1' type='none'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <video>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='video0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </video>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <watchdog model='itco' action='reset'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='watchdog0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </watchdog>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <memballoon model='virtio'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <stats period='10'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='balloon0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <rng model='virtio'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <backend model='random'>/dev/urandom</backend>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='rng0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <label>system_u:system_r:svirt_t:s0:c842,c918</label>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c842,c918</imagelabel>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </seclabel>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <label>+107:+107</label>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <imagelabel>+107:+107</imagelabel>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </seclabel>
Nov 22 08:36:10 compute-0 nova_compute[186544]: </domain>
Nov 22 08:36:10 compute-0 nova_compute[186544]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.924 186548 DEBUG nova.virt.libvirt.guest [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.928 186548 DEBUG nova.virt.libvirt.guest [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f3:bc:90"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap195c5347-a9"/></interface>not found in domain: <domain type='kvm' id='91'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <name>instance-000000a7</name>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <uuid>3185c90c-d312-49b2-892e-133ba362117b</uuid>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:name>tempest-TestNetworkBasicOps-server-710800317</nova:name>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:creationTime>2025-11-22 08:36:08</nova:creationTime>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:flavor name="m1.nano">
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:memory>128</nova:memory>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:disk>1</nova:disk>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:swap>0</nova:swap>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:vcpus>1</nova:vcpus>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:flavor>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:owner>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:owner>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:ports>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:port uuid="32e5de3f-3250-4a10-b086-ff60f6485d94">
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:ports>
Nov 22 08:36:10 compute-0 nova_compute[186544]: </nova:instance>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <memory unit='KiB'>131072</memory>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <vcpu placement='static'>1</vcpu>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <resource>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <partition>/machine</partition>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </resource>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <sysinfo type='smbios'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <system>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='manufacturer'>RDO</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='product'>OpenStack Compute</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='serial'>3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='uuid'>3185c90c-d312-49b2-892e-133ba362117b</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <entry name='family'>Virtual Machine</entry>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </system>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <os>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <boot dev='hd'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <smbios mode='sysinfo'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </os>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <features>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <vmcoreinfo state='on'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </features>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <cpu mode='custom' match='exact' check='full'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <model fallback='forbid'>Nehalem</model>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <feature policy='require' name='x2apic'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <feature policy='require' name='hypervisor'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <feature policy='require' name='vme'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <clock offset='utc'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <timer name='pit' tickpolicy='delay'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <timer name='hpet' present='no'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <on_poweroff>destroy</on_poweroff>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <on_reboot>restart</on_reboot>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <on_crash>destroy</on_crash>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <disk type='file' device='disk'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <source file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk' index='2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <backingStore type='file' index='3'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:         <format type='raw'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:         <source file='/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:         <backingStore/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       </backingStore>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target dev='vda' bus='virtio'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='virtio-disk0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <disk type='file' device='cdrom'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <driver name='qemu' type='raw' cache='none'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <source file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/disk.config' index='1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <backingStore/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target dev='sda' bus='sata'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <readonly/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='sata0-0-0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='0' model='pcie-root'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pcie.0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='1' port='0x10'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='2' port='0x11'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='3' port='0x12'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.3'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='4' port='0x13'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.4'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='5' port='0x14'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.5'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='6' port='0x15'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.6'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='7' port='0x16'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.7'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='8' port='0x17'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.8'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='9' port='0x18'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.9'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='10' port='0x19'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.10'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='11' port='0x1a'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.11'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='12' port='0x1b'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.12'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='13' port='0x1c'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.13'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='14' port='0x1d'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.14'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='15' port='0x1e'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.15'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='16' port='0x1f'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.16'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='17' port='0x20'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.17'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='18' port='0x21'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.18'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='19' port='0x22'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.19'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='20' port='0x23'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.20'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='21' port='0x24'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.21'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='22' port='0x25'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.22'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='23' port='0x26'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.23'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='24' port='0x27'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.24'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-root-port'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target chassis='25' port='0x28'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.25'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model name='pcie-pci-bridge'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='pci.26'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='usb'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <controller type='sata' index='0'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='ide'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </controller>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <interface type='ethernet'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <mac address='fa:16:3e:f7:8b:09'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target dev='tap32e5de3f-32'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model type='virtio'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <driver name='vhost' rx_queue_size='512'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <mtu size='1442'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='net0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <serial type='pty'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <source path='/dev/pts/0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <log file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log' append='off'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target type='isa-serial' port='0'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:         <model name='isa-serial'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       </target>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='serial0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <console type='pty' tty='/dev/pts/0'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <source path='/dev/pts/0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <log file='/var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b/console.log' append='off'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <target type='serial' port='0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='serial0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </console>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <input type='tablet' bus='usb'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='input0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='usb' bus='0' port='1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <input type='mouse' bus='ps2'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='input1'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <input type='keyboard' bus='ps2'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='input2'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </input>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <listen type='address' address='::0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </graphics>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <audio id='1' type='none'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <video>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <model type='virtio' heads='1' primary='yes'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='video0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </video>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <watchdog model='itco' action='reset'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='watchdog0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </watchdog>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <memballoon model='virtio'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <stats period='10'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='balloon0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <rng model='virtio'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <backend model='random'>/dev/urandom</backend>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <alias name='rng0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <label>system_u:system_r:svirt_t:s0:c842,c918</label>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c842,c918</imagelabel>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </seclabel>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <label>+107:+107</label>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <imagelabel>+107:+107</imagelabel>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </seclabel>
Nov 22 08:36:10 compute-0 nova_compute[186544]: </domain>
Nov 22 08:36:10 compute-0 nova_compute[186544]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.929 186548 WARNING nova.virt.libvirt.driver [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Detaching interface fa:16:3e:f3:bc:90 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap195c5347-a9' not found.
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.930 186548 DEBUG nova.virt.libvirt.vif [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:35:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:35:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.930 186548 DEBUG nova.network.os_vif_util [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converting VIF {"id": "195c5347-a907-42ac-983c-1579be63a9b3", "address": "fa:16:3e:f3:bc:90", "network": {"id": "81a2a04b-7bbe-4694-ab21-1b66ecc56f3f", "bridge": "br-int", "label": "tempest-network-smoke--1571347791", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195c5347-a9", "ovs_interfaceid": "195c5347-a907-42ac-983c-1579be63a9b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.931 186548 DEBUG nova.network.os_vif_util [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.931 186548 DEBUG os_vif [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.933 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.933 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap195c5347-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.934 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.936 186548 INFO os_vif [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:bc:90,bridge_name='br-int',has_traffic_filtering=True,id=195c5347-a907-42ac-983c-1579be63a9b3,network=Network(81a2a04b-7bbe-4694-ab21-1b66ecc56f3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195c5347-a9')
Nov 22 08:36:10 compute-0 nova_compute[186544]: 2025-11-22 08:36:10.937 186548 DEBUG nova.virt.libvirt.guest [req-84433757-cc30-437b-808a-fbb3d0d54eb8 req-94a48b29-2791-4e6a-86c4-a03a0fe3c366 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:name>tempest-TestNetworkBasicOps-server-710800317</nova:name>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:creationTime>2025-11-22 08:36:10</nova:creationTime>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:flavor name="m1.nano">
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:memory>128</nova:memory>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:disk>1</nova:disk>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:swap>0</nova:swap>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:vcpus>1</nova:vcpus>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:flavor>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:owner>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:owner>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   <nova:ports>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     <nova:port uuid="32e5de3f-3250-4a10-b086-ff60f6485d94">
Nov 22 08:36:10 compute-0 nova_compute[186544]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:36:10 compute-0 nova_compute[186544]:     </nova:port>
Nov 22 08:36:10 compute-0 nova_compute[186544]:   </nova:ports>
Nov 22 08:36:10 compute-0 nova_compute[186544]: </nova:instance>
Nov 22 08:36:10 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 08:36:10 compute-0 nova_compute[186544]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 22 08:36:11 compute-0 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 22 08:36:11 compute-0 nova_compute[186544]: 2025-11-22 08:36:11.505 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:12 compute-0 nova_compute[186544]: 2025-11-22 08:36:12.344 186548 INFO nova.network.neutron [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Port 195c5347-a907-42ac-983c-1579be63a9b3 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 22 08:36:12 compute-0 nova_compute[186544]: 2025-11-22 08:36:12.344 186548 DEBUG nova.network.neutron [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:36:12 compute-0 nova_compute[186544]: 2025-11-22 08:36:12.381 186548 DEBUG oslo_concurrency.lockutils [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:36:12 compute-0 nova_compute[186544]: 2025-11-22 08:36:12.394 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:12.395 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:36:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:12.396 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:36:12 compute-0 nova_compute[186544]: 2025-11-22 08:36:12.429 186548 DEBUG oslo_concurrency.lockutils [None req-8ae71ae5-8a5a-434b-8ca6-62921eb0f4d7 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "interface-3185c90c-d312-49b2-892e-133ba362117b-195c5347-a907-42ac-983c-1579be63a9b3" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.439s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:12 compute-0 ovn_controller[94843]: 2025-11-22T08:36:12Z|00816|binding|INFO|Releasing lport eafded98-90b4-44e0-b8ef-306128acf727 from this chassis (sb_readonly=0)
Nov 22 08:36:12 compute-0 nova_compute[186544]: 2025-11-22 08:36:12.781 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:13 compute-0 nova_compute[186544]: 2025-11-22 08:36:13.188 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.175 186548 DEBUG nova.compute.manager [req-7710c580-4c29-4a90-8cac-73dd12819313 req-b9c5cd1f-7d69-42e1-a568-8fd4f08513fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-changed-32e5de3f-3250-4a10-b086-ff60f6485d94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.175 186548 DEBUG nova.compute.manager [req-7710c580-4c29-4a90-8cac-73dd12819313 req-b9c5cd1f-7d69-42e1-a568-8fd4f08513fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Refreshing instance network info cache due to event network-changed-32e5de3f-3250-4a10-b086-ff60f6485d94. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.175 186548 DEBUG oslo_concurrency.lockutils [req-7710c580-4c29-4a90-8cac-73dd12819313 req-b9c5cd1f-7d69-42e1-a568-8fd4f08513fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.175 186548 DEBUG oslo_concurrency.lockutils [req-7710c580-4c29-4a90-8cac-73dd12819313 req-b9c5cd1f-7d69-42e1-a568-8fd4f08513fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.176 186548 DEBUG nova.network.neutron [req-7710c580-4c29-4a90-8cac-73dd12819313 req-b9c5cd1f-7d69-42e1-a568-8fd4f08513fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Refreshing network info cache for port 32e5de3f-3250-4a10-b086-ff60f6485d94 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.251 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.252 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.252 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.252 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.252 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.260 186548 INFO nova.compute.manager [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Terminating instance
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.266 186548 DEBUG nova.compute.manager [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:36:14 compute-0 kernel: tap32e5de3f-32 (unregistering): left promiscuous mode
Nov 22 08:36:14 compute-0 NetworkManager[55036]: <info>  [1763800574.2895] device (tap32e5de3f-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.299 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 ovn_controller[94843]: 2025-11-22T08:36:14Z|00817|binding|INFO|Releasing lport 32e5de3f-3250-4a10-b086-ff60f6485d94 from this chassis (sb_readonly=0)
Nov 22 08:36:14 compute-0 ovn_controller[94843]: 2025-11-22T08:36:14Z|00818|binding|INFO|Setting lport 32e5de3f-3250-4a10-b086-ff60f6485d94 down in Southbound
Nov 22 08:36:14 compute-0 ovn_controller[94843]: 2025-11-22T08:36:14Z|00819|binding|INFO|Removing iface tap32e5de3f-32 ovn-installed in OVS
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.312 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:8b:09 10.100.0.3'], port_security=['fa:16:3e:f7:8b:09 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3185c90c-d312-49b2-892e-133ba362117b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31f800a8-047c-4bf6-a6e5-1eca76d76d66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '80b48e61-9c0d-4cfa-b8d5-1ae4ab3d8cdf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=65a04f5c-0ceb-4f4b-889a-0e4cec0d6191, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=32e5de3f-3250-4a10-b086-ff60f6485d94) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.313 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 32e5de3f-3250-4a10-b086-ff60f6485d94 in datapath 31f800a8-047c-4bf6-a6e5-1eca76d76d66 unbound from our chassis
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.315 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31f800a8-047c-4bf6-a6e5-1eca76d76d66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.317 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e1aef5-e83a-41d9-8daf-c1aefc5f973d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.317 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66 namespace which is not needed anymore
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Nov 22 08:36:14 compute-0 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000a7.scope: Consumed 15.201s CPU time.
Nov 22 08:36:14 compute-0 systemd-machined[152872]: Machine qemu-91-instance-000000a7 terminated.
Nov 22 08:36:14 compute-0 neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66[248720]: [NOTICE]   (248724) : haproxy version is 2.8.14-c23fe91
Nov 22 08:36:14 compute-0 neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66[248720]: [NOTICE]   (248724) : path to executable is /usr/sbin/haproxy
Nov 22 08:36:14 compute-0 neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66[248720]: [WARNING]  (248724) : Exiting Master process...
Nov 22 08:36:14 compute-0 neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66[248720]: [ALERT]    (248724) : Current worker (248726) exited with code 143 (Terminated)
Nov 22 08:36:14 compute-0 neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66[248720]: [WARNING]  (248724) : All workers exited. Exiting... (0)
Nov 22 08:36:14 compute-0 systemd[1]: libpod-62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd.scope: Deactivated successfully.
Nov 22 08:36:14 compute-0 podman[249153]: 2025-11-22 08:36:14.486634537 +0000 UTC m=+0.062706179 container died 62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.486 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.491 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.530 186548 INFO nova.virt.libvirt.driver [-] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Instance destroyed successfully.
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.532 186548 DEBUG nova.objects.instance [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid 3185c90c-d312-49b2-892e-133ba362117b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.533 186548 DEBUG nova.compute.manager [req-1d515533-f929-4901-a0ef-b4e1ec8a715f req-df1f74bf-94a3-418c-b3c7-c79e01a37cb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-unplugged-32e5de3f-3250-4a10-b086-ff60f6485d94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.534 186548 DEBUG oslo_concurrency.lockutils [req-1d515533-f929-4901-a0ef-b4e1ec8a715f req-df1f74bf-94a3-418c-b3c7-c79e01a37cb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.534 186548 DEBUG oslo_concurrency.lockutils [req-1d515533-f929-4901-a0ef-b4e1ec8a715f req-df1f74bf-94a3-418c-b3c7-c79e01a37cb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.534 186548 DEBUG oslo_concurrency.lockutils [req-1d515533-f929-4901-a0ef-b4e1ec8a715f req-df1f74bf-94a3-418c-b3c7-c79e01a37cb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.534 186548 DEBUG nova.compute.manager [req-1d515533-f929-4901-a0ef-b4e1ec8a715f req-df1f74bf-94a3-418c-b3c7-c79e01a37cb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] No waiting events found dispatching network-vif-unplugged-32e5de3f-3250-4a10-b086-ff60f6485d94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.534 186548 DEBUG nova.compute.manager [req-1d515533-f929-4901-a0ef-b4e1ec8a715f req-df1f74bf-94a3-418c-b3c7-c79e01a37cb8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-unplugged-32e5de3f-3250-4a10-b086-ff60f6485d94 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:36:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd-userdata-shm.mount: Deactivated successfully.
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.543 186548 DEBUG nova.virt.libvirt.vif [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:35:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-710800317',display_name='tempest-TestNetworkBasicOps-server-710800317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-710800317',id=167,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYFp+84ONzQcn0iqOuOB+FXdJTNjbX16Nf4bSBmaDOqJkiNXFWk4Dg2HKrhF07TNjNTNPuibNriNIRB+UswtE57Lassn/GZrZqYoCMCkzsy/3S49MgiVtcyuFVVOk/lqQ==',key_name='tempest-TestNetworkBasicOps-1923094795',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:35:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-2i2p53uv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:35:34Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=3185c90c-d312-49b2-892e-133ba362117b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.544 186548 DEBUG nova.network.os_vif_util [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-b2f620f0057bbd6aefab8b4a9003640b70ce92f3798faa74bd505f2865266c87-merged.mount: Deactivated successfully.
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.544 186548 DEBUG nova.network.os_vif_util [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:8b:09,bridge_name='br-int',has_traffic_filtering=True,id=32e5de3f-3250-4a10-b086-ff60f6485d94,network=Network(31f800a8-047c-4bf6-a6e5-1eca76d76d66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32e5de3f-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.544 186548 DEBUG os_vif [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:8b:09,bridge_name='br-int',has_traffic_filtering=True,id=32e5de3f-3250-4a10-b086-ff60f6485d94,network=Network(31f800a8-047c-4bf6-a6e5-1eca76d76d66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32e5de3f-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.546 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.546 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32e5de3f-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.548 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.549 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.551 186548 INFO os_vif [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:8b:09,bridge_name='br-int',has_traffic_filtering=True,id=32e5de3f-3250-4a10-b086-ff60f6485d94,network=Network(31f800a8-047c-4bf6-a6e5-1eca76d76d66),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32e5de3f-32')
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.552 186548 INFO nova.virt.libvirt.driver [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Deleting instance files /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b_del
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.552 186548 INFO nova.virt.libvirt.driver [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Deletion of /var/lib/nova/instances/3185c90c-d312-49b2-892e-133ba362117b_del complete
Nov 22 08:36:14 compute-0 podman[249153]: 2025-11-22 08:36:14.569299546 +0000 UTC m=+0.145371188 container cleanup 62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 08:36:14 compute-0 systemd[1]: libpod-conmon-62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd.scope: Deactivated successfully.
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.654 186548 INFO nova.compute.manager [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.654 186548 DEBUG oslo.service.loopingcall [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.655 186548 DEBUG nova.compute.manager [-] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.655 186548 DEBUG nova.network.neutron [-] [instance: 3185c90c-d312-49b2-892e-133ba362117b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:36:14 compute-0 podman[249198]: 2025-11-22 08:36:14.834647643 +0000 UTC m=+0.241345716 container remove 62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.839 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[19313b04-5f7a-4e90-9f2f-f739bd6a480f]: (4, ('Sat Nov 22 08:36:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66 (62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd)\n62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd\nSat Nov 22 08:36:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66 (62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd)\n62eec7ca95411bfd2ed7fe33ed89561e7039c4976a767b8d9904c9b00a2e52dd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.841 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6fba6a78-5f9a-4450-8a2e-3e497b060c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.842 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31f800a8-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:14 compute-0 kernel: tap31f800a8-00: left promiscuous mode
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.843 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 nova_compute[186544]: 2025-11-22 08:36:14.856 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.859 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fe24bca1-d4a9-47e0-a7fb-595047c44b09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.884 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[57dd4c08-262a-4dcd-b2df-2e059e497da4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.886 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff50055-7598-4a8c-8023-6d25825e2e75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.901 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0a33e5c0-00e4-4907-beca-da02092af6b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 727413, 'reachable_time': 35547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249213, 'error': None, 'target': 'ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.903 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-31f800a8-047c-4bf6-a6e5-1eca76d76d66 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:36:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:14.904 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[9cae11e7-a7f3-4590-80af-53c206a98bf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:14 compute-0 systemd[1]: run-netns-ovnmeta\x2d31f800a8\x2d047c\x2d4bf6\x2da6e5\x2d1eca76d76d66.mount: Deactivated successfully.
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.151 186548 DEBUG nova.network.neutron [-] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.166 186548 INFO nova.compute.manager [-] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Took 0.51 seconds to deallocate network for instance.
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.225 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.226 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.261 186548 DEBUG nova.scheduler.client.report [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.279 186548 DEBUG nova.scheduler.client.report [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.279 186548 DEBUG nova.compute.provider_tree [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.290 186548 DEBUG nova.scheduler.client.report [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.323 186548 DEBUG nova.scheduler.client.report [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.363 186548 DEBUG nova.compute.provider_tree [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.377 186548 DEBUG nova.scheduler.client.report [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.398 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.422 186548 INFO nova.scheduler.client.report [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance 3185c90c-d312-49b2-892e-133ba362117b
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.482 186548 DEBUG oslo_concurrency.lockutils [None req-267f7409-2e94-4cbe-94c6-b438146871ac 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.492 186548 DEBUG nova.network.neutron [req-7710c580-4c29-4a90-8cac-73dd12819313 req-b9c5cd1f-7d69-42e1-a568-8fd4f08513fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updated VIF entry in instance network info cache for port 32e5de3f-3250-4a10-b086-ff60f6485d94. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.493 186548 DEBUG nova.network.neutron [req-7710c580-4c29-4a90-8cac-73dd12819313 req-b9c5cd1f-7d69-42e1-a568-8fd4f08513fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Updating instance_info_cache with network_info: [{"id": "32e5de3f-3250-4a10-b086-ff60f6485d94", "address": "fa:16:3e:f7:8b:09", "network": {"id": "31f800a8-047c-4bf6-a6e5-1eca76d76d66", "bridge": "br-int", "label": "tempest-network-smoke--1588217224", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32e5de3f-32", "ovs_interfaceid": "32e5de3f-3250-4a10-b086-ff60f6485d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:36:15 compute-0 nova_compute[186544]: 2025-11-22 08:36:15.511 186548 DEBUG oslo_concurrency.lockutils [req-7710c580-4c29-4a90-8cac-73dd12819313 req-b9c5cd1f-7d69-42e1-a568-8fd4f08513fd 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3185c90c-d312-49b2-892e-133ba362117b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.508 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.613 186548 DEBUG nova.compute.manager [req-3da65bcf-9cfd-43fd-b9c5-37656a327bb9 req-b0524d42-b636-4e6c-b95f-f476369e18d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.614 186548 DEBUG oslo_concurrency.lockutils [req-3da65bcf-9cfd-43fd-b9c5-37656a327bb9 req-b0524d42-b636-4e6c-b95f-f476369e18d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3185c90c-d312-49b2-892e-133ba362117b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.614 186548 DEBUG oslo_concurrency.lockutils [req-3da65bcf-9cfd-43fd-b9c5-37656a327bb9 req-b0524d42-b636-4e6c-b95f-f476369e18d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.614 186548 DEBUG oslo_concurrency.lockutils [req-3da65bcf-9cfd-43fd-b9c5-37656a327bb9 req-b0524d42-b636-4e6c-b95f-f476369e18d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3185c90c-d312-49b2-892e-133ba362117b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.614 186548 DEBUG nova.compute.manager [req-3da65bcf-9cfd-43fd-b9c5-37656a327bb9 req-b0524d42-b636-4e6c-b95f-f476369e18d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] No waiting events found dispatching network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.615 186548 WARNING nova.compute.manager [req-3da65bcf-9cfd-43fd-b9c5-37656a327bb9 req-b0524d42-b636-4e6c-b95f-f476369e18d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received unexpected event network-vif-plugged-32e5de3f-3250-4a10-b086-ff60f6485d94 for instance with vm_state deleted and task_state None.
Nov 22 08:36:16 compute-0 nova_compute[186544]: 2025-11-22 08:36:16.615 186548 DEBUG nova.compute.manager [req-3da65bcf-9cfd-43fd-b9c5-37656a327bb9 req-b0524d42-b636-4e6c-b95f-f476369e18d1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Received event network-vif-deleted-32e5de3f-3250-4a10-b086-ff60f6485d94 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.183 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.337 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.338 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5662MB free_disk=73.13057708740234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.339 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.339 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.384 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.385 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.403 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.415 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.449 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:36:17 compute-0 nova_compute[186544]: 2025-11-22 08:36:17.449 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:19 compute-0 nova_compute[186544]: 2025-11-22 08:36:19.550 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:21 compute-0 nova_compute[186544]: 2025-11-22 08:36:21.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:21 compute-0 nova_compute[186544]: 2025-11-22 08:36:21.450 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:21 compute-0 nova_compute[186544]: 2025-11-22 08:36:21.497 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:21 compute-0 nova_compute[186544]: 2025-11-22 08:36:21.508 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:22 compute-0 nova_compute[186544]: 2025-11-22 08:36:22.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:22 compute-0 nova_compute[186544]: 2025-11-22 08:36:22.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:36:22 compute-0 nova_compute[186544]: 2025-11-22 08:36:22.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:36:22 compute-0 nova_compute[186544]: 2025-11-22 08:36:22.175 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:36:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:22.398 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:24 compute-0 nova_compute[186544]: 2025-11-22 08:36:24.552 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:26 compute-0 nova_compute[186544]: 2025-11-22 08:36:26.510 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:27 compute-0 nova_compute[186544]: 2025-11-22 08:36:27.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:27 compute-0 nova_compute[186544]: 2025-11-22 08:36:27.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:27 compute-0 podman[249218]: 2025-11-22 08:36:27.402104643 +0000 UTC m=+0.051605014 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:36:27 compute-0 podman[249219]: 2025-11-22 08:36:27.418215081 +0000 UTC m=+0.064119053 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:36:27 compute-0 podman[249217]: 2025-11-22 08:36:27.424636749 +0000 UTC m=+0.075407442 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 08:36:27 compute-0 podman[249220]: 2025-11-22 08:36:27.455954842 +0000 UTC m=+0.098518752 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:36:29 compute-0 nova_compute[186544]: 2025-11-22 08:36:29.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:29 compute-0 nova_compute[186544]: 2025-11-22 08:36:29.529 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800574.5272295, 3185c90c-d312-49b2-892e-133ba362117b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:36:29 compute-0 nova_compute[186544]: 2025-11-22 08:36:29.530 186548 INFO nova.compute.manager [-] [instance: 3185c90c-d312-49b2-892e-133ba362117b] VM Stopped (Lifecycle Event)
Nov 22 08:36:29 compute-0 nova_compute[186544]: 2025-11-22 08:36:29.547 186548 DEBUG nova.compute.manager [None req-7a09bf3c-3685-435e-90ff-6e34dc4c1dfc - - - - - -] [instance: 3185c90c-d312-49b2-892e-133ba362117b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:36:29 compute-0 nova_compute[186544]: 2025-11-22 08:36:29.554 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:31 compute-0 nova_compute[186544]: 2025-11-22 08:36:31.512 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:32 compute-0 nova_compute[186544]: 2025-11-22 08:36:32.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:33 compute-0 nova_compute[186544]: 2025-11-22 08:36:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:34 compute-0 nova_compute[186544]: 2025-11-22 08:36:34.557 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:36 compute-0 nova_compute[186544]: 2025-11-22 08:36:36.513 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.602 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:36:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:36:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:37.367 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:37.367 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:37.368 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:38 compute-0 podman[249304]: 2025-11-22 08:36:38.407094891 +0000 UTC m=+0.052482485 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:36:39 compute-0 podman[249326]: 2025-11-22 08:36:39.404301937 +0000 UTC m=+0.048774185 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, release=1755695350)
Nov 22 08:36:39 compute-0 podman[249325]: 2025-11-22 08:36:39.428198466 +0000 UTC m=+0.075820161 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:36:39 compute-0 nova_compute[186544]: 2025-11-22 08:36:39.559 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:41 compute-0 nova_compute[186544]: 2025-11-22 08:36:41.514 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:43 compute-0 nova_compute[186544]: 2025-11-22 08:36:43.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:36:44 compute-0 nova_compute[186544]: 2025-11-22 08:36:44.561 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:46 compute-0 nova_compute[186544]: 2025-11-22 08:36:46.516 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:49 compute-0 nova_compute[186544]: 2025-11-22 08:36:49.563 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:51 compute-0 nova_compute[186544]: 2025-11-22 08:36:51.518 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.460 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.461 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.473 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.557 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.557 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.564 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.564 186548 INFO nova.compute.claims [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.670 186548 DEBUG nova.compute.provider_tree [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.681 186548 DEBUG nova.scheduler.client.report [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.696 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.697 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.736 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.737 186548 DEBUG nova.network.neutron [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.750 186548 INFO nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.765 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.884 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.886 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.887 186548 INFO nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Creating image(s)
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.887 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.888 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.888 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.905 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.963 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.965 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.965 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:52 compute-0 nova_compute[186544]: 2025-11-22 08:36:52.982 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.037 186548 DEBUG nova.policy [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.041 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.042 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.074 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.076 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.076 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.134 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.135 186548 DEBUG nova.virt.disk.api [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.136 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.225 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.226 186548 DEBUG nova.virt.disk.api [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.226 186548 DEBUG nova.objects.instance [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid 94a2fb15-11fa-4fe8-8e6f-92500b5e9427 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.240 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.241 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Ensure instance console log exists: /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.242 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.242 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.242 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:53 compute-0 nova_compute[186544]: 2025-11-22 08:36:53.755 186548 DEBUG nova.network.neutron [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Successfully created port: 80a8c065-698e-43fe-8221-aecc6a7f9b8b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.696 186548 DEBUG nova.network.neutron [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Successfully updated port: 80a8c065-698e-43fe-8221-aecc6a7f9b8b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.710 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.710 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.711 186548 DEBUG nova.network.neutron [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.814 186548 DEBUG nova.compute.manager [req-b5b208f3-7510-48f8-a3ed-7e8f0ff3a092 req-6e072c6d-6859-4279-8268-b56a290a23b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-changed-80a8c065-698e-43fe-8221-aecc6a7f9b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.814 186548 DEBUG nova.compute.manager [req-b5b208f3-7510-48f8-a3ed-7e8f0ff3a092 req-6e072c6d-6859-4279-8268-b56a290a23b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Refreshing instance network info cache due to event network-changed-80a8c065-698e-43fe-8221-aecc6a7f9b8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.815 186548 DEBUG oslo_concurrency.lockutils [req-b5b208f3-7510-48f8-a3ed-7e8f0ff3a092 req-6e072c6d-6859-4279-8268-b56a290a23b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:36:54 compute-0 nova_compute[186544]: 2025-11-22 08:36:54.844 186548 DEBUG nova.network.neutron [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.551 186548 DEBUG nova.network.neutron [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Updating instance_info_cache with network_info: [{"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.573 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.574 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Instance network_info: |[{"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.574 186548 DEBUG oslo_concurrency.lockutils [req-b5b208f3-7510-48f8-a3ed-7e8f0ff3a092 req-6e072c6d-6859-4279-8268-b56a290a23b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.574 186548 DEBUG nova.network.neutron [req-b5b208f3-7510-48f8-a3ed-7e8f0ff3a092 req-6e072c6d-6859-4279-8268-b56a290a23b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Refreshing network info cache for port 80a8c065-698e-43fe-8221-aecc6a7f9b8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.577 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Start _get_guest_xml network_info=[{"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.583 186548 WARNING nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.590 186548 DEBUG nova.virt.libvirt.host [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.591 186548 DEBUG nova.virt.libvirt.host [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.594 186548 DEBUG nova.virt.libvirt.host [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.594 186548 DEBUG nova.virt.libvirt.host [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.595 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.595 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.596 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.596 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.597 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.597 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.597 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.597 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.598 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.598 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.598 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.598 186548 DEBUG nova.virt.hardware [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.602 186548 DEBUG nova.virt.libvirt.vif [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:36:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1254335037',display_name='tempest-TestNetworkBasicOps-server-1254335037',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1254335037',id=168,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKS4Bq3RhYn6P61tbexKnHy+x4mpnkfKJ+b8cPJcLAiFWB/i29v7tZ+sykfMsx26IoYit63aGhlyJ9x1yhUl8sWeynz+PZ7ms1Uj3yMNOx7SJtmw1GvIdMQrAE3Xyl5QXw==',key_name='tempest-TestNetworkBasicOps-1831406658',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ms1qjc9e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:36:52Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=94a2fb15-11fa-4fe8-8e6f-92500b5e9427,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.602 186548 DEBUG nova.network.os_vif_util [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.603 186548 DEBUG nova.network.os_vif_util [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:07:86,bridge_name='br-int',has_traffic_filtering=True,id=80a8c065-698e-43fe-8221-aecc6a7f9b8b,network=Network(a309dc7b-6d46-436e-98ca-69f041ef0a19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80a8c065-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.603 186548 DEBUG nova.objects.instance [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid 94a2fb15-11fa-4fe8-8e6f-92500b5e9427 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.635 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <uuid>94a2fb15-11fa-4fe8-8e6f-92500b5e9427</uuid>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <name>instance-000000a8</name>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkBasicOps-server-1254335037</nova:name>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:36:55</nova:creationTime>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:36:55 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:36:55 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:36:55 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:36:55 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:36:55 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:36:55 compute-0 nova_compute[186544]:         <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:36:55 compute-0 nova_compute[186544]:         <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:36:55 compute-0 nova_compute[186544]:         <nova:port uuid="80a8c065-698e-43fe-8221-aecc6a7f9b8b">
Nov 22 08:36:55 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <system>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <entry name="serial">94a2fb15-11fa-4fe8-8e6f-92500b5e9427</entry>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <entry name="uuid">94a2fb15-11fa-4fe8-8e6f-92500b5e9427</entry>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </system>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <os>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   </os>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <features>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   </features>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk.config"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:fd:07:86"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <target dev="tap80a8c065-69"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/console.log" append="off"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <video>
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </video>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:36:55 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:36:55 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:36:55 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:36:55 compute-0 nova_compute[186544]: </domain>
Nov 22 08:36:55 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.636 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Preparing to wait for external event network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.636 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.636 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.637 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.637 186548 DEBUG nova.virt.libvirt.vif [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:36:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1254335037',display_name='tempest-TestNetworkBasicOps-server-1254335037',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1254335037',id=168,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKS4Bq3RhYn6P61tbexKnHy+x4mpnkfKJ+b8cPJcLAiFWB/i29v7tZ+sykfMsx26IoYit63aGhlyJ9x1yhUl8sWeynz+PZ7ms1Uj3yMNOx7SJtmw1GvIdMQrAE3Xyl5QXw==',key_name='tempest-TestNetworkBasicOps-1831406658',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ms1qjc9e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:36:52Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=94a2fb15-11fa-4fe8-8e6f-92500b5e9427,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.638 186548 DEBUG nova.network.os_vif_util [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.638 186548 DEBUG nova.network.os_vif_util [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:07:86,bridge_name='br-int',has_traffic_filtering=True,id=80a8c065-698e-43fe-8221-aecc6a7f9b8b,network=Network(a309dc7b-6d46-436e-98ca-69f041ef0a19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80a8c065-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.639 186548 DEBUG os_vif [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:07:86,bridge_name='br-int',has_traffic_filtering=True,id=80a8c065-698e-43fe-8221-aecc6a7f9b8b,network=Network(a309dc7b-6d46-436e-98ca-69f041ef0a19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80a8c065-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.639 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.639 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.640 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.643 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.643 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80a8c065-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.643 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap80a8c065-69, col_values=(('external_ids', {'iface-id': '80a8c065-698e-43fe-8221-aecc6a7f9b8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:07:86', 'vm-uuid': '94a2fb15-11fa-4fe8-8e6f-92500b5e9427'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.645 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:55 compute-0 NetworkManager[55036]: <info>  [1763800615.6463] manager: (tap80a8c065-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/382)
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.648 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.650 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.651 186548 INFO os_vif [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:07:86,bridge_name='br-int',has_traffic_filtering=True,id=80a8c065-698e-43fe-8221-aecc6a7f9b8b,network=Network(a309dc7b-6d46-436e-98ca-69f041ef0a19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80a8c065-69')
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.703 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.704 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.704 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:fd:07:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:36:55 compute-0 nova_compute[186544]: 2025-11-22 08:36:55.705 186548 INFO nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Using config drive
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.150 186548 INFO nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Creating config drive at /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk.config
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.156 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpam2k2ima execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.281 186548 DEBUG oslo_concurrency.processutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpam2k2ima" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:36:56 compute-0 kernel: tap80a8c065-69: entered promiscuous mode
Nov 22 08:36:56 compute-0 NetworkManager[55036]: <info>  [1763800616.3326] manager: (tap80a8c065-69): new Tun device (/org/freedesktop/NetworkManager/Devices/383)
Nov 22 08:36:56 compute-0 ovn_controller[94843]: 2025-11-22T08:36:56Z|00820|binding|INFO|Claiming lport 80a8c065-698e-43fe-8221-aecc6a7f9b8b for this chassis.
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.332 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 ovn_controller[94843]: 2025-11-22T08:36:56Z|00821|binding|INFO|80a8c065-698e-43fe-8221-aecc6a7f9b8b: Claiming fa:16:3e:fd:07:86 10.100.0.5
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.336 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 systemd-udevd[249400]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:36:56 compute-0 systemd-machined[152872]: New machine qemu-92-instance-000000a8.
Nov 22 08:36:56 compute-0 NetworkManager[55036]: <info>  [1763800616.3785] device (tap80a8c065-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:36:56 compute-0 NetworkManager[55036]: <info>  [1763800616.3796] device (tap80a8c065-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.400 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 ovn_controller[94843]: 2025-11-22T08:36:56Z|00822|binding|INFO|Setting lport 80a8c065-698e-43fe-8221-aecc6a7f9b8b ovn-installed in OVS
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 systemd[1]: Started Virtual Machine qemu-92-instance-000000a8.
Nov 22 08:36:56 compute-0 ovn_controller[94843]: 2025-11-22T08:36:56Z|00823|binding|INFO|Setting lport 80a8c065-698e-43fe-8221-aecc6a7f9b8b up in Southbound
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.431 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:07:86 10.100.0.5'], port_security=['fa:16:3e:fd:07:86 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '94a2fb15-11fa-4fe8-8e6f-92500b5e9427', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a309dc7b-6d46-436e-98ca-69f041ef0a19', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c1678f38-6764-4fa5-bb59-f6e260db9a36', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42a78de5-871c-49c7-94ea-05ed25f986e7, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=80a8c065-698e-43fe-8221-aecc6a7f9b8b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.432 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 80a8c065-698e-43fe-8221-aecc6a7f9b8b in datapath a309dc7b-6d46-436e-98ca-69f041ef0a19 bound to our chassis
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.433 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a309dc7b-6d46-436e-98ca-69f041ef0a19
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.446 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a4a6a1-a8b4-4483-afc2-5e0eee8fb114]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.446 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa309dc7b-61 in ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.448 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa309dc7b-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.448 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[808b4740-5865-485a-842e-138a5499035f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.449 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[08e76a3f-05ed-4318-8d11-618e182771e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.461 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[5c553398-0703-4891-8873-f0fed92ddd25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.486 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b49f9d6b-69a0-4d7d-a622-56bd6fb4dc1e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.518 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0108fd98-0c95-4919-a729-a76c1281a48f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.519 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 NetworkManager[55036]: <info>  [1763800616.5254] manager: (tapa309dc7b-60): new Veth device (/org/freedesktop/NetworkManager/Devices/384)
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.525 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f454a095-3d07-4c9c-ba56-0356e1f22c44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.551 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a35aef-7b4a-49db-b795-fd6c75579131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.554 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[fef8d8c1-b36c-4021-b6a1-fcbd9916575e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 NetworkManager[55036]: <info>  [1763800616.5733] device (tapa309dc7b-60): carrier: link connected
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.580 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c79b4c93-533e-4ced-a6d6-42db72ec7e67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.597 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4854ae3a-5045-4abb-b747-85a31a45bbe6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa309dc7b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:d8:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 246], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 735721, 'reachable_time': 27960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249437, 'error': None, 'target': 'ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.611 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[09799fcf-ac04-434e-a387-80a5025573e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3b:d8ce'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 735721, 'tstamp': 735721}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249438, 'error': None, 'target': 'ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.625 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e1dc491d-7e51-43f3-b232-ba1c55354a3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa309dc7b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:d8:ce'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 246], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 735721, 'reachable_time': 27960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249440, 'error': None, 'target': 'ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.652 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dd37136e-74d7-4a44-ba30-3ba15c870fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.701 186548 DEBUG nova.compute.manager [req-cd164c1b-29b2-4f15-9639-762aa5fc8117 req-e57e93fb-2fe0-47df-8a60-e8c440e0e6c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.702 186548 DEBUG oslo_concurrency.lockutils [req-cd164c1b-29b2-4f15-9639-762aa5fc8117 req-e57e93fb-2fe0-47df-8a60-e8c440e0e6c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.702 186548 DEBUG oslo_concurrency.lockutils [req-cd164c1b-29b2-4f15-9639-762aa5fc8117 req-e57e93fb-2fe0-47df-8a60-e8c440e0e6c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.702 186548 DEBUG oslo_concurrency.lockutils [req-cd164c1b-29b2-4f15-9639-762aa5fc8117 req-e57e93fb-2fe0-47df-8a60-e8c440e0e6c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.702 186548 DEBUG nova.compute.manager [req-cd164c1b-29b2-4f15-9639-762aa5fc8117 req-e57e93fb-2fe0-47df-8a60-e8c440e0e6c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Processing event network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.720 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2259be0e-f6bb-4b43-9f17-0a5db9e856c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.722 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa309dc7b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.722 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.723 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa309dc7b-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.724 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800616.7241704, 94a2fb15-11fa-4fe8-8e6f-92500b5e9427 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.725 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] VM Started (Lifecycle Event)
Nov 22 08:36:56 compute-0 NetworkManager[55036]: <info>  [1763800616.7260] manager: (tapa309dc7b-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Nov 22 08:36:56 compute-0 kernel: tapa309dc7b-60: entered promiscuous mode
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.727 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.729 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa309dc7b-60, col_values=(('external_ids', {'iface-id': '4add315c-1d1e-4b07-9f69-3d9936765ab2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.729 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.730 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 ovn_controller[94843]: 2025-11-22T08:36:56Z|00824|binding|INFO|Releasing lport 4add315c-1d1e-4b07-9f69-3d9936765ab2 from this chassis (sb_readonly=0)
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.734 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.737 186548 INFO nova.virt.libvirt.driver [-] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Instance spawned successfully.
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.737 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.744 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a309dc7b-6d46-436e-98ca-69f041ef0a19.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a309dc7b-6d46-436e-98ca-69f041ef0a19.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.744 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.745 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d0b613-db4f-4313-ab2a-5c3c2e93495e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.746 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-a309dc7b-6d46-436e-98ca-69f041ef0a19
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/a309dc7b-6d46-436e-98ca-69f041ef0a19.pid.haproxy
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID a309dc7b-6d46-436e-98ca-69f041ef0a19
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:36:56 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:36:56.747 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19', 'env', 'PROCESS_TAG=haproxy-a309dc7b-6d46-436e-98ca-69f041ef0a19', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a309dc7b-6d46-436e-98ca-69f041ef0a19.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.747 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.752 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.755 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.755 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.756 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.756 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.757 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.757 186548 DEBUG nova.virt.libvirt.driver [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.781 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.781 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800616.7250555, 94a2fb15-11fa-4fe8-8e6f-92500b5e9427 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.782 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] VM Paused (Lifecycle Event)
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.810 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.814 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800616.7331204, 94a2fb15-11fa-4fe8-8e6f-92500b5e9427 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.814 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] VM Resumed (Lifecycle Event)
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.843 186548 INFO nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Took 3.96 seconds to spawn the instance on the hypervisor.
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.843 186548 DEBUG nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.850 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.853 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.888 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.933 186548 INFO nova.compute.manager [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Took 4.41 seconds to build instance.
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.937 186548 DEBUG nova.network.neutron [req-b5b208f3-7510-48f8-a3ed-7e8f0ff3a092 req-6e072c6d-6859-4279-8268-b56a290a23b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Updated VIF entry in instance network info cache for port 80a8c065-698e-43fe-8221-aecc6a7f9b8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.938 186548 DEBUG nova.network.neutron [req-b5b208f3-7510-48f8-a3ed-7e8f0ff3a092 req-6e072c6d-6859-4279-8268-b56a290a23b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Updating instance_info_cache with network_info: [{"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.953 186548 DEBUG oslo_concurrency.lockutils [req-b5b208f3-7510-48f8-a3ed-7e8f0ff3a092 req-6e072c6d-6859-4279-8268-b56a290a23b1 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:36:56 compute-0 nova_compute[186544]: 2025-11-22 08:36:56.954 186548 DEBUG oslo_concurrency.lockutils [None req-e451a851-8838-4160-8146-fe9d79b883a2 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:57 compute-0 podman[249478]: 2025-11-22 08:36:57.107355125 +0000 UTC m=+0.053241015 container create 728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:36:57 compute-0 systemd[1]: Started libpod-conmon-728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204.scope.
Nov 22 08:36:57 compute-0 podman[249478]: 2025-11-22 08:36:57.07637011 +0000 UTC m=+0.022256020 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:36:57 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:36:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f058d3562c049912a38ff1c22d9d133a5a79e06dd6ba1bbf9fcb8535c5c4995/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:36:57 compute-0 podman[249478]: 2025-11-22 08:36:57.197527159 +0000 UTC m=+0.143413079 container init 728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:36:57 compute-0 podman[249478]: 2025-11-22 08:36:57.203452506 +0000 UTC m=+0.149338386 container start 728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:36:57 compute-0 neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19[249494]: [NOTICE]   (249498) : New worker (249500) forked
Nov 22 08:36:57 compute-0 neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19[249494]: [NOTICE]   (249498) : Loading success.
Nov 22 08:36:58 compute-0 podman[249511]: 2025-11-22 08:36:58.418074046 +0000 UTC m=+0.056227888 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:36:58 compute-0 podman[249509]: 2025-11-22 08:36:58.418097597 +0000 UTC m=+0.063996670 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 22 08:36:58 compute-0 podman[249510]: 2025-11-22 08:36:58.439063374 +0000 UTC m=+0.079733228 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:36:58 compute-0 podman[249515]: 2025-11-22 08:36:58.440231743 +0000 UTC m=+0.075829262 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 22 08:36:58 compute-0 nova_compute[186544]: 2025-11-22 08:36:58.806 186548 DEBUG nova.compute.manager [req-af193dd2-7a9d-4eca-bfff-538c0ec3d9de req-70025bab-2cd3-4ec4-96ae-f897048a76a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:36:58 compute-0 nova_compute[186544]: 2025-11-22 08:36:58.807 186548 DEBUG oslo_concurrency.lockutils [req-af193dd2-7a9d-4eca-bfff-538c0ec3d9de req-70025bab-2cd3-4ec4-96ae-f897048a76a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:36:58 compute-0 nova_compute[186544]: 2025-11-22 08:36:58.807 186548 DEBUG oslo_concurrency.lockutils [req-af193dd2-7a9d-4eca-bfff-538c0ec3d9de req-70025bab-2cd3-4ec4-96ae-f897048a76a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:36:58 compute-0 nova_compute[186544]: 2025-11-22 08:36:58.807 186548 DEBUG oslo_concurrency.lockutils [req-af193dd2-7a9d-4eca-bfff-538c0ec3d9de req-70025bab-2cd3-4ec4-96ae-f897048a76a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:36:58 compute-0 nova_compute[186544]: 2025-11-22 08:36:58.808 186548 DEBUG nova.compute.manager [req-af193dd2-7a9d-4eca-bfff-538c0ec3d9de req-70025bab-2cd3-4ec4-96ae-f897048a76a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] No waiting events found dispatching network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:36:58 compute-0 nova_compute[186544]: 2025-11-22 08:36:58.808 186548 WARNING nova.compute.manager [req-af193dd2-7a9d-4eca-bfff-538c0ec3d9de req-70025bab-2cd3-4ec4-96ae-f897048a76a9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received unexpected event network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b for instance with vm_state active and task_state None.
Nov 22 08:37:00 compute-0 ovn_controller[94843]: 2025-11-22T08:37:00Z|00825|binding|INFO|Releasing lport 4add315c-1d1e-4b07-9f69-3d9936765ab2 from this chassis (sb_readonly=0)
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:00 compute-0 NetworkManager[55036]: <info>  [1763800620.4329] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/386)
Nov 22 08:37:00 compute-0 NetworkManager[55036]: <info>  [1763800620.4341] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/387)
Nov 22 08:37:00 compute-0 ovn_controller[94843]: 2025-11-22T08:37:00Z|00826|binding|INFO|Releasing lport 4add315c-1d1e-4b07-9f69-3d9936765ab2 from this chassis (sb_readonly=0)
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.459 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.463 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.645 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.743 186548 DEBUG nova.compute.manager [req-78d6f72f-c130-467a-a5a0-ad5b83ccdc13 req-be7b0d28-374b-40e0-b188-8a10747a2318 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-changed-80a8c065-698e-43fe-8221-aecc6a7f9b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.743 186548 DEBUG nova.compute.manager [req-78d6f72f-c130-467a-a5a0-ad5b83ccdc13 req-be7b0d28-374b-40e0-b188-8a10747a2318 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Refreshing instance network info cache due to event network-changed-80a8c065-698e-43fe-8221-aecc6a7f9b8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.744 186548 DEBUG oslo_concurrency.lockutils [req-78d6f72f-c130-467a-a5a0-ad5b83ccdc13 req-be7b0d28-374b-40e0-b188-8a10747a2318 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.744 186548 DEBUG oslo_concurrency.lockutils [req-78d6f72f-c130-467a-a5a0-ad5b83ccdc13 req-be7b0d28-374b-40e0-b188-8a10747a2318 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:37:00 compute-0 nova_compute[186544]: 2025-11-22 08:37:00.745 186548 DEBUG nova.network.neutron [req-78d6f72f-c130-467a-a5a0-ad5b83ccdc13 req-be7b0d28-374b-40e0-b188-8a10747a2318 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Refreshing network info cache for port 80a8c065-698e-43fe-8221-aecc6a7f9b8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:37:01 compute-0 nova_compute[186544]: 2025-11-22 08:37:01.520 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:01 compute-0 nova_compute[186544]: 2025-11-22 08:37:01.789 186548 DEBUG nova.network.neutron [req-78d6f72f-c130-467a-a5a0-ad5b83ccdc13 req-be7b0d28-374b-40e0-b188-8a10747a2318 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Updated VIF entry in instance network info cache for port 80a8c065-698e-43fe-8221-aecc6a7f9b8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:37:01 compute-0 nova_compute[186544]: 2025-11-22 08:37:01.789 186548 DEBUG nova.network.neutron [req-78d6f72f-c130-467a-a5a0-ad5b83ccdc13 req-be7b0d28-374b-40e0-b188-8a10747a2318 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Updating instance_info_cache with network_info: [{"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:37:01 compute-0 nova_compute[186544]: 2025-11-22 08:37:01.826 186548 DEBUG oslo_concurrency.lockutils [req-78d6f72f-c130-467a-a5a0-ad5b83ccdc13 req-be7b0d28-374b-40e0-b188-8a10747a2318 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:37:05 compute-0 nova_compute[186544]: 2025-11-22 08:37:05.649 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:06 compute-0 nova_compute[186544]: 2025-11-22 08:37:06.521 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:09 compute-0 ovn_controller[94843]: 2025-11-22T08:37:09Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:07:86 10.100.0.5
Nov 22 08:37:09 compute-0 ovn_controller[94843]: 2025-11-22T08:37:09Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:07:86 10.100.0.5
Nov 22 08:37:09 compute-0 podman[249603]: 2025-11-22 08:37:09.421648336 +0000 UTC m=+0.060632847 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:37:09 compute-0 podman[249625]: 2025-11-22 08:37:09.515366679 +0000 UTC m=+0.062270287 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7)
Nov 22 08:37:09 compute-0 podman[249626]: 2025-11-22 08:37:09.533864465 +0000 UTC m=+0.078760024 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:37:10 compute-0 nova_compute[186544]: 2025-11-22 08:37:10.655 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:11 compute-0 nova_compute[186544]: 2025-11-22 08:37:11.522 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:15 compute-0 nova_compute[186544]: 2025-11-22 08:37:15.486 186548 INFO nova.compute.manager [None req-50f4ce15-efd3-45ba-b0fc-2e3a7afd3b40 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Get console output
Nov 22 08:37:15 compute-0 nova_compute[186544]: 2025-11-22 08:37:15.492 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:37:15 compute-0 nova_compute[186544]: 2025-11-22 08:37:15.657 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:16 compute-0 nova_compute[186544]: 2025-11-22 08:37:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:16 compute-0 nova_compute[186544]: 2025-11-22 08:37:16.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:37:16 compute-0 nova_compute[186544]: 2025-11-22 08:37:16.182 186548 INFO nova.compute.manager [None req-28cfff3a-acb2-44b1-a7c6-b469821f9352 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Get console output
Nov 22 08:37:16 compute-0 nova_compute[186544]: 2025-11-22 08:37:16.187 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:37:16 compute-0 nova_compute[186544]: 2025-11-22 08:37:16.525 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.077 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:17.078 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:37:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:17.080 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.521 186548 DEBUG nova.compute.manager [req-2b2d851e-dea8-45fc-8b73-df75f8afc7f3 req-56e4f06b-0396-47ce-9a93-692ff76fd27f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-changed-80a8c065-698e-43fe-8221-aecc6a7f9b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.521 186548 DEBUG nova.compute.manager [req-2b2d851e-dea8-45fc-8b73-df75f8afc7f3 req-56e4f06b-0396-47ce-9a93-692ff76fd27f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Refreshing instance network info cache due to event network-changed-80a8c065-698e-43fe-8221-aecc6a7f9b8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.521 186548 DEBUG oslo_concurrency.lockutils [req-2b2d851e-dea8-45fc-8b73-df75f8afc7f3 req-56e4f06b-0396-47ce-9a93-692ff76fd27f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.522 186548 DEBUG oslo_concurrency.lockutils [req-2b2d851e-dea8-45fc-8b73-df75f8afc7f3 req-56e4f06b-0396-47ce-9a93-692ff76fd27f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.522 186548 DEBUG nova.network.neutron [req-2b2d851e-dea8-45fc-8b73-df75f8afc7f3 req-56e4f06b-0396-47ce-9a93-692ff76fd27f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Refreshing network info cache for port 80a8c065-698e-43fe-8221-aecc6a7f9b8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.691 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.692 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.692 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.692 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.693 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.702 186548 INFO nova.compute.manager [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Terminating instance
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.708 186548 DEBUG nova.compute.manager [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:37:17 compute-0 kernel: tap80a8c065-69 (unregistering): left promiscuous mode
Nov 22 08:37:17 compute-0 NetworkManager[55036]: <info>  [1763800637.7299] device (tap80a8c065-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:37:17 compute-0 ovn_controller[94843]: 2025-11-22T08:37:17Z|00827|binding|INFO|Releasing lport 80a8c065-698e-43fe-8221-aecc6a7f9b8b from this chassis (sb_readonly=0)
Nov 22 08:37:17 compute-0 ovn_controller[94843]: 2025-11-22T08:37:17Z|00828|binding|INFO|Setting lport 80a8c065-698e-43fe-8221-aecc6a7f9b8b down in Southbound
Nov 22 08:37:17 compute-0 ovn_controller[94843]: 2025-11-22T08:37:17Z|00829|binding|INFO|Removing iface tap80a8c065-69 ovn-installed in OVS
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.740 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.742 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:17.748 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:07:86 10.100.0.5'], port_security=['fa:16:3e:fd:07:86 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '94a2fb15-11fa-4fe8-8e6f-92500b5e9427', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a309dc7b-6d46-436e-98ca-69f041ef0a19', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c1678f38-6764-4fa5-bb59-f6e260db9a36', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=42a78de5-871c-49c7-94ea-05ed25f986e7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=80a8c065-698e-43fe-8221-aecc6a7f9b8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:37:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:17.749 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 80a8c065-698e-43fe-8221-aecc6a7f9b8b in datapath a309dc7b-6d46-436e-98ca-69f041ef0a19 unbound from our chassis
Nov 22 08:37:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:17.749 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a309dc7b-6d46-436e-98ca-69f041ef0a19, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:37:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:17.751 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[68f1d570-7943-4d9f-9de7-e7087cea8519]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:37:17 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:17.751 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19 namespace which is not needed anymore
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.757 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:17 compute-0 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Nov 22 08:37:17 compute-0 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000a8.scope: Consumed 14.178s CPU time.
Nov 22 08:37:17 compute-0 systemd-machined[152872]: Machine qemu-92-instance-000000a8 terminated.
Nov 22 08:37:17 compute-0 neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19[249494]: [NOTICE]   (249498) : haproxy version is 2.8.14-c23fe91
Nov 22 08:37:17 compute-0 neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19[249494]: [NOTICE]   (249498) : path to executable is /usr/sbin/haproxy
Nov 22 08:37:17 compute-0 neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19[249494]: [WARNING]  (249498) : Exiting Master process...
Nov 22 08:37:17 compute-0 neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19[249494]: [ALERT]    (249498) : Current worker (249500) exited with code 143 (Terminated)
Nov 22 08:37:17 compute-0 neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19[249494]: [WARNING]  (249498) : All workers exited. Exiting... (0)
Nov 22 08:37:17 compute-0 systemd[1]: libpod-728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204.scope: Deactivated successfully.
Nov 22 08:37:17 compute-0 conmon[249494]: conmon 728e007ed24153f1bb40 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204.scope/container/memory.events
Nov 22 08:37:17 compute-0 podman[249693]: 2025-11-22 08:37:17.884653839 +0000 UTC m=+0.056171428 container died 728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:37:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204-userdata-shm.mount: Deactivated successfully.
Nov 22 08:37:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f058d3562c049912a38ff1c22d9d133a5a79e06dd6ba1bbf9fcb8535c5c4995-merged.mount: Deactivated successfully.
Nov 22 08:37:17 compute-0 podman[249693]: 2025-11-22 08:37:17.962763085 +0000 UTC m=+0.134280674 container cleanup 728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 22 08:37:17 compute-0 systemd[1]: libpod-conmon-728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204.scope: Deactivated successfully.
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.990 186548 INFO nova.virt.libvirt.driver [-] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Instance destroyed successfully.
Nov 22 08:37:17 compute-0 nova_compute[186544]: 2025-11-22 08:37:17.991 186548 DEBUG nova.objects.instance [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid 94a2fb15-11fa-4fe8-8e6f-92500b5e9427 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.010 186548 DEBUG nova.virt.libvirt.vif [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:36:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1254335037',display_name='tempest-TestNetworkBasicOps-server-1254335037',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1254335037',id=168,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKS4Bq3RhYn6P61tbexKnHy+x4mpnkfKJ+b8cPJcLAiFWB/i29v7tZ+sykfMsx26IoYit63aGhlyJ9x1yhUl8sWeynz+PZ7ms1Uj3yMNOx7SJtmw1GvIdMQrAE3Xyl5QXw==',key_name='tempest-TestNetworkBasicOps-1831406658',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:36:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ms1qjc9e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:36:56Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=94a2fb15-11fa-4fe8-8e6f-92500b5e9427,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.011 186548 DEBUG nova.network.os_vif_util [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.012 186548 DEBUG nova.network.os_vif_util [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:07:86,bridge_name='br-int',has_traffic_filtering=True,id=80a8c065-698e-43fe-8221-aecc6a7f9b8b,network=Network(a309dc7b-6d46-436e-98ca-69f041ef0a19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80a8c065-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.012 186548 DEBUG os_vif [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:07:86,bridge_name='br-int',has_traffic_filtering=True,id=80a8c065-698e-43fe-8221-aecc6a7f9b8b,network=Network(a309dc7b-6d46-436e-98ca-69f041ef0a19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80a8c065-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.014 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.014 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80a8c065-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.016 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.018 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.020 186548 INFO os_vif [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:07:86,bridge_name='br-int',has_traffic_filtering=True,id=80a8c065-698e-43fe-8221-aecc6a7f9b8b,network=Network(a309dc7b-6d46-436e-98ca-69f041ef0a19),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80a8c065-69')
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.021 186548 INFO nova.virt.libvirt.driver [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Deleting instance files /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427_del
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.021 186548 INFO nova.virt.libvirt.driver [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Deletion of /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427_del complete
Nov 22 08:37:18 compute-0 podman[249739]: 2025-11-22 08:37:18.058744774 +0000 UTC m=+0.061127799 container remove 728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.064 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[049b600f-ce07-4c80-acc5-49386ff6b010]: (4, ('Sat Nov 22 08:37:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19 (728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204)\n728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204\nSat Nov 22 08:37:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19 (728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204)\n728e007ed24153f1bb408361e8684853532e5729e78c5d2986742a198b7f2204\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.067 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[680b1295-a6dd-421f-a052-9ed330746c4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.068 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa309dc7b-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.070 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:18 compute-0 kernel: tapa309dc7b-60: left promiscuous mode
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.073 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.075 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a45595b3-b668-412a-8808-ba0088257233]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:37:18 compute-0 nova_compute[186544]: 2025-11-22 08:37:18.083 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.099 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ca366fd2-e1a5-4e6b-beab-05bdb8e856db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.102 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[346df394-86b2-434c-aca0-248f9611bfd4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.117 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ce856580-171e-4d3d-92ec-a46c83ea6bea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 735715, 'reachable_time': 34512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249756, 'error': None, 'target': 'ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.119 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a309dc7b-6d46-436e-98ca-69f041ef0a19 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:37:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:18.119 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[41fa799f-7ea1-4e6e-8dea-f94cc6731b32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:37:18 compute-0 systemd[1]: run-netns-ovnmeta\x2da309dc7b\x2d6d46\x2d436e\x2d98ca\x2d69f041ef0a19.mount: Deactivated successfully.
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.440 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.441 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.441 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.441 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.455 186548 DEBUG nova.compute.manager [req-08e1e4b7-c60d-4715-af72-b63f7e188e90 req-e8159059-ac8d-49ae-a01b-530cb4ecdf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-vif-unplugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.456 186548 DEBUG oslo_concurrency.lockutils [req-08e1e4b7-c60d-4715-af72-b63f7e188e90 req-e8159059-ac8d-49ae-a01b-530cb4ecdf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.456 186548 DEBUG oslo_concurrency.lockutils [req-08e1e4b7-c60d-4715-af72-b63f7e188e90 req-e8159059-ac8d-49ae-a01b-530cb4ecdf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.456 186548 DEBUG oslo_concurrency.lockutils [req-08e1e4b7-c60d-4715-af72-b63f7e188e90 req-e8159059-ac8d-49ae-a01b-530cb4ecdf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.456 186548 DEBUG nova.compute.manager [req-08e1e4b7-c60d-4715-af72-b63f7e188e90 req-e8159059-ac8d-49ae-a01b-530cb4ecdf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] No waiting events found dispatching network-vif-unplugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.457 186548 DEBUG nova.compute.manager [req-08e1e4b7-c60d-4715-af72-b63f7e188e90 req-e8159059-ac8d-49ae-a01b-530cb4ecdf6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-vif-unplugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.515 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-000000a8, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/94a2fb15-11fa-4fe8-8e6f-92500b5e9427/disk
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.523 186548 INFO nova.compute.manager [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Took 1.82 seconds to destroy the instance on the hypervisor.
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.524 186548 DEBUG oslo.service.loopingcall [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.524 186548 DEBUG nova.compute.manager [-] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.524 186548 DEBUG nova.network.neutron [-] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.663 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.664 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5655MB free_disk=73.13055801391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.664 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.664 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.842 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 94a2fb15-11fa-4fe8-8e6f-92500b5e9427 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.843 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:37:19 compute-0 nova_compute[186544]: 2025-11-22 08:37:19.843 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:37:20 compute-0 nova_compute[186544]: 2025-11-22 08:37:20.100 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:37:20 compute-0 nova_compute[186544]: 2025-11-22 08:37:20.115 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:37:20 compute-0 nova_compute[186544]: 2025-11-22 08:37:20.134 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:37:20 compute-0 nova_compute[186544]: 2025-11-22 08:37:20.135 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:37:20 compute-0 nova_compute[186544]: 2025-11-22 08:37:20.953 186548 DEBUG nova.network.neutron [req-2b2d851e-dea8-45fc-8b73-df75f8afc7f3 req-56e4f06b-0396-47ce-9a93-692ff76fd27f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Updated VIF entry in instance network info cache for port 80a8c065-698e-43fe-8221-aecc6a7f9b8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:37:20 compute-0 nova_compute[186544]: 2025-11-22 08:37:20.953 186548 DEBUG nova.network.neutron [req-2b2d851e-dea8-45fc-8b73-df75f8afc7f3 req-56e4f06b-0396-47ce-9a93-692ff76fd27f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Updating instance_info_cache with network_info: [{"id": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "address": "fa:16:3e:fd:07:86", "network": {"id": "a309dc7b-6d46-436e-98ca-69f041ef0a19", "bridge": "br-int", "label": "tempest-network-smoke--625285188", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80a8c065-69", "ovs_interfaceid": "80a8c065-698e-43fe-8221-aecc6a7f9b8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:37:20 compute-0 nova_compute[186544]: 2025-11-22 08:37:20.979 186548 DEBUG oslo_concurrency.lockutils [req-2b2d851e-dea8-45fc-8b73-df75f8afc7f3 req-56e4f06b-0396-47ce-9a93-692ff76fd27f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-94a2fb15-11fa-4fe8-8e6f-92500b5e9427" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:37:21 compute-0 nova_compute[186544]: 2025-11-22 08:37:21.527 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:21 compute-0 nova_compute[186544]: 2025-11-22 08:37:21.652 186548 DEBUG nova.compute.manager [req-f007dbde-92ea-4afd-a297-89cbc8f976e5 req-919b69f5-df24-47ab-852e-5c3d2397a905 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:37:21 compute-0 nova_compute[186544]: 2025-11-22 08:37:21.653 186548 DEBUG oslo_concurrency.lockutils [req-f007dbde-92ea-4afd-a297-89cbc8f976e5 req-919b69f5-df24-47ab-852e-5c3d2397a905 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:37:21 compute-0 nova_compute[186544]: 2025-11-22 08:37:21.653 186548 DEBUG oslo_concurrency.lockutils [req-f007dbde-92ea-4afd-a297-89cbc8f976e5 req-919b69f5-df24-47ab-852e-5c3d2397a905 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:37:21 compute-0 nova_compute[186544]: 2025-11-22 08:37:21.653 186548 DEBUG oslo_concurrency.lockutils [req-f007dbde-92ea-4afd-a297-89cbc8f976e5 req-919b69f5-df24-47ab-852e-5c3d2397a905 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:37:21 compute-0 nova_compute[186544]: 2025-11-22 08:37:21.653 186548 DEBUG nova.compute.manager [req-f007dbde-92ea-4afd-a297-89cbc8f976e5 req-919b69f5-df24-47ab-852e-5c3d2397a905 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] No waiting events found dispatching network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:37:21 compute-0 nova_compute[186544]: 2025-11-22 08:37:21.654 186548 WARNING nova.compute.manager [req-f007dbde-92ea-4afd-a297-89cbc8f976e5 req-919b69f5-df24-47ab-852e-5c3d2397a905 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received unexpected event network-vif-plugged-80a8c065-698e-43fe-8221-aecc6a7f9b8b for instance with vm_state active and task_state deleting.
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.310 186548 DEBUG nova.network.neutron [-] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.335 186548 INFO nova.compute.manager [-] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Took 2.81 seconds to deallocate network for instance.
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.419 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.420 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.434 186548 DEBUG nova.compute.manager [req-e908025d-5cb2-43cb-93af-693865ddd510 req-f874d10f-758b-4938-bc58-ad423ddc4700 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Received event network-vif-deleted-80a8c065-698e-43fe-8221-aecc6a7f9b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.487 186548 DEBUG nova.compute.provider_tree [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.503 186548 DEBUG nova.scheduler.client.report [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.529 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.611 186548 INFO nova.scheduler.client.report [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance 94a2fb15-11fa-4fe8-8e6f-92500b5e9427
Nov 22 08:37:22 compute-0 nova_compute[186544]: 2025-11-22 08:37:22.698 186548 DEBUG oslo_concurrency.lockutils [None req-9ff3b8fc-a7a4-492a-b62f-f62665bdff77 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "94a2fb15-11fa-4fe8-8e6f-92500b5e9427" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:37:23 compute-0 nova_compute[186544]: 2025-11-22 08:37:23.017 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:23.082 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:37:23 compute-0 nova_compute[186544]: 2025-11-22 08:37:23.135 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:23 compute-0 nova_compute[186544]: 2025-11-22 08:37:23.136 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:37:23 compute-0 nova_compute[186544]: 2025-11-22 08:37:23.136 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:37:23 compute-0 nova_compute[186544]: 2025-11-22 08:37:23.155 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:37:23 compute-0 nova_compute[186544]: 2025-11-22 08:37:23.156 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:26 compute-0 nova_compute[186544]: 2025-11-22 08:37:26.530 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:27 compute-0 nova_compute[186544]: 2025-11-22 08:37:27.178 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:28 compute-0 nova_compute[186544]: 2025-11-22 08:37:28.020 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:29 compute-0 nova_compute[186544]: 2025-11-22 08:37:29.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:29 compute-0 nova_compute[186544]: 2025-11-22 08:37:29.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:29 compute-0 podman[249761]: 2025-11-22 08:37:29.414214128 +0000 UTC m=+0.059500429 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:37:29 compute-0 podman[249760]: 2025-11-22 08:37:29.42195134 +0000 UTC m=+0.060770082 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:37:29 compute-0 podman[249762]: 2025-11-22 08:37:29.443734726 +0000 UTC m=+0.085877859 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:37:29 compute-0 podman[249759]: 2025-11-22 08:37:29.443790398 +0000 UTC m=+0.090858033 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Nov 22 08:37:31 compute-0 nova_compute[186544]: 2025-11-22 08:37:31.104 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:31 compute-0 nova_compute[186544]: 2025-11-22 08:37:31.181 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:31 compute-0 nova_compute[186544]: 2025-11-22 08:37:31.531 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:32 compute-0 nova_compute[186544]: 2025-11-22 08:37:32.989 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800637.9881575, 94a2fb15-11fa-4fe8-8e6f-92500b5e9427 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:37:32 compute-0 nova_compute[186544]: 2025-11-22 08:37:32.990 186548 INFO nova.compute.manager [-] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] VM Stopped (Lifecycle Event)
Nov 22 08:37:33 compute-0 nova_compute[186544]: 2025-11-22 08:37:33.022 186548 DEBUG nova.compute.manager [None req-130e7b2a-92b5-40fa-9c17-43893c629376 - - - - - -] [instance: 94a2fb15-11fa-4fe8-8e6f-92500b5e9427] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:37:33 compute-0 nova_compute[186544]: 2025-11-22 08:37:33.023 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:33 compute-0 nova_compute[186544]: 2025-11-22 08:37:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:34 compute-0 nova_compute[186544]: 2025-11-22 08:37:34.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:37:36 compute-0 nova_compute[186544]: 2025-11-22 08:37:36.532 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:37.367 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:37:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:37.368 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:37:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:37:37.368 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:37:38 compute-0 nova_compute[186544]: 2025-11-22 08:37:38.026 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:40 compute-0 podman[249842]: 2025-11-22 08:37:40.40400428 +0000 UTC m=+0.052487416 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:37:40 compute-0 podman[249843]: 2025-11-22 08:37:40.428837013 +0000 UTC m=+0.068779588 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64)
Nov 22 08:37:40 compute-0 podman[249844]: 2025-11-22 08:37:40.438079281 +0000 UTC m=+0.071424044 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:37:41 compute-0 nova_compute[186544]: 2025-11-22 08:37:41.535 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:43 compute-0 nova_compute[186544]: 2025-11-22 08:37:43.029 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:46 compute-0 nova_compute[186544]: 2025-11-22 08:37:46.538 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:48 compute-0 nova_compute[186544]: 2025-11-22 08:37:48.032 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:51 compute-0 nova_compute[186544]: 2025-11-22 08:37:51.539 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:53 compute-0 nova_compute[186544]: 2025-11-22 08:37:53.033 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:56 compute-0 nova_compute[186544]: 2025-11-22 08:37:56.541 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:37:58 compute-0 nova_compute[186544]: 2025-11-22 08:37:58.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:00 compute-0 podman[249909]: 2025-11-22 08:38:00.415253041 +0000 UTC m=+0.050931207 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 08:38:00 compute-0 podman[249908]: 2025-11-22 08:38:00.43057332 +0000 UTC m=+0.077430342 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:38:00 compute-0 podman[249910]: 2025-11-22 08:38:00.452103661 +0000 UTC m=+0.087755206 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:38:00 compute-0 podman[249916]: 2025-11-22 08:38:00.462145379 +0000 UTC m=+0.085307316 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:38:01 compute-0 nova_compute[186544]: 2025-11-22 08:38:01.542 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:03 compute-0 nova_compute[186544]: 2025-11-22 08:38:03.039 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:06 compute-0 nova_compute[186544]: 2025-11-22 08:38:06.544 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:08 compute-0 nova_compute[186544]: 2025-11-22 08:38:08.040 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:11 compute-0 podman[249995]: 2025-11-22 08:38:11.4069885 +0000 UTC m=+0.052507877 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:38:11 compute-0 podman[249997]: 2025-11-22 08:38:11.422239987 +0000 UTC m=+0.057634663 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:38:11 compute-0 podman[249996]: 2025-11-22 08:38:11.435068383 +0000 UTC m=+0.067763933 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Nov 22 08:38:11 compute-0 nova_compute[186544]: 2025-11-22 08:38:11.545 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:12 compute-0 ovn_controller[94843]: 2025-11-22T08:38:12Z|00830|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 22 08:38:13 compute-0 nova_compute[186544]: 2025-11-22 08:38:13.044 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:16 compute-0 nova_compute[186544]: 2025-11-22 08:38:16.546 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:17 compute-0 nova_compute[186544]: 2025-11-22 08:38:17.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:17 compute-0 nova_compute[186544]: 2025-11-22 08:38:17.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:38:18 compute-0 nova_compute[186544]: 2025-11-22 08:38:18.047 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:19.332 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:38:19 compute-0 nova_compute[186544]: 2025-11-22 08:38:19.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:19.334 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:38:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:19.334 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.220 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.220 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.220 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.221 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.376 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.377 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5701MB free_disk=73.13058853149414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.378 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.378 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.480 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.480 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.512 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.529 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.548 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.861 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:38:21 compute-0 nova_compute[186544]: 2025-11-22 08:38:21.861 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:22 compute-0 nova_compute[186544]: 2025-11-22 08:38:22.862 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:22 compute-0 nova_compute[186544]: 2025-11-22 08:38:22.863 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:38:22 compute-0 nova_compute[186544]: 2025-11-22 08:38:22.863 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:38:22 compute-0 nova_compute[186544]: 2025-11-22 08:38:22.876 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:38:23 compute-0 nova_compute[186544]: 2025-11-22 08:38:23.048 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:23 compute-0 nova_compute[186544]: 2025-11-22 08:38:23.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:26 compute-0 nova_compute[186544]: 2025-11-22 08:38:26.549 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:28 compute-0 nova_compute[186544]: 2025-11-22 08:38:28.052 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:28 compute-0 nova_compute[186544]: 2025-11-22 08:38:28.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:31 compute-0 nova_compute[186544]: 2025-11-22 08:38:31.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:31 compute-0 nova_compute[186544]: 2025-11-22 08:38:31.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:31 compute-0 podman[250062]: 2025-11-22 08:38:31.40554152 +0000 UTC m=+0.044966780 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:38:31 compute-0 podman[250061]: 2025-11-22 08:38:31.407068668 +0000 UTC m=+0.049857491 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:38:31 compute-0 podman[250060]: 2025-11-22 08:38:31.429655445 +0000 UTC m=+0.076081798 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 08:38:31 compute-0 podman[250063]: 2025-11-22 08:38:31.436436253 +0000 UTC m=+0.072856309 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller)
Nov 22 08:38:31 compute-0 nova_compute[186544]: 2025-11-22 08:38:31.551 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:33 compute-0 nova_compute[186544]: 2025-11-22 08:38:33.055 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:34 compute-0 nova_compute[186544]: 2025-11-22 08:38:34.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:35 compute-0 nova_compute[186544]: 2025-11-22 08:38:35.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:36 compute-0 nova_compute[186544]: 2025-11-22 08:38:36.552 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.603 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:38:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:38:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:37.369 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:37.369 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:37.369 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:38 compute-0 nova_compute[186544]: 2025-11-22 08:38:38.057 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:41 compute-0 nova_compute[186544]: 2025-11-22 08:38:41.554 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:41 compute-0 nova_compute[186544]: 2025-11-22 08:38:41.859 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:41 compute-0 nova_compute[186544]: 2025-11-22 08:38:41.859 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:41 compute-0 nova_compute[186544]: 2025-11-22 08:38:41.881 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:38:41 compute-0 nova_compute[186544]: 2025-11-22 08:38:41.982 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:41 compute-0 nova_compute[186544]: 2025-11-22 08:38:41.983 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:41 compute-0 nova_compute[186544]: 2025-11-22 08:38:41.990 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:38:41 compute-0 nova_compute[186544]: 2025-11-22 08:38:41.991 186548 INFO nova.compute.claims [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.099 186548 DEBUG nova.compute.provider_tree [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.112 186548 DEBUG nova.scheduler.client.report [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.139 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.140 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.226 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.227 186548 DEBUG nova.network.neutron [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.250 186548 INFO nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.271 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.388 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.390 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.390 186548 INFO nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Creating image(s)
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.391 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.391 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.392 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.407 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:38:42 compute-0 podman[250145]: 2025-11-22 08:38:42.422498501 +0000 UTC m=+0.061908839 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 08:38:42 compute-0 podman[250144]: 2025-11-22 08:38:42.427082744 +0000 UTC m=+0.067693251 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:38:42 compute-0 podman[250146]: 2025-11-22 08:38:42.432468847 +0000 UTC m=+0.070267924 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.466 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.467 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.467 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.479 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.528 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.529 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.671 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk 1073741824" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.672 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.673 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.745 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.746 186548 DEBUG nova.virt.disk.api [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.747 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.816 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.817 186548 DEBUG nova.virt.disk.api [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.818 186548 DEBUG nova.objects.instance [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid d7f88d47-758b-4f1e-8cbc-1022ae095c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.830 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.831 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Ensure instance console log exists: /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.831 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.832 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:42 compute-0 nova_compute[186544]: 2025-11-22 08:38:42.832 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:43 compute-0 nova_compute[186544]: 2025-11-22 08:38:43.061 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:43 compute-0 nova_compute[186544]: 2025-11-22 08:38:43.111 186548 DEBUG nova.policy [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:38:44 compute-0 nova_compute[186544]: 2025-11-22 08:38:44.309 186548 DEBUG nova.network.neutron [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Successfully created port: 47c88170-7066-4279-a4b5-426679ae639e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:38:45 compute-0 nova_compute[186544]: 2025-11-22 08:38:45.742 186548 DEBUG nova.network.neutron [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Successfully updated port: 47c88170-7066-4279-a4b5-426679ae639e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:38:45 compute-0 nova_compute[186544]: 2025-11-22 08:38:45.763 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:38:45 compute-0 nova_compute[186544]: 2025-11-22 08:38:45.763 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:38:45 compute-0 nova_compute[186544]: 2025-11-22 08:38:45.763 186548 DEBUG nova.network.neutron [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:38:45 compute-0 nova_compute[186544]: 2025-11-22 08:38:45.952 186548 DEBUG nova.network.neutron [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:38:46 compute-0 nova_compute[186544]: 2025-11-22 08:38:46.221 186548 DEBUG nova.compute.manager [req-443d74fc-85e9-4a27-9d8c-acd8ab4fdc15 req-5fddeaac-dc9f-43aa-aa2d-c8c105ca1529 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received event network-changed-47c88170-7066-4279-a4b5-426679ae639e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:38:46 compute-0 nova_compute[186544]: 2025-11-22 08:38:46.222 186548 DEBUG nova.compute.manager [req-443d74fc-85e9-4a27-9d8c-acd8ab4fdc15 req-5fddeaac-dc9f-43aa-aa2d-c8c105ca1529 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Refreshing instance network info cache due to event network-changed-47c88170-7066-4279-a4b5-426679ae639e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:38:46 compute-0 nova_compute[186544]: 2025-11-22 08:38:46.223 186548 DEBUG oslo_concurrency.lockutils [req-443d74fc-85e9-4a27-9d8c-acd8ab4fdc15 req-5fddeaac-dc9f-43aa-aa2d-c8c105ca1529 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:38:46 compute-0 nova_compute[186544]: 2025-11-22 08:38:46.556 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.439 186548 DEBUG nova.network.neutron [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Updating instance_info_cache with network_info: [{"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.527 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.528 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Instance network_info: |[{"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.528 186548 DEBUG oslo_concurrency.lockutils [req-443d74fc-85e9-4a27-9d8c-acd8ab4fdc15 req-5fddeaac-dc9f-43aa-aa2d-c8c105ca1529 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.528 186548 DEBUG nova.network.neutron [req-443d74fc-85e9-4a27-9d8c-acd8ab4fdc15 req-5fddeaac-dc9f-43aa-aa2d-c8c105ca1529 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Refreshing network info cache for port 47c88170-7066-4279-a4b5-426679ae639e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.531 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Start _get_guest_xml network_info=[{"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.536 186548 WARNING nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.540 186548 DEBUG nova.virt.libvirt.host [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.540 186548 DEBUG nova.virt.libvirt.host [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.543 186548 DEBUG nova.virt.libvirt.host [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.543 186548 DEBUG nova.virt.libvirt.host [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.544 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.544 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.545 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.545 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.545 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.545 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.545 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.546 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.546 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.546 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.546 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.547 186548 DEBUG nova.virt.hardware [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.550 186548 DEBUG nova.virt.libvirt.vif [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:38:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1957778348',display_name='tempest-TestNetworkBasicOps-server-1957778348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1957778348',id=170,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7tKwusAeHAUK7U642daFpFc4LPNGlMqvYBLcrsBO0wxndhCPGKF5E9erF11Y1uzrtFSzuqQKR4N2xaPllgRybKGwWs/5jFcD9DafsNojTxDHAORsoKdzlApC3KcKT7ZA==',key_name='tempest-TestNetworkBasicOps-550751567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-0wwbka4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:38:42Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=d7f88d47-758b-4f1e-8cbc-1022ae095c46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.550 186548 DEBUG nova.network.os_vif_util [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.551 186548 DEBUG nova.network.os_vif_util [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:ce:c8,bridge_name='br-int',has_traffic_filtering=True,id=47c88170-7066-4279-a4b5-426679ae639e,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c88170-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.551 186548 DEBUG nova.objects.instance [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid d7f88d47-758b-4f1e-8cbc-1022ae095c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.569 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <uuid>d7f88d47-758b-4f1e-8cbc-1022ae095c46</uuid>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <name>instance-000000aa</name>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkBasicOps-server-1957778348</nova:name>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:38:47</nova:creationTime>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:38:47 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:38:47 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:38:47 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:38:47 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:38:47 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:38:47 compute-0 nova_compute[186544]:         <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:38:47 compute-0 nova_compute[186544]:         <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:38:47 compute-0 nova_compute[186544]:         <nova:port uuid="47c88170-7066-4279-a4b5-426679ae639e">
Nov 22 08:38:47 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <system>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <entry name="serial">d7f88d47-758b-4f1e-8cbc-1022ae095c46</entry>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <entry name="uuid">d7f88d47-758b-4f1e-8cbc-1022ae095c46</entry>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </system>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <os>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   </os>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <features>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   </features>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk.config"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:13:ce:c8"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <target dev="tap47c88170-70"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/console.log" append="off"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <video>
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </video>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:38:47 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:38:47 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:38:47 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:38:47 compute-0 nova_compute[186544]: </domain>
Nov 22 08:38:47 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.571 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Preparing to wait for external event network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.571 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.571 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.571 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.572 186548 DEBUG nova.virt.libvirt.vif [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:38:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1957778348',display_name='tempest-TestNetworkBasicOps-server-1957778348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1957778348',id=170,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7tKwusAeHAUK7U642daFpFc4LPNGlMqvYBLcrsBO0wxndhCPGKF5E9erF11Y1uzrtFSzuqQKR4N2xaPllgRybKGwWs/5jFcD9DafsNojTxDHAORsoKdzlApC3KcKT7ZA==',key_name='tempest-TestNetworkBasicOps-550751567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-0wwbka4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:38:42Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=d7f88d47-758b-4f1e-8cbc-1022ae095c46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.572 186548 DEBUG nova.network.os_vif_util [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.573 186548 DEBUG nova.network.os_vif_util [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:ce:c8,bridge_name='br-int',has_traffic_filtering=True,id=47c88170-7066-4279-a4b5-426679ae639e,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c88170-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.573 186548 DEBUG os_vif [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:ce:c8,bridge_name='br-int',has_traffic_filtering=True,id=47c88170-7066-4279-a4b5-426679ae639e,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c88170-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.573 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.574 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.574 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.576 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.576 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47c88170-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.577 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47c88170-70, col_values=(('external_ids', {'iface-id': '47c88170-7066-4279-a4b5-426679ae639e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:ce:c8', 'vm-uuid': 'd7f88d47-758b-4f1e-8cbc-1022ae095c46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.578 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:47 compute-0 NetworkManager[55036]: <info>  [1763800727.5793] manager: (tap47c88170-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/388)
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.581 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.583 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.584 186548 INFO os_vif [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:ce:c8,bridge_name='br-int',has_traffic_filtering=True,id=47c88170-7066-4279-a4b5-426679ae639e,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c88170-70')
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.706 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.708 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.710 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:13:ce:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:38:47 compute-0 nova_compute[186544]: 2025-11-22 08:38:47.711 186548 INFO nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Using config drive
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.009 186548 INFO nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Creating config drive at /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk.config
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.013 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq91reecs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.141 186548 DEBUG oslo_concurrency.processutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq91reecs" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:38:48 compute-0 kernel: tap47c88170-70: entered promiscuous mode
Nov 22 08:38:48 compute-0 NetworkManager[55036]: <info>  [1763800728.2025] manager: (tap47c88170-70): new Tun device (/org/freedesktop/NetworkManager/Devices/389)
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.203 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:48 compute-0 ovn_controller[94843]: 2025-11-22T08:38:48Z|00831|binding|INFO|Claiming lport 47c88170-7066-4279-a4b5-426679ae639e for this chassis.
Nov 22 08:38:48 compute-0 ovn_controller[94843]: 2025-11-22T08:38:48Z|00832|binding|INFO|47c88170-7066-4279-a4b5-426679ae639e: Claiming fa:16:3e:13:ce:c8 10.100.0.7
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.207 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.227 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:ce:c8 10.100.0.7'], port_security=['fa:16:3e:13:ce:c8 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd7f88d47-758b-4f1e-8cbc-1022ae095c46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a869b57-a476-415c-ba30-ce607e13fca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'eee0946b-cdb5-4e55-983c-475283a7e66d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f98a1991-00c9-465f-9246-5677a61cda3d, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=47c88170-7066-4279-a4b5-426679ae639e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.229 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 47c88170-7066-4279-a4b5-426679ae639e in datapath 8a869b57-a476-415c-ba30-ce607e13fca8 bound to our chassis
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.230 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8a869b57-a476-415c-ba30-ce607e13fca8
Nov 22 08:38:48 compute-0 systemd-udevd[250241]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.242 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[79e08931-d7e1-440b-9b72-6587391031d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.243 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8a869b57-a1 in ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.244 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8a869b57-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.245 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b1143a00-0adc-4266-bd46-36c7a39b3864]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.245 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[316d26b9-aef3-4894-b5f3-c87b7cb107ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 systemd-machined[152872]: New machine qemu-93-instance-000000aa.
Nov 22 08:38:48 compute-0 NetworkManager[55036]: <info>  [1763800728.2532] device (tap47c88170-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:38:48 compute-0 NetworkManager[55036]: <info>  [1763800728.2545] device (tap47c88170-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.257 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[34ebf73c-673a-4e74-b98c-ed199fc9a7a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.277 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:48 compute-0 systemd[1]: Started Virtual Machine qemu-93-instance-000000aa.
Nov 22 08:38:48 compute-0 ovn_controller[94843]: 2025-11-22T08:38:48Z|00833|binding|INFO|Setting lport 47c88170-7066-4279-a4b5-426679ae639e ovn-installed in OVS
Nov 22 08:38:48 compute-0 ovn_controller[94843]: 2025-11-22T08:38:48Z|00834|binding|INFO|Setting lport 47c88170-7066-4279-a4b5-426679ae639e up in Southbound
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.285 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.287 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3f19c8e8-8f25-425e-9dc4-afd111a29705]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.312 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[41a6e396-f7ea-41c2-90b8-3bbff71397d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 NetworkManager[55036]: <info>  [1763800728.3180] manager: (tap8a869b57-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/390)
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.318 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7585fcf5-e981-4241-a482-24ab0c625d5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.348 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[72f85c15-a929-4dc3-98d9-1aa5291a9597]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.350 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a43cb1db-3171-463c-a2fc-c92495fc73e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 NetworkManager[55036]: <info>  [1763800728.3731] device (tap8a869b57-a0): carrier: link connected
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.381 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4450e45e-9abe-477d-93c5-411df327cf75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.401 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[375e3e3a-d317-45ff-9fd2-84bcbff8f452]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8a869b57-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:0a:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 249], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746901, 'reachable_time': 29389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250275, 'error': None, 'target': 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.416 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f49cb9ea-b77c-41ba-a4a9-2c11ab937edf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:abf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 746901, 'tstamp': 746901}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250276, 'error': None, 'target': 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.432 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4c451a8a-0c25-42ca-968c-d58138e12a73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8a869b57-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:0a:bf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 249], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746901, 'reachable_time': 29389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250277, 'error': None, 'target': 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.465 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d89324-307b-493f-bbe4-ebd96c9d432a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.517 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f847510a-ea78-447c-adc2-41509827b73b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.518 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a869b57-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.518 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.519 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a869b57-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:38:48 compute-0 kernel: tap8a869b57-a0: entered promiscuous mode
Nov 22 08:38:48 compute-0 NetworkManager[55036]: <info>  [1763800728.5213] manager: (tap8a869b57-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/391)
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.520 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.526 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8a869b57-a0, col_values=(('external_ids', {'iface-id': 'c85ad4c8-d148-4415-be4e-a8012749d2ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:38:48 compute-0 ovn_controller[94843]: 2025-11-22T08:38:48Z|00835|binding|INFO|Releasing lport c85ad4c8-d148-4415-be4e-a8012749d2ac from this chassis (sb_readonly=0)
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.528 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.530 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8a869b57-a476-415c-ba30-ce607e13fca8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8a869b57-a476-415c-ba30-ce607e13fca8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.531 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8810741b-93d9-412d-b0f0-ec48a71009c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.532 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-8a869b57-a476-415c-ba30-ce607e13fca8
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/8a869b57-a476-415c-ba30-ce607e13fca8.pid.haproxy
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 8a869b57-a476-415c-ba30-ce607e13fca8
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:38:48 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:38:48.534 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'env', 'PROCESS_TAG=haproxy-8a869b57-a476-415c-ba30-ce607e13fca8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8a869b57-a476-415c-ba30-ce607e13fca8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.540 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.769 186548 DEBUG nova.network.neutron [req-443d74fc-85e9-4a27-9d8c-acd8ab4fdc15 req-5fddeaac-dc9f-43aa-aa2d-c8c105ca1529 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Updated VIF entry in instance network info cache for port 47c88170-7066-4279-a4b5-426679ae639e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.770 186548 DEBUG nova.network.neutron [req-443d74fc-85e9-4a27-9d8c-acd8ab4fdc15 req-5fddeaac-dc9f-43aa-aa2d-c8c105ca1529 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Updating instance_info_cache with network_info: [{"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.786 186548 DEBUG oslo_concurrency.lockutils [req-443d74fc-85e9-4a27-9d8c-acd8ab4fdc15 req-5fddeaac-dc9f-43aa-aa2d-c8c105ca1529 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.911 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800728.910944, d7f88d47-758b-4f1e-8cbc-1022ae095c46 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.912 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] VM Started (Lifecycle Event)
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.940 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.944 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800728.9110508, d7f88d47-758b-4f1e-8cbc-1022ae095c46 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.944 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] VM Paused (Lifecycle Event)
Nov 22 08:38:48 compute-0 podman[250311]: 2025-11-22 08:38:48.856249104 +0000 UTC m=+0.022507347 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.963 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.966 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:38:48 compute-0 podman[250311]: 2025-11-22 08:38:48.986023855 +0000 UTC m=+0.152282068 container create f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:38:48 compute-0 nova_compute[186544]: 2025-11-22 08:38:48.987 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.066 186548 DEBUG nova.compute.manager [req-a93156d5-d1f1-4d46-b5f8-18ea31519e3e req-497ebc82-f924-4e40-9209-b3ab67afdf0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received event network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.067 186548 DEBUG oslo_concurrency.lockutils [req-a93156d5-d1f1-4d46-b5f8-18ea31519e3e req-497ebc82-f924-4e40-9209-b3ab67afdf0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.067 186548 DEBUG oslo_concurrency.lockutils [req-a93156d5-d1f1-4d46-b5f8-18ea31519e3e req-497ebc82-f924-4e40-9209-b3ab67afdf0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.068 186548 DEBUG oslo_concurrency.lockutils [req-a93156d5-d1f1-4d46-b5f8-18ea31519e3e req-497ebc82-f924-4e40-9209-b3ab67afdf0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.068 186548 DEBUG nova.compute.manager [req-a93156d5-d1f1-4d46-b5f8-18ea31519e3e req-497ebc82-f924-4e40-9209-b3ab67afdf0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Processing event network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.070 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.073 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800729.0737107, d7f88d47-758b-4f1e-8cbc-1022ae095c46 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.074 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] VM Resumed (Lifecycle Event)
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.076 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:38:49 compute-0 systemd[1]: Started libpod-conmon-f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7.scope.
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.082 186548 INFO nova.virt.libvirt.driver [-] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Instance spawned successfully.
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.084 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.104 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:38:49 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.107 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:38:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e50d907093cc2ca434c2659f83dd121200f330984075217940a22ec25e15590/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.121 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.122 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.123 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.124 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.124 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.125 186548 DEBUG nova.virt.libvirt.driver [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.130 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:38:49 compute-0 podman[250311]: 2025-11-22 08:38:49.192926691 +0000 UTC m=+0.359184924 container init f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:38:49 compute-0 podman[250311]: 2025-11-22 08:38:49.200323073 +0000 UTC m=+0.366581286 container start f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 08:38:49 compute-0 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[250331]: [NOTICE]   (250335) : New worker (250337) forked
Nov 22 08:38:49 compute-0 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[250331]: [NOTICE]   (250335) : Loading success.
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.690 186548 INFO nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Took 7.30 seconds to spawn the instance on the hypervisor.
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.691 186548 DEBUG nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.815 186548 INFO nova.compute.manager [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Took 7.87 seconds to build instance.
Nov 22 08:38:49 compute-0 nova_compute[186544]: 2025-11-22 08:38:49.841 186548 DEBUG oslo_concurrency.lockutils [None req-289d451a-366b-40d4-99f7-bf19caf4010d 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:51 compute-0 nova_compute[186544]: 2025-11-22 08:38:51.251 186548 DEBUG nova.compute.manager [req-43077e03-4325-4cb8-8b6d-e4bdd55c9716 req-aa82bfa0-4091-4dbc-8e16-37365cf04a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received event network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:38:51 compute-0 nova_compute[186544]: 2025-11-22 08:38:51.252 186548 DEBUG oslo_concurrency.lockutils [req-43077e03-4325-4cb8-8b6d-e4bdd55c9716 req-aa82bfa0-4091-4dbc-8e16-37365cf04a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:38:51 compute-0 nova_compute[186544]: 2025-11-22 08:38:51.252 186548 DEBUG oslo_concurrency.lockutils [req-43077e03-4325-4cb8-8b6d-e4bdd55c9716 req-aa82bfa0-4091-4dbc-8e16-37365cf04a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:38:51 compute-0 nova_compute[186544]: 2025-11-22 08:38:51.252 186548 DEBUG oslo_concurrency.lockutils [req-43077e03-4325-4cb8-8b6d-e4bdd55c9716 req-aa82bfa0-4091-4dbc-8e16-37365cf04a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:38:51 compute-0 nova_compute[186544]: 2025-11-22 08:38:51.252 186548 DEBUG nova.compute.manager [req-43077e03-4325-4cb8-8b6d-e4bdd55c9716 req-aa82bfa0-4091-4dbc-8e16-37365cf04a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] No waiting events found dispatching network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:38:51 compute-0 nova_compute[186544]: 2025-11-22 08:38:51.252 186548 WARNING nova.compute.manager [req-43077e03-4325-4cb8-8b6d-e4bdd55c9716 req-aa82bfa0-4091-4dbc-8e16-37365cf04a2f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received unexpected event network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e for instance with vm_state active and task_state None.
Nov 22 08:38:51 compute-0 nova_compute[186544]: 2025-11-22 08:38:51.557 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:52 compute-0 nova_compute[186544]: 2025-11-22 08:38:52.579 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:55 compute-0 NetworkManager[55036]: <info>  [1763800735.9636] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/392)
Nov 22 08:38:55 compute-0 NetworkManager[55036]: <info>  [1763800735.9643] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/393)
Nov 22 08:38:55 compute-0 ovn_controller[94843]: 2025-11-22T08:38:55Z|00836|binding|INFO|Releasing lport c85ad4c8-d148-4415-be4e-a8012749d2ac from this chassis (sb_readonly=0)
Nov 22 08:38:55 compute-0 nova_compute[186544]: 2025-11-22 08:38:55.973 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:55 compute-0 ovn_controller[94843]: 2025-11-22T08:38:55Z|00837|binding|INFO|Releasing lport c85ad4c8-d148-4415-be4e-a8012749d2ac from this chassis (sb_readonly=0)
Nov 22 08:38:55 compute-0 nova_compute[186544]: 2025-11-22 08:38:55.988 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:55 compute-0 nova_compute[186544]: 2025-11-22 08:38:55.993 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:56 compute-0 nova_compute[186544]: 2025-11-22 08:38:56.527 186548 DEBUG nova.compute.manager [req-94a2c8dd-5f32-4ad1-bfd8-d0dfb1b10fd3 req-ed551aaf-174e-4566-9a57-137b4f0a00b2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received event network-changed-47c88170-7066-4279-a4b5-426679ae639e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:38:56 compute-0 nova_compute[186544]: 2025-11-22 08:38:56.527 186548 DEBUG nova.compute.manager [req-94a2c8dd-5f32-4ad1-bfd8-d0dfb1b10fd3 req-ed551aaf-174e-4566-9a57-137b4f0a00b2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Refreshing instance network info cache due to event network-changed-47c88170-7066-4279-a4b5-426679ae639e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:38:56 compute-0 nova_compute[186544]: 2025-11-22 08:38:56.527 186548 DEBUG oslo_concurrency.lockutils [req-94a2c8dd-5f32-4ad1-bfd8-d0dfb1b10fd3 req-ed551aaf-174e-4566-9a57-137b4f0a00b2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:38:56 compute-0 nova_compute[186544]: 2025-11-22 08:38:56.527 186548 DEBUG oslo_concurrency.lockutils [req-94a2c8dd-5f32-4ad1-bfd8-d0dfb1b10fd3 req-ed551aaf-174e-4566-9a57-137b4f0a00b2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:38:56 compute-0 nova_compute[186544]: 2025-11-22 08:38:56.527 186548 DEBUG nova.network.neutron [req-94a2c8dd-5f32-4ad1-bfd8-d0dfb1b10fd3 req-ed551aaf-174e-4566-9a57-137b4f0a00b2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Refreshing network info cache for port 47c88170-7066-4279-a4b5-426679ae639e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:38:56 compute-0 nova_compute[186544]: 2025-11-22 08:38:56.559 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:38:57 compute-0 nova_compute[186544]: 2025-11-22 08:38:57.581 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:00 compute-0 nova_compute[186544]: 2025-11-22 08:39:00.203 186548 DEBUG nova.network.neutron [req-94a2c8dd-5f32-4ad1-bfd8-d0dfb1b10fd3 req-ed551aaf-174e-4566-9a57-137b4f0a00b2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Updated VIF entry in instance network info cache for port 47c88170-7066-4279-a4b5-426679ae639e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:39:00 compute-0 nova_compute[186544]: 2025-11-22 08:39:00.204 186548 DEBUG nova.network.neutron [req-94a2c8dd-5f32-4ad1-bfd8-d0dfb1b10fd3 req-ed551aaf-174e-4566-9a57-137b4f0a00b2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Updating instance_info_cache with network_info: [{"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:39:00 compute-0 nova_compute[186544]: 2025-11-22 08:39:00.239 186548 DEBUG oslo_concurrency.lockutils [req-94a2c8dd-5f32-4ad1-bfd8-d0dfb1b10fd3 req-ed551aaf-174e-4566-9a57-137b4f0a00b2 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-d7f88d47-758b-4f1e-8cbc-1022ae095c46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:39:01 compute-0 nova_compute[186544]: 2025-11-22 08:39:01.561 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:02 compute-0 podman[250361]: 2025-11-22 08:39:02.415657678 +0000 UTC m=+0.054764572 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:39:02 compute-0 podman[250359]: 2025-11-22 08:39:02.422167219 +0000 UTC m=+0.068530352 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 08:39:02 compute-0 podman[250360]: 2025-11-22 08:39:02.440094 +0000 UTC m=+0.083733676 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 22 08:39:02 compute-0 podman[250362]: 2025-11-22 08:39:02.459138331 +0000 UTC m=+0.094128014 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:39:02 compute-0 nova_compute[186544]: 2025-11-22 08:39:02.583 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:02 compute-0 ovn_controller[94843]: 2025-11-22T08:39:02Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:13:ce:c8 10.100.0.7
Nov 22 08:39:02 compute-0 ovn_controller[94843]: 2025-11-22T08:39:02Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:13:ce:c8 10.100.0.7
Nov 22 08:39:06 compute-0 nova_compute[186544]: 2025-11-22 08:39:06.564 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:07 compute-0 nova_compute[186544]: 2025-11-22 08:39:07.586 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:09 compute-0 nova_compute[186544]: 2025-11-22 08:39:09.170 186548 INFO nova.compute.manager [None req-a0f8b900-54be-4a4c-92e5-da7d6ac8ffca 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Get console output
Nov 22 08:39:09 compute-0 nova_compute[186544]: 2025-11-22 08:39:09.176 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.670 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.670 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.670 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.671 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.671 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.681 186548 INFO nova.compute.manager [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Terminating instance
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.686 186548 DEBUG nova.compute.manager [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:39:10 compute-0 kernel: tap47c88170-70 (unregistering): left promiscuous mode
Nov 22 08:39:10 compute-0 NetworkManager[55036]: <info>  [1763800750.7072] device (tap47c88170-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.715 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:10 compute-0 ovn_controller[94843]: 2025-11-22T08:39:10Z|00838|binding|INFO|Releasing lport 47c88170-7066-4279-a4b5-426679ae639e from this chassis (sb_readonly=0)
Nov 22 08:39:10 compute-0 ovn_controller[94843]: 2025-11-22T08:39:10Z|00839|binding|INFO|Setting lport 47c88170-7066-4279-a4b5-426679ae639e down in Southbound
Nov 22 08:39:10 compute-0 ovn_controller[94843]: 2025-11-22T08:39:10Z|00840|binding|INFO|Removing iface tap47c88170-70 ovn-installed in OVS
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.719 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.732 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:10 compute-0 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Nov 22 08:39:10 compute-0 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000aa.scope: Consumed 14.587s CPU time.
Nov 22 08:39:10 compute-0 systemd-machined[152872]: Machine qemu-93-instance-000000aa terminated.
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.951 186548 INFO nova.virt.libvirt.driver [-] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Instance destroyed successfully.
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.952 186548 DEBUG nova.objects.instance [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid d7f88d47-758b-4f1e-8cbc-1022ae095c46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.967 186548 DEBUG nova.virt.libvirt.vif [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:38:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1957778348',display_name='tempest-TestNetworkBasicOps-server-1957778348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1957778348',id=170,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7tKwusAeHAUK7U642daFpFc4LPNGlMqvYBLcrsBO0wxndhCPGKF5E9erF11Y1uzrtFSzuqQKR4N2xaPllgRybKGwWs/5jFcD9DafsNojTxDHAORsoKdzlApC3KcKT7ZA==',key_name='tempest-TestNetworkBasicOps-550751567',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:38:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-0wwbka4i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:38:49Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=d7f88d47-758b-4f1e-8cbc-1022ae095c46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.968 186548 DEBUG nova.network.os_vif_util [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "47c88170-7066-4279-a4b5-426679ae639e", "address": "fa:16:3e:13:ce:c8", "network": {"id": "8a869b57-a476-415c-ba30-ce607e13fca8", "bridge": "br-int", "label": "tempest-network-smoke--688309089", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47c88170-70", "ovs_interfaceid": "47c88170-7066-4279-a4b5-426679ae639e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.968 186548 DEBUG nova.network.os_vif_util [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:ce:c8,bridge_name='br-int',has_traffic_filtering=True,id=47c88170-7066-4279-a4b5-426679ae639e,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c88170-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.969 186548 DEBUG os_vif [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:ce:c8,bridge_name='br-int',has_traffic_filtering=True,id=47c88170-7066-4279-a4b5-426679ae639e,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c88170-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.972 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47c88170-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.973 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.974 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.977 186548 INFO os_vif [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:ce:c8,bridge_name='br-int',has_traffic_filtering=True,id=47c88170-7066-4279-a4b5-426679ae639e,network=Network(8a869b57-a476-415c-ba30-ce607e13fca8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47c88170-70')
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.977 186548 INFO nova.virt.libvirt.driver [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Deleting instance files /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46_del
Nov 22 08:39:10 compute-0 nova_compute[186544]: 2025-11-22 08:39:10.978 186548 INFO nova.virt.libvirt.driver [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Deletion of /var/lib/nova/instances/d7f88d47-758b-4f1e-8cbc-1022ae095c46_del complete
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.018 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:ce:c8 10.100.0.7'], port_security=['fa:16:3e:13:ce:c8 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd7f88d47-758b-4f1e-8cbc-1022ae095c46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a869b57-a476-415c-ba30-ce607e13fca8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'eee0946b-cdb5-4e55-983c-475283a7e66d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f98a1991-00c9-465f-9246-5677a61cda3d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=47c88170-7066-4279-a4b5-426679ae639e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.020 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 47c88170-7066-4279-a4b5-426679ae639e in datapath 8a869b57-a476-415c-ba30-ce607e13fca8 unbound from our chassis
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.021 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8a869b57-a476-415c-ba30-ce607e13fca8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.022 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f1cb0ba4-fea6-4cda-bed1-1a89d7abd68f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.022 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 namespace which is not needed anymore
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.226 186548 INFO nova.compute.manager [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Took 0.54 seconds to destroy the instance on the hypervisor.
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.226 186548 DEBUG oslo.service.loopingcall [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.226 186548 DEBUG nova.compute.manager [-] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.227 186548 DEBUG nova.network.neutron [-] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:39:11 compute-0 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[250331]: [NOTICE]   (250335) : haproxy version is 2.8.14-c23fe91
Nov 22 08:39:11 compute-0 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[250331]: [NOTICE]   (250335) : path to executable is /usr/sbin/haproxy
Nov 22 08:39:11 compute-0 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[250331]: [WARNING]  (250335) : Exiting Master process...
Nov 22 08:39:11 compute-0 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[250331]: [WARNING]  (250335) : Exiting Master process...
Nov 22 08:39:11 compute-0 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[250331]: [ALERT]    (250335) : Current worker (250337) exited with code 143 (Terminated)
Nov 22 08:39:11 compute-0 neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8[250331]: [WARNING]  (250335) : All workers exited. Exiting... (0)
Nov 22 08:39:11 compute-0 systemd[1]: libpod-f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7.scope: Deactivated successfully.
Nov 22 08:39:11 compute-0 podman[250486]: 2025-11-22 08:39:11.371314446 +0000 UTC m=+0.260207451 container died f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.567 186548 DEBUG nova.compute.manager [req-3ee5d05c-0211-4db1-b167-707cf77105e8 req-778d463b-4a35-4c7e-b64e-3ba9abf84938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received event network-vif-unplugged-47c88170-7066-4279-a4b5-426679ae639e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.567 186548 DEBUG oslo_concurrency.lockutils [req-3ee5d05c-0211-4db1-b167-707cf77105e8 req-778d463b-4a35-4c7e-b64e-3ba9abf84938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.568 186548 DEBUG oslo_concurrency.lockutils [req-3ee5d05c-0211-4db1-b167-707cf77105e8 req-778d463b-4a35-4c7e-b64e-3ba9abf84938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.568 186548 DEBUG oslo_concurrency.lockutils [req-3ee5d05c-0211-4db1-b167-707cf77105e8 req-778d463b-4a35-4c7e-b64e-3ba9abf84938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.568 186548 DEBUG nova.compute.manager [req-3ee5d05c-0211-4db1-b167-707cf77105e8 req-778d463b-4a35-4c7e-b64e-3ba9abf84938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] No waiting events found dispatching network-vif-unplugged-47c88170-7066-4279-a4b5-426679ae639e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.568 186548 DEBUG nova.compute.manager [req-3ee5d05c-0211-4db1-b167-707cf77105e8 req-778d463b-4a35-4c7e-b64e-3ba9abf84938 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received event network-vif-unplugged-47c88170-7066-4279-a4b5-426679ae639e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.569 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-9e50d907093cc2ca434c2659f83dd121200f330984075217940a22ec25e15590-merged.mount: Deactivated successfully.
Nov 22 08:39:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7-userdata-shm.mount: Deactivated successfully.
Nov 22 08:39:11 compute-0 podman[250486]: 2025-11-22 08:39:11.786887491 +0000 UTC m=+0.675780486 container cleanup f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:39:11 compute-0 systemd[1]: libpod-conmon-f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7.scope: Deactivated successfully.
Nov 22 08:39:11 compute-0 podman[250517]: 2025-11-22 08:39:11.864346972 +0000 UTC m=+0.054667940 container remove f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.869 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2c21c383-3834-4209-8c2b-829a9c38926a]: (4, ('Sat Nov 22 08:39:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 (f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7)\nf8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7\nSat Nov 22 08:39:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 (f8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7)\nf8a13ef1e2b26274ae32ca7d0a6a4217eb6209e74aa68ff497dee667e0c609a7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.871 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e3993a-9f35-47de-9f00-01c1f602610f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.872 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a869b57-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:39:11 compute-0 kernel: tap8a869b57-a0: left promiscuous mode
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.873 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:11 compute-0 nova_compute[186544]: 2025-11-22 08:39:11.885 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.887 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd675d0-8606-4d3e-9155-5d2cb0441fc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.904 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[03d39ff3-6d7d-4ee3-8f13-67e78ea10890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.906 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3eaea82a-84bd-4715-978b-24827de6c5d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.921 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d425cb7a-4b89-4cdc-b721-824fcf15ebde]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 746894, 'reachable_time': 26918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250533, 'error': None, 'target': 'ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.923 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8a869b57-a476-415c-ba30-ce607e13fca8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:39:11 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:11.924 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[19ab6be2-14d7-4fc4-80c0-f6e34bd387b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:39:11 compute-0 systemd[1]: run-netns-ovnmeta\x2d8a869b57\x2da476\x2d415c\x2dba30\x2dce607e13fca8.mount: Deactivated successfully.
Nov 22 08:39:13 compute-0 podman[250534]: 2025-11-22 08:39:13.407791026 +0000 UTC m=+0.057451078 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:39:13 compute-0 podman[250535]: 2025-11-22 08:39:13.416236665 +0000 UTC m=+0.061263043 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 08:39:13 compute-0 podman[250536]: 2025-11-22 08:39:13.416243345 +0000 UTC m=+0.059253674 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:39:13 compute-0 nova_compute[186544]: 2025-11-22 08:39:13.822 186548 DEBUG nova.compute.manager [req-7f660ff5-54ac-4ade-9a4b-b683b2d717dc req-42007c07-4984-44f0-96f8-20ed89a23124 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received event network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:39:13 compute-0 nova_compute[186544]: 2025-11-22 08:39:13.823 186548 DEBUG oslo_concurrency.lockutils [req-7f660ff5-54ac-4ade-9a4b-b683b2d717dc req-42007c07-4984-44f0-96f8-20ed89a23124 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:39:13 compute-0 nova_compute[186544]: 2025-11-22 08:39:13.823 186548 DEBUG oslo_concurrency.lockutils [req-7f660ff5-54ac-4ade-9a4b-b683b2d717dc req-42007c07-4984-44f0-96f8-20ed89a23124 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:39:13 compute-0 nova_compute[186544]: 2025-11-22 08:39:13.823 186548 DEBUG oslo_concurrency.lockutils [req-7f660ff5-54ac-4ade-9a4b-b683b2d717dc req-42007c07-4984-44f0-96f8-20ed89a23124 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:39:13 compute-0 nova_compute[186544]: 2025-11-22 08:39:13.823 186548 DEBUG nova.compute.manager [req-7f660ff5-54ac-4ade-9a4b-b683b2d717dc req-42007c07-4984-44f0-96f8-20ed89a23124 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] No waiting events found dispatching network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:39:13 compute-0 nova_compute[186544]: 2025-11-22 08:39:13.824 186548 WARNING nova.compute.manager [req-7f660ff5-54ac-4ade-9a4b-b683b2d717dc req-42007c07-4984-44f0-96f8-20ed89a23124 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received unexpected event network-vif-plugged-47c88170-7066-4279-a4b5-426679ae639e for instance with vm_state active and task_state deleting.
Nov 22 08:39:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:14.272 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:39:14 compute-0 nova_compute[186544]: 2025-11-22 08:39:14.272 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:14.273 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.205 186548 DEBUG nova.network.neutron [-] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.283 186548 DEBUG nova.compute.manager [req-34e14c33-d449-43e5-a8cd-d751cb82a5cb req-192d4d86-85cf-4e33-bef5-b8c4134fc87d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Received event network-vif-deleted-47c88170-7066-4279-a4b5-426679ae639e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.283 186548 INFO nova.compute.manager [req-34e14c33-d449-43e5-a8cd-d751cb82a5cb req-192d4d86-85cf-4e33-bef5-b8c4134fc87d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Neutron deleted interface 47c88170-7066-4279-a4b5-426679ae639e; detaching it from the instance and deleting it from the info cache
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.284 186548 DEBUG nova.network.neutron [req-34e14c33-d449-43e5-a8cd-d751cb82a5cb req-192d4d86-85cf-4e33-bef5-b8c4134fc87d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.453 186548 INFO nova.compute.manager [-] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Took 4.23 seconds to deallocate network for instance.
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.460 186548 DEBUG nova.compute.manager [req-34e14c33-d449-43e5-a8cd-d751cb82a5cb req-192d4d86-85cf-4e33-bef5-b8c4134fc87d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Detach interface failed, port_id=47c88170-7066-4279-a4b5-426679ae639e, reason: Instance d7f88d47-758b-4f1e-8cbc-1022ae095c46 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.534 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.535 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.633 186548 DEBUG nova.compute.provider_tree [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.647 186548 DEBUG nova.scheduler.client.report [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.732 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.914 186548 INFO nova.scheduler.client.report [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance d7f88d47-758b-4f1e-8cbc-1022ae095c46
Nov 22 08:39:15 compute-0 nova_compute[186544]: 2025-11-22 08:39:15.974 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:16 compute-0 nova_compute[186544]: 2025-11-22 08:39:16.066 186548 DEBUG oslo_concurrency.lockutils [None req-6ef645ad-07a4-4148-884e-7bd266eb1081 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "d7f88d47-758b-4f1e-8cbc-1022ae095c46" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.396s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:39:16 compute-0 nova_compute[186544]: 2025-11-22 08:39:16.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:19 compute-0 nova_compute[186544]: 2025-11-22 08:39:19.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:19 compute-0 nova_compute[186544]: 2025-11-22 08:39:19.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:39:20 compute-0 nova_compute[186544]: 2025-11-22 08:39:20.977 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:21 compute-0 nova_compute[186544]: 2025-11-22 08:39:21.568 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:22 compute-0 nova_compute[186544]: 2025-11-22 08:39:22.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:22 compute-0 nova_compute[186544]: 2025-11-22 08:39:22.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:39:22 compute-0 nova_compute[186544]: 2025-11-22 08:39:22.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:39:22 compute-0 nova_compute[186544]: 2025-11-22 08:39:22.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.186 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.376 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.378 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.13214874267578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.378 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.379 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.430 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.430 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.451 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.464 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.755 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:39:23 compute-0 nova_compute[186544]: 2025-11-22 08:39:23.755 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:39:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:24.275 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:39:25 compute-0 nova_compute[186544]: 2025-11-22 08:39:25.757 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:25 compute-0 nova_compute[186544]: 2025-11-22 08:39:25.948 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800750.9471464, d7f88d47-758b-4f1e-8cbc-1022ae095c46 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:39:25 compute-0 nova_compute[186544]: 2025-11-22 08:39:25.949 186548 INFO nova.compute.manager [-] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] VM Stopped (Lifecycle Event)
Nov 22 08:39:25 compute-0 nova_compute[186544]: 2025-11-22 08:39:25.962 186548 DEBUG nova.compute.manager [None req-fdf1b427-e2ae-4e05-abac-d5de19bf16c4 - - - - - -] [instance: d7f88d47-758b-4f1e-8cbc-1022ae095c46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:39:25 compute-0 nova_compute[186544]: 2025-11-22 08:39:25.980 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:26 compute-0 nova_compute[186544]: 2025-11-22 08:39:26.570 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:30 compute-0 nova_compute[186544]: 2025-11-22 08:39:30.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:30 compute-0 nova_compute[186544]: 2025-11-22 08:39:30.982 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:31 compute-0 nova_compute[186544]: 2025-11-22 08:39:31.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:31 compute-0 nova_compute[186544]: 2025-11-22 08:39:31.572 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:33 compute-0 nova_compute[186544]: 2025-11-22 08:39:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:33 compute-0 podman[250600]: 2025-11-22 08:39:33.411231266 +0000 UTC m=+0.053390368 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:39:33 compute-0 podman[250599]: 2025-11-22 08:39:33.416056336 +0000 UTC m=+0.063284053 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:39:33 compute-0 podman[250601]: 2025-11-22 08:39:33.440422736 +0000 UTC m=+0.080743152 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:39:33 compute-0 podman[250602]: 2025-11-22 08:39:33.451032679 +0000 UTC m=+0.087728066 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 22 08:39:35 compute-0 nova_compute[186544]: 2025-11-22 08:39:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:35 compute-0 nova_compute[186544]: 2025-11-22 08:39:35.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:35 compute-0 nova_compute[186544]: 2025-11-22 08:39:35.985 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:36 compute-0 nova_compute[186544]: 2025-11-22 08:39:36.574 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:37.369 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:39:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:37.370 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:39:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:37.370 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:39:38 compute-0 nova_compute[186544]: 2025-11-22 08:39:38.833 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:38 compute-0 nova_compute[186544]: 2025-11-22 08:39:38.901 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:40 compute-0 nova_compute[186544]: 2025-11-22 08:39:40.990 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:41 compute-0 nova_compute[186544]: 2025-11-22 08:39:41.575 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:44 compute-0 podman[250686]: 2025-11-22 08:39:44.405409176 +0000 UTC m=+0.052046827 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:39:44 compute-0 podman[250685]: 2025-11-22 08:39:44.412215904 +0000 UTC m=+0.057758307 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Nov 22 08:39:44 compute-0 podman[250684]: 2025-11-22 08:39:44.429371837 +0000 UTC m=+0.081920213 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:39:45 compute-0 nova_compute[186544]: 2025-11-22 08:39:45.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:46 compute-0 nova_compute[186544]: 2025-11-22 08:39:46.577 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:49 compute-0 nova_compute[186544]: 2025-11-22 08:39:49.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:50 compute-0 nova_compute[186544]: 2025-11-22 08:39:50.998 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:51 compute-0 nova_compute[186544]: 2025-11-22 08:39:51.579 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:53.167 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:39:53 compute-0 nova_compute[186544]: 2025-11-22 08:39:53.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:53 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:39:53.168 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:39:56 compute-0 nova_compute[186544]: 2025-11-22 08:39:56.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:39:56 compute-0 nova_compute[186544]: 2025-11-22 08:39:56.183 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:39:56 compute-0 nova_compute[186544]: 2025-11-22 08:39:56.184 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:39:56 compute-0 nova_compute[186544]: 2025-11-22 08:39:56.222 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:39:56 compute-0 nova_compute[186544]: 2025-11-22 08:39:56.584 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:40:00.171 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:40:01 compute-0 nova_compute[186544]: 2025-11-22 08:40:01.007 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:01 compute-0 nova_compute[186544]: 2025-11-22 08:40:01.587 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:02 compute-0 nova_compute[186544]: 2025-11-22 08:40:02.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:02 compute-0 nova_compute[186544]: 2025-11-22 08:40:02.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:40:04 compute-0 podman[250748]: 2025-11-22 08:40:04.408105147 +0000 UTC m=+0.049185734 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:40:04 compute-0 podman[250746]: 2025-11-22 08:40:04.408165859 +0000 UTC m=+0.056821594 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 08:40:04 compute-0 podman[250747]: 2025-11-22 08:40:04.408317873 +0000 UTC m=+0.052922147 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:40:04 compute-0 podman[250749]: 2025-11-22 08:40:04.438524198 +0000 UTC m=+0.076904308 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:40:06 compute-0 nova_compute[186544]: 2025-11-22 08:40:06.009 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:06 compute-0 nova_compute[186544]: 2025-11-22 08:40:06.589 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:11 compute-0 nova_compute[186544]: 2025-11-22 08:40:11.013 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:11 compute-0 nova_compute[186544]: 2025-11-22 08:40:11.591 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:15 compute-0 podman[250833]: 2025-11-22 08:40:15.419476771 +0000 UTC m=+0.064063671 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:40:15 compute-0 podman[250835]: 2025-11-22 08:40:15.424388992 +0000 UTC m=+0.060756240 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:40:15 compute-0 podman[250834]: 2025-11-22 08:40:15.426540015 +0000 UTC m=+0.067777823 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc.)
Nov 22 08:40:16 compute-0 nova_compute[186544]: 2025-11-22 08:40:16.017 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:16 compute-0 nova_compute[186544]: 2025-11-22 08:40:16.593 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:20 compute-0 nova_compute[186544]: 2025-11-22 08:40:20.183 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:20 compute-0 nova_compute[186544]: 2025-11-22 08:40:20.184 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:40:21 compute-0 nova_compute[186544]: 2025-11-22 08:40:21.021 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:21 compute-0 nova_compute[186544]: 2025-11-22 08:40:21.596 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:22 compute-0 nova_compute[186544]: 2025-11-22 08:40:22.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:22 compute-0 nova_compute[186544]: 2025-11-22 08:40:22.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:40:22 compute-0 nova_compute[186544]: 2025-11-22 08:40:22.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:40:22 compute-0 nova_compute[186544]: 2025-11-22 08:40:22.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:40:23 compute-0 ovn_controller[94843]: 2025-11-22T08:40:23Z|00841|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.190 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.431 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.433 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5712MB free_disk=73.1325569152832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.433 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.433 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.504 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.505 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.530 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.551 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.553 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:40:24 compute-0 nova_compute[186544]: 2025-11-22 08:40:24.553 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:40:25 compute-0 nova_compute[186544]: 2025-11-22 08:40:25.555 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:26 compute-0 nova_compute[186544]: 2025-11-22 08:40:26.026 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:26 compute-0 nova_compute[186544]: 2025-11-22 08:40:26.598 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:31 compute-0 nova_compute[186544]: 2025-11-22 08:40:31.029 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:31 compute-0 nova_compute[186544]: 2025-11-22 08:40:31.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:31 compute-0 nova_compute[186544]: 2025-11-22 08:40:31.599 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:32 compute-0 nova_compute[186544]: 2025-11-22 08:40:32.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:33 compute-0 nova_compute[186544]: 2025-11-22 08:40:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:35 compute-0 nova_compute[186544]: 2025-11-22 08:40:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:35 compute-0 nova_compute[186544]: 2025-11-22 08:40:35.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:35 compute-0 podman[250894]: 2025-11-22 08:40:35.424060891 +0000 UTC m=+0.065873707 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118)
Nov 22 08:40:35 compute-0 podman[250896]: 2025-11-22 08:40:35.42848284 +0000 UTC m=+0.060648408 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:40:35 compute-0 podman[250895]: 2025-11-22 08:40:35.429603237 +0000 UTC m=+0.064214665 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:40:35 compute-0 podman[250901]: 2025-11-22 08:40:35.487344362 +0000 UTC m=+0.112878847 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:40:36 compute-0 nova_compute[186544]: 2025-11-22 08:40:36.032 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:36 compute-0 nova_compute[186544]: 2025-11-22 08:40:36.600 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:40:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:40:37.370 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:40:37.371 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:40:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:40:37.371 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:40:40 compute-0 nova_compute[186544]: 2025-11-22 08:40:40.245 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:40:40.246 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:40:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:40:40.247 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:40:41 compute-0 nova_compute[186544]: 2025-11-22 08:40:41.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:41 compute-0 nova_compute[186544]: 2025-11-22 08:40:41.602 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:46 compute-0 nova_compute[186544]: 2025-11-22 08:40:46.040 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:46 compute-0 podman[250983]: 2025-11-22 08:40:46.415252885 +0000 UTC m=+0.061185161 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:40:46 compute-0 podman[250985]: 2025-11-22 08:40:46.421853778 +0000 UTC m=+0.056782972 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:40:46 compute-0 podman[250984]: 2025-11-22 08:40:46.429131498 +0000 UTC m=+0.069384504 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, 
com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:40:46 compute-0 nova_compute[186544]: 2025-11-22 08:40:46.603 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:47 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:40:47.249 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:40:50 compute-0 nova_compute[186544]: 2025-11-22 08:40:50.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:40:51 compute-0 nova_compute[186544]: 2025-11-22 08:40:51.044 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:51 compute-0 nova_compute[186544]: 2025-11-22 08:40:51.606 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:56 compute-0 nova_compute[186544]: 2025-11-22 08:40:56.046 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:40:56 compute-0 nova_compute[186544]: 2025-11-22 08:40:56.608 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:01 compute-0 nova_compute[186544]: 2025-11-22 08:41:01.049 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:01 compute-0 nova_compute[186544]: 2025-11-22 08:41:01.609 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:06 compute-0 nova_compute[186544]: 2025-11-22 08:41:06.052 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:06 compute-0 podman[251050]: 2025-11-22 08:41:06.411099387 +0000 UTC m=+0.055988162 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:41:06 compute-0 podman[251051]: 2025-11-22 08:41:06.412900212 +0000 UTC m=+0.055435059 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:41:06 compute-0 podman[251049]: 2025-11-22 08:41:06.414364368 +0000 UTC m=+0.061171360 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:41:06 compute-0 podman[251052]: 2025-11-22 08:41:06.451129635 +0000 UTC m=+0.087042568 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:41:06 compute-0 nova_compute[186544]: 2025-11-22 08:41:06.610 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:07 compute-0 nova_compute[186544]: 2025-11-22 08:41:07.901 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:07 compute-0 nova_compute[186544]: 2025-11-22 08:41:07.902 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:07 compute-0 nova_compute[186544]: 2025-11-22 08:41:07.958 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.139 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.139 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.153 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.153 186548 INFO nova.compute.claims [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.522 186548 DEBUG nova.compute.provider_tree [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.535 186548 DEBUG nova.scheduler.client.report [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.611 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.613 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.728 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.728 186548 DEBUG nova.network.neutron [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.755 186548 INFO nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.776 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.890 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.891 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.892 186548 INFO nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Creating image(s)
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.892 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.893 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.893 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.905 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.961 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.963 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.964 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:08 compute-0 nova_compute[186544]: 2025-11-22 08:41:08.978 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.036 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.037 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.066 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.067 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.068 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.125 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.126 186548 DEBUG nova.virt.disk.api [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.127 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.189 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.190 186548 DEBUG nova.virt.disk.api [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.190 186548 DEBUG nova.objects.instance [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid ef1b211e-ed2b-408e-9855-6fd78d883d8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.243 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.243 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Ensure instance console log exists: /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.244 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.244 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.245 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:09 compute-0 nova_compute[186544]: 2025-11-22 08:41:09.376 186548 DEBUG nova.policy [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:41:11 compute-0 nova_compute[186544]: 2025-11-22 08:41:11.056 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:11 compute-0 nova_compute[186544]: 2025-11-22 08:41:11.610 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:12 compute-0 nova_compute[186544]: 2025-11-22 08:41:12.357 186548 DEBUG nova.network.neutron [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Successfully created port: 5f079e8d-4296-402c-b85e-0ed6435f72ec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:41:15 compute-0 nova_compute[186544]: 2025-11-22 08:41:15.272 186548 DEBUG nova.network.neutron [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Successfully updated port: 5f079e8d-4296-402c-b85e-0ed6435f72ec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:41:15 compute-0 nova_compute[186544]: 2025-11-22 08:41:15.310 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:41:15 compute-0 nova_compute[186544]: 2025-11-22 08:41:15.311 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:41:15 compute-0 nova_compute[186544]: 2025-11-22 08:41:15.312 186548 DEBUG nova.network.neutron [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:41:15 compute-0 nova_compute[186544]: 2025-11-22 08:41:15.496 186548 DEBUG nova.compute.manager [req-dfb1af8d-8642-40b0-aef9-ac27e82631a3 req-24c48374-5c49-4711-87da-424a32faba2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received event network-changed-5f079e8d-4296-402c-b85e-0ed6435f72ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:41:15 compute-0 nova_compute[186544]: 2025-11-22 08:41:15.497 186548 DEBUG nova.compute.manager [req-dfb1af8d-8642-40b0-aef9-ac27e82631a3 req-24c48374-5c49-4711-87da-424a32faba2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Refreshing instance network info cache due to event network-changed-5f079e8d-4296-402c-b85e-0ed6435f72ec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:41:15 compute-0 nova_compute[186544]: 2025-11-22 08:41:15.497 186548 DEBUG oslo_concurrency.lockutils [req-dfb1af8d-8642-40b0-aef9-ac27e82631a3 req-24c48374-5c49-4711-87da-424a32faba2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:41:16 compute-0 nova_compute[186544]: 2025-11-22 08:41:16.059 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:16 compute-0 nova_compute[186544]: 2025-11-22 08:41:16.359 186548 DEBUG nova.network.neutron [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:41:16 compute-0 nova_compute[186544]: 2025-11-22 08:41:16.612 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:17 compute-0 podman[251154]: 2025-11-22 08:41:17.409063571 +0000 UTC m=+0.052387093 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 08:41:17 compute-0 podman[251155]: 2025-11-22 08:41:17.414322531 +0000 UTC m=+0.054515126 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:41:17 compute-0 podman[251153]: 2025-11-22 08:41:17.425289692 +0000 UTC m=+0.073696010 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.357 186548 DEBUG nova.network.neutron [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Updating instance_info_cache with network_info: [{"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.419 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.419 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Instance network_info: |[{"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.419 186548 DEBUG oslo_concurrency.lockutils [req-dfb1af8d-8642-40b0-aef9-ac27e82631a3 req-24c48374-5c49-4711-87da-424a32faba2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.420 186548 DEBUG nova.network.neutron [req-dfb1af8d-8642-40b0-aef9-ac27e82631a3 req-24c48374-5c49-4711-87da-424a32faba2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Refreshing network info cache for port 5f079e8d-4296-402c-b85e-0ed6435f72ec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.422 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Start _get_guest_xml network_info=[{"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.427 186548 WARNING nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.441 186548 DEBUG nova.virt.libvirt.host [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.442 186548 DEBUG nova.virt.libvirt.host [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.447 186548 DEBUG nova.virt.libvirt.host [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.448 186548 DEBUG nova.virt.libvirt.host [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.449 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.449 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.450 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.450 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.450 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.450 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.451 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.451 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.451 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.451 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.451 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.452 186548 DEBUG nova.virt.hardware [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.455 186548 DEBUG nova.virt.libvirt.vif [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:41:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1691977367',display_name='tempest-TestNetworkBasicOps-server-1691977367',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1691977367',id=172,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDKNpzsG1Q103jCUovNZ6lZtbmUMlhqfx7jyAfB5//Deh5bgYdssEW1JTiM/GKS/xaHPWQqeN0/2XjEbR3xYPKpNXOT8+3wHkA2J+IEkQZTjE/GU5Hdc2pANDTG8O2dww==',key_name='tempest-TestNetworkBasicOps-1130046090',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-jrtky288',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:41:08Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=ef1b211e-ed2b-408e-9855-6fd78d883d8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.455 186548 DEBUG nova.network.os_vif_util [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.456 186548 DEBUG nova.network.os_vif_util [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:0b:14,bridge_name='br-int',has_traffic_filtering=True,id=5f079e8d-4296-402c-b85e-0ed6435f72ec,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f079e8d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.457 186548 DEBUG nova.objects.instance [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid ef1b211e-ed2b-408e-9855-6fd78d883d8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.470 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <uuid>ef1b211e-ed2b-408e-9855-6fd78d883d8c</uuid>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <name>instance-000000ac</name>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkBasicOps-server-1691977367</nova:name>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:41:19</nova:creationTime>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:41:19 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:41:19 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:41:19 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:41:19 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:41:19 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:41:19 compute-0 nova_compute[186544]:         <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:41:19 compute-0 nova_compute[186544]:         <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:41:19 compute-0 nova_compute[186544]:         <nova:port uuid="5f079e8d-4296-402c-b85e-0ed6435f72ec">
Nov 22 08:41:19 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <system>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <entry name="serial">ef1b211e-ed2b-408e-9855-6fd78d883d8c</entry>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <entry name="uuid">ef1b211e-ed2b-408e-9855-6fd78d883d8c</entry>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </system>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <os>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   </os>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <features>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   </features>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk.config"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:c9:0b:14"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <target dev="tap5f079e8d-42"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/console.log" append="off"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <video>
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </video>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:41:19 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:41:19 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:41:19 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:41:19 compute-0 nova_compute[186544]: </domain>
Nov 22 08:41:19 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.471 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Preparing to wait for external event network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.471 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.471 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.472 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.472 186548 DEBUG nova.virt.libvirt.vif [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:41:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1691977367',display_name='tempest-TestNetworkBasicOps-server-1691977367',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1691977367',id=172,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDKNpzsG1Q103jCUovNZ6lZtbmUMlhqfx7jyAfB5//Deh5bgYdssEW1JTiM/GKS/xaHPWQqeN0/2XjEbR3xYPKpNXOT8+3wHkA2J+IEkQZTjE/GU5Hdc2pANDTG8O2dww==',key_name='tempest-TestNetworkBasicOps-1130046090',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-jrtky288',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:41:08Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=ef1b211e-ed2b-408e-9855-6fd78d883d8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.472 186548 DEBUG nova.network.os_vif_util [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.473 186548 DEBUG nova.network.os_vif_util [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:0b:14,bridge_name='br-int',has_traffic_filtering=True,id=5f079e8d-4296-402c-b85e-0ed6435f72ec,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f079e8d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.473 186548 DEBUG os_vif [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:0b:14,bridge_name='br-int',has_traffic_filtering=True,id=5f079e8d-4296-402c-b85e-0ed6435f72ec,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f079e8d-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.474 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.474 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.477 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.477 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f079e8d-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.478 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5f079e8d-42, col_values=(('external_ids', {'iface-id': '5f079e8d-4296-402c-b85e-0ed6435f72ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:0b:14', 'vm-uuid': 'ef1b211e-ed2b-408e-9855-6fd78d883d8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.479 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:19 compute-0 NetworkManager[55036]: <info>  [1763800879.4801] manager: (tap5f079e8d-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/394)
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.481 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.485 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.486 186548 INFO os_vif [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:0b:14,bridge_name='br-int',has_traffic_filtering=True,id=5f079e8d-4296-402c-b85e-0ed6435f72ec,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f079e8d-42')
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.550 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.551 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.551 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:c9:0b:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:41:19 compute-0 nova_compute[186544]: 2025-11-22 08:41:19.551 186548 INFO nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Using config drive
Nov 22 08:41:20 compute-0 nova_compute[186544]: 2025-11-22 08:41:20.591 186548 INFO nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Creating config drive at /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk.config
Nov 22 08:41:20 compute-0 nova_compute[186544]: 2025-11-22 08:41:20.596 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgg_i97_h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:41:20 compute-0 nova_compute[186544]: 2025-11-22 08:41:20.720 186548 DEBUG oslo_concurrency.processutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgg_i97_h" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:41:20 compute-0 kernel: tap5f079e8d-42: entered promiscuous mode
Nov 22 08:41:20 compute-0 NetworkManager[55036]: <info>  [1763800880.7807] manager: (tap5f079e8d-42): new Tun device (/org/freedesktop/NetworkManager/Devices/395)
Nov 22 08:41:20 compute-0 nova_compute[186544]: 2025-11-22 08:41:20.782 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:20 compute-0 ovn_controller[94843]: 2025-11-22T08:41:20Z|00842|binding|INFO|Claiming lport 5f079e8d-4296-402c-b85e-0ed6435f72ec for this chassis.
Nov 22 08:41:20 compute-0 ovn_controller[94843]: 2025-11-22T08:41:20Z|00843|binding|INFO|5f079e8d-4296-402c-b85e-0ed6435f72ec: Claiming fa:16:3e:c9:0b:14 10.100.0.29
Nov 22 08:41:20 compute-0 systemd-udevd[251234]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:41:20 compute-0 NetworkManager[55036]: <info>  [1763800880.8210] device (tap5f079e8d-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:41:20 compute-0 NetworkManager[55036]: <info>  [1763800880.8224] device (tap5f079e8d-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:41:20 compute-0 systemd-machined[152872]: New machine qemu-94-instance-000000ac.
Nov 22 08:41:20 compute-0 nova_compute[186544]: 2025-11-22 08:41:20.830 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:20 compute-0 ovn_controller[94843]: 2025-11-22T08:41:20Z|00844|binding|INFO|Setting lport 5f079e8d-4296-402c-b85e-0ed6435f72ec ovn-installed in OVS
Nov 22 08:41:20 compute-0 nova_compute[186544]: 2025-11-22 08:41:20.832 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:20 compute-0 systemd[1]: Started Virtual Machine qemu-94-instance-000000ac.
Nov 22 08:41:20 compute-0 ovn_controller[94843]: 2025-11-22T08:41:20Z|00845|binding|INFO|Setting lport 5f079e8d-4296-402c-b85e-0ed6435f72ec up in Southbound
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.885 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:0b:14 10.100.0.29'], port_security=['fa:16:3e:c9:0b:14 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'ef1b211e-ed2b-408e-9855-6fd78d883d8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8d3a649-095f-4b94-af55-194302b82348', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '88899a54-ed01-4ce9-9624-a096a374cee4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8bf1006-f131-4804-9af5-1455421df1f6, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=5f079e8d-4296-402c-b85e-0ed6435f72ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.886 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 5f079e8d-4296-402c-b85e-0ed6435f72ec in datapath b8d3a649-095f-4b94-af55-194302b82348 bound to our chassis
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.887 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8d3a649-095f-4b94-af55-194302b82348
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.898 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6f5c35-5697-4c65-b465-943de4b763b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.899 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8d3a649-01 in ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.901 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8d3a649-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.902 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5458498-211a-47a9-9497-3f9aea279abe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.902 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2fdd02c5-1a12-4088-9077-72c38fd50c1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.916 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[93994762-eea5-4ec0-ad07-d9a641bfc8e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.930 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b65d3439-6238-4a67-be9e-301c702ed843]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.953 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[34d28854-b335-439c-9109-195d2a6c7413]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:20 compute-0 NetworkManager[55036]: <info>  [1763800880.9601] manager: (tapb8d3a649-00): new Veth device (/org/freedesktop/NetworkManager/Devices/396)
Nov 22 08:41:20 compute-0 systemd-udevd[251239]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.959 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[07254b5f-610d-4c49-8bb7-d5fb77b88e15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.990 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe50f43-956e-493d-9ff0-d9207fed9437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:20.992 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[093487c1-0958-44d2-98c7-6856368068d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:21 compute-0 NetworkManager[55036]: <info>  [1763800881.0129] device (tapb8d3a649-00): carrier: link connected
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.018 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[f0cc6a44-36d3-4648-b67d-05a7426fed98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.033 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0efc15-77b0-418b-a44e-6921aa9b1012]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8d3a649-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8d:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 252], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762165, 'reachable_time': 16285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251270, 'error': None, 'target': 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.046 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e9b32608-f175-46ca-b6f5-3256f5ade04e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:8d73'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 762165, 'tstamp': 762165}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251271, 'error': None, 'target': 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.061 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[56cab819-17e5-436d-bc44-14d4040afb7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8d3a649-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:8d:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 252], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762165, 'reachable_time': 16285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251272, 'error': None, 'target': 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.089 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b01b7a50-ceff-44bb-85a6-c6b1e19c8cf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.143 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[62a7cc85-6944-459a-9b61-469b5194e664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.145 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8d3a649-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.146 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.146 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8d3a649-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.148 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:21 compute-0 NetworkManager[55036]: <info>  [1763800881.1486] manager: (tapb8d3a649-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/397)
Nov 22 08:41:21 compute-0 kernel: tapb8d3a649-00: entered promiscuous mode
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.150 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.152 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8d3a649-00, col_values=(('external_ids', {'iface-id': '02280f12-e9a8-46b2-8ba0-7dd36f2a11f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:21 compute-0 ovn_controller[94843]: 2025-11-22T08:41:21Z|00846|binding|INFO|Releasing lport 02280f12-e9a8-46b2-8ba0-7dd36f2a11f6 from this chassis (sb_readonly=0)
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.154 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.156 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8d3a649-095f-4b94-af55-194302b82348.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8d3a649-095f-4b94-af55-194302b82348.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.157 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b1c70b-bc12-4a39-a9b9-665144fda694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.158 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-b8d3a649-095f-4b94-af55-194302b82348
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/b8d3a649-095f-4b94-af55-194302b82348.pid.haproxy
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID b8d3a649-095f-4b94-af55-194302b82348
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:41:21 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:21.158 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'env', 'PROCESS_TAG=haproxy-b8d3a649-095f-4b94-af55-194302b82348', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8d3a649-095f-4b94-af55-194302b82348.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.165 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.205 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800881.2047772, ef1b211e-ed2b-408e-9855-6fd78d883d8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.206 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] VM Started (Lifecycle Event)
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.272 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.277 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800881.2059069, ef1b211e-ed2b-408e-9855-6fd78d883d8c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.277 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] VM Paused (Lifecycle Event)
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.303 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.306 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.326 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:41:21 compute-0 podman[251311]: 2025-11-22 08:41:21.520600922 +0000 UTC m=+0.062324909 container create 7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 08:41:21 compute-0 systemd[1]: Started libpod-conmon-7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6.scope.
Nov 22 08:41:21 compute-0 podman[251311]: 2025-11-22 08:41:21.483260301 +0000 UTC m=+0.024984308 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:41:21 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:41:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4f81f41b6f03cecb2a2bf458d41e93f909334fca6d25e0ca8664aa8cf84c927/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:41:21 compute-0 podman[251311]: 2025-11-22 08:41:21.597212332 +0000 UTC m=+0.138936339 container init 7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:41:21 compute-0 podman[251311]: 2025-11-22 08:41:21.602607025 +0000 UTC m=+0.144331012 container start 7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.616 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:21 compute-0 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[251325]: [NOTICE]   (251329) : New worker (251331) forked
Nov 22 08:41:21 compute-0 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[251325]: [NOTICE]   (251329) : Loading success.
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.756 186548 DEBUG nova.compute.manager [req-f56c4562-6df4-4d42-9d7f-671780df3373 req-d06ed169-8bb0-4dd5-82d9-ec2181fae3ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received event network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.757 186548 DEBUG oslo_concurrency.lockutils [req-f56c4562-6df4-4d42-9d7f-671780df3373 req-d06ed169-8bb0-4dd5-82d9-ec2181fae3ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.757 186548 DEBUG oslo_concurrency.lockutils [req-f56c4562-6df4-4d42-9d7f-671780df3373 req-d06ed169-8bb0-4dd5-82d9-ec2181fae3ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.758 186548 DEBUG oslo_concurrency.lockutils [req-f56c4562-6df4-4d42-9d7f-671780df3373 req-d06ed169-8bb0-4dd5-82d9-ec2181fae3ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.758 186548 DEBUG nova.compute.manager [req-f56c4562-6df4-4d42-9d7f-671780df3373 req-d06ed169-8bb0-4dd5-82d9-ec2181fae3ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Processing event network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.758 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.761 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763800881.7617188, ef1b211e-ed2b-408e-9855-6fd78d883d8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.762 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] VM Resumed (Lifecycle Event)
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.763 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.770 186548 INFO nova.virt.libvirt.driver [-] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Instance spawned successfully.
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.770 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.809 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.815 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.819 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.820 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.820 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.821 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.821 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.822 186548 DEBUG nova.virt.libvirt.driver [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:41:21 compute-0 nova_compute[186544]: 2025-11-22 08:41:21.854 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.063 186548 DEBUG nova.network.neutron [req-dfb1af8d-8642-40b0-aef9-ac27e82631a3 req-24c48374-5c49-4711-87da-424a32faba2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Updated VIF entry in instance network info cache for port 5f079e8d-4296-402c-b85e-0ed6435f72ec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.064 186548 DEBUG nova.network.neutron [req-dfb1af8d-8642-40b0-aef9-ac27e82631a3 req-24c48374-5c49-4711-87da-424a32faba2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Updating instance_info_cache with network_info: [{"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.090 186548 DEBUG oslo_concurrency.lockutils [req-dfb1af8d-8642-40b0-aef9-ac27e82631a3 req-24c48374-5c49-4711-87da-424a32faba2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.128 186548 INFO nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Took 13.24 seconds to spawn the instance on the hypervisor.
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.128 186548 DEBUG nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.182 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.182 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.183 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.183 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.367 186548 INFO nova.compute.manager [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Took 14.28 seconds to build instance.
Nov 22 08:41:22 compute-0 nova_compute[186544]: 2025-11-22 08:41:22.464 186548 DEBUG oslo_concurrency.lockutils [None req-5ccd77f6-657f-47c7-9996-f15d49df4fb6 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:23 compute-0 nova_compute[186544]: 2025-11-22 08:41:23.910 186548 DEBUG nova.compute.manager [req-6921c5e2-8e3d-45cb-a326-0be227da26dd req-0f36c72b-fd98-469c-a4a2-72730c26bbb9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received event network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:41:23 compute-0 nova_compute[186544]: 2025-11-22 08:41:23.911 186548 DEBUG oslo_concurrency.lockutils [req-6921c5e2-8e3d-45cb-a326-0be227da26dd req-0f36c72b-fd98-469c-a4a2-72730c26bbb9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:23 compute-0 nova_compute[186544]: 2025-11-22 08:41:23.911 186548 DEBUG oslo_concurrency.lockutils [req-6921c5e2-8e3d-45cb-a326-0be227da26dd req-0f36c72b-fd98-469c-a4a2-72730c26bbb9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:23 compute-0 nova_compute[186544]: 2025-11-22 08:41:23.912 186548 DEBUG oslo_concurrency.lockutils [req-6921c5e2-8e3d-45cb-a326-0be227da26dd req-0f36c72b-fd98-469c-a4a2-72730c26bbb9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:23 compute-0 nova_compute[186544]: 2025-11-22 08:41:23.912 186548 DEBUG nova.compute.manager [req-6921c5e2-8e3d-45cb-a326-0be227da26dd req-0f36c72b-fd98-469c-a4a2-72730c26bbb9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] No waiting events found dispatching network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:41:23 compute-0 nova_compute[186544]: 2025-11-22 08:41:23.912 186548 WARNING nova.compute.manager [req-6921c5e2-8e3d-45cb-a326-0be227da26dd req-0f36c72b-fd98-469c-a4a2-72730c26bbb9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received unexpected event network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec for instance with vm_state active and task_state None.
Nov 22 08:41:24 compute-0 nova_compute[186544]: 2025-11-22 08:41:24.480 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.190 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.280 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.338 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.339 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.394 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.577 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.578 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5548MB free_disk=73.13124084472656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.579 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.579 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.619 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.712 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance ef1b211e-ed2b-408e-9855-6fd78d883d8c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.713 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.713 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.741 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.763 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.764 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.782 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.801 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.868 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:41:26 compute-0 nova_compute[186544]: 2025-11-22 08:41:26.898 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:41:27 compute-0 nova_compute[186544]: 2025-11-22 08:41:27.071 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:41:27 compute-0 nova_compute[186544]: 2025-11-22 08:41:27.072 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:29 compute-0 nova_compute[186544]: 2025-11-22 08:41:29.484 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:31 compute-0 nova_compute[186544]: 2025-11-22 08:41:31.619 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:33 compute-0 nova_compute[186544]: 2025-11-22 08:41:33.067 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:33 compute-0 nova_compute[186544]: 2025-11-22 08:41:33.068 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:34 compute-0 ovn_controller[94843]: 2025-11-22T08:41:34Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:0b:14 10.100.0.29
Nov 22 08:41:34 compute-0 ovn_controller[94843]: 2025-11-22T08:41:34Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:0b:14 10.100.0.29
Nov 22 08:41:34 compute-0 nova_compute[186544]: 2025-11-22 08:41:34.488 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:35 compute-0 nova_compute[186544]: 2025-11-22 08:41:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:35 compute-0 nova_compute[186544]: 2025-11-22 08:41:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:36 compute-0 nova_compute[186544]: 2025-11-22 08:41:36.621 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:37 compute-0 nova_compute[186544]: 2025-11-22 08:41:37.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:41:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:37.372 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:41:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:37.373 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:41:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:41:37.374 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:41:37 compute-0 podman[251365]: 2025-11-22 08:41:37.416339545 +0000 UTC m=+0.063596931 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 08:41:37 compute-0 podman[251366]: 2025-11-22 08:41:37.416393656 +0000 UTC m=+0.063291413 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:41:37 compute-0 podman[251367]: 2025-11-22 08:41:37.440559742 +0000 UTC m=+0.080076706 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:41:37 compute-0 podman[251368]: 2025-11-22 08:41:37.46315538 +0000 UTC m=+0.092570005 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:41:39 compute-0 nova_compute[186544]: 2025-11-22 08:41:39.491 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:41 compute-0 nova_compute[186544]: 2025-11-22 08:41:41.623 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:44 compute-0 nova_compute[186544]: 2025-11-22 08:41:44.493 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:46 compute-0 nova_compute[186544]: 2025-11-22 08:41:46.626 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:48 compute-0 podman[251450]: 2025-11-22 08:41:48.403169273 +0000 UTC m=+0.051436091 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:41:48 compute-0 podman[251451]: 2025-11-22 08:41:48.414599144 +0000 UTC m=+0.059141571 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Nov 22 08:41:48 compute-0 podman[251452]: 2025-11-22 08:41:48.420386207 +0000 UTC m=+0.058841543 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 08:41:49 compute-0 nova_compute[186544]: 2025-11-22 08:41:49.497 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:51 compute-0 nova_compute[186544]: 2025-11-22 08:41:51.627 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:54 compute-0 nova_compute[186544]: 2025-11-22 08:41:54.500 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:56 compute-0 nova_compute[186544]: 2025-11-22 08:41:56.629 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:41:59 compute-0 nova_compute[186544]: 2025-11-22 08:41:59.503 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:01 compute-0 nova_compute[186544]: 2025-11-22 08:42:01.631 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:04 compute-0 nova_compute[186544]: 2025-11-22 08:42:04.506 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:06 compute-0 nova_compute[186544]: 2025-11-22 08:42:06.635 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:08 compute-0 podman[251508]: 2025-11-22 08:42:08.423963521 +0000 UTC m=+0.057480889 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:42:08 compute-0 podman[251507]: 2025-11-22 08:42:08.425428447 +0000 UTC m=+0.065010166 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Nov 22 08:42:08 compute-0 podman[251509]: 2025-11-22 08:42:08.429068536 +0000 UTC m=+0.058375960 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:42:08 compute-0 podman[251510]: 2025-11-22 08:42:08.455243773 +0000 UTC m=+0.082949468 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:42:09 compute-0 nova_compute[186544]: 2025-11-22 08:42:09.509 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:11 compute-0 nova_compute[186544]: 2025-11-22 08:42:11.637 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:12 compute-0 ovn_controller[94843]: 2025-11-22T08:42:12Z|00847|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 22 08:42:14 compute-0 nova_compute[186544]: 2025-11-22 08:42:14.512 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:16 compute-0 nova_compute[186544]: 2025-11-22 08:42:16.640 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:19 compute-0 podman[251598]: 2025-11-22 08:42:19.397110991 +0000 UTC m=+0.050586299 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:42:19 compute-0 podman[251599]: 2025-11-22 08:42:19.407553958 +0000 UTC m=+0.056564366 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Nov 22 08:42:19 compute-0 podman[251600]: 2025-11-22 08:42:19.411943466 +0000 UTC m=+0.058970456 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 08:42:19 compute-0 nova_compute[186544]: 2025-11-22 08:42:19.515 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:21 compute-0 nova_compute[186544]: 2025-11-22 08:42:21.642 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:22 compute-0 nova_compute[186544]: 2025-11-22 08:42:22.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:22 compute-0 nova_compute[186544]: 2025-11-22 08:42:22.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:42:24 compute-0 nova_compute[186544]: 2025-11-22 08:42:24.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:24 compute-0 nova_compute[186544]: 2025-11-22 08:42:24.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:42:24 compute-0 nova_compute[186544]: 2025-11-22 08:42:24.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:42:24 compute-0 nova_compute[186544]: 2025-11-22 08:42:24.517 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:24 compute-0 nova_compute[186544]: 2025-11-22 08:42:24.557 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:42:24 compute-0 nova_compute[186544]: 2025-11-22 08:42:24.557 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:42:24 compute-0 nova_compute[186544]: 2025-11-22 08:42:24.558 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:42:24 compute-0 nova_compute[186544]: 2025-11-22 08:42:24.558 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid ef1b211e-ed2b-408e-9855-6fd78d883d8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:42:26 compute-0 nova_compute[186544]: 2025-11-22 08:42:26.646 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:27 compute-0 nova_compute[186544]: 2025-11-22 08:42:27.384 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Updating instance_info_cache with network_info: [{"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:42:27 compute-0 nova_compute[186544]: 2025-11-22 08:42:27.424 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-ef1b211e-ed2b-408e-9855-6fd78d883d8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:42:27 compute-0 nova_compute[186544]: 2025-11-22 08:42:27.425 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.188 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.188 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.189 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.189 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.275 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.329 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.330 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.387 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.509 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.511 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5545MB free_disk=73.10324478149414GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.511 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.511 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.720 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance ef1b211e-ed2b-408e-9855-6fd78d883d8c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.721 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.721 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.952 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.971 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.974 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:42:28 compute-0 nova_compute[186544]: 2025-11-22 08:42:28.974 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.463s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:42:29 compute-0 nova_compute[186544]: 2025-11-22 08:42:29.519 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:31 compute-0 nova_compute[186544]: 2025-11-22 08:42:31.652 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.563 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.563 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.563 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.564 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.564 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.572 186548 INFO nova.compute.manager [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Terminating instance
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.578 186548 DEBUG nova.compute.manager [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:42:32 compute-0 kernel: tap5f079e8d-42 (unregistering): left promiscuous mode
Nov 22 08:42:32 compute-0 NetworkManager[55036]: <info>  [1763800952.6095] device (tap5f079e8d-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.616 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 ovn_controller[94843]: 2025-11-22T08:42:32Z|00848|binding|INFO|Releasing lport 5f079e8d-4296-402c-b85e-0ed6435f72ec from this chassis (sb_readonly=0)
Nov 22 08:42:32 compute-0 ovn_controller[94843]: 2025-11-22T08:42:32Z|00849|binding|INFO|Setting lport 5f079e8d-4296-402c-b85e-0ed6435f72ec down in Southbound
Nov 22 08:42:32 compute-0 ovn_controller[94843]: 2025-11-22T08:42:32Z|00850|binding|INFO|Removing iface tap5f079e8d-42 ovn-installed in OVS
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.619 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.635 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000ac.scope: Deactivated successfully.
Nov 22 08:42:32 compute-0 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000ac.scope: Consumed 15.945s CPU time.
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.654 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:0b:14 10.100.0.29'], port_security=['fa:16:3e:c9:0b:14 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'ef1b211e-ed2b-408e-9855-6fd78d883d8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8d3a649-095f-4b94-af55-194302b82348', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': '88899a54-ed01-4ce9-9624-a096a374cee4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8bf1006-f131-4804-9af5-1455421df1f6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=5f079e8d-4296-402c-b85e-0ed6435f72ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.656 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 5f079e8d-4296-402c-b85e-0ed6435f72ec in datapath b8d3a649-095f-4b94-af55-194302b82348 unbound from our chassis
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.657 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8d3a649-095f-4b94-af55-194302b82348, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:42:32 compute-0 systemd-machined[152872]: Machine qemu-94-instance-000000ac terminated.
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.658 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c95f5cc3-c1cd-4486-bd31-85da8988fc43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.659 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 namespace which is not needed anymore
Nov 22 08:42:32 compute-0 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[251325]: [NOTICE]   (251329) : haproxy version is 2.8.14-c23fe91
Nov 22 08:42:32 compute-0 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[251325]: [NOTICE]   (251329) : path to executable is /usr/sbin/haproxy
Nov 22 08:42:32 compute-0 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[251325]: [WARNING]  (251329) : Exiting Master process...
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.802 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[251325]: [ALERT]    (251329) : Current worker (251331) exited with code 143 (Terminated)
Nov 22 08:42:32 compute-0 neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348[251325]: [WARNING]  (251329) : All workers exited. Exiting... (0)
Nov 22 08:42:32 compute-0 systemd[1]: libpod-7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6.scope: Deactivated successfully.
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.806 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 podman[251690]: 2025-11-22 08:42:32.813711672 +0000 UTC m=+0.050303472 container died 7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 08:42:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6-userdata-shm.mount: Deactivated successfully.
Nov 22 08:42:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-a4f81f41b6f03cecb2a2bf458d41e93f909334fca6d25e0ca8664aa8cf84c927-merged.mount: Deactivated successfully.
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.857 186548 INFO nova.virt.libvirt.driver [-] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Instance destroyed successfully.
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.858 186548 DEBUG nova.objects.instance [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid ef1b211e-ed2b-408e-9855-6fd78d883d8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:42:32 compute-0 podman[251690]: 2025-11-22 08:42:32.858930978 +0000 UTC m=+0.095522778 container cleanup 7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:42:32 compute-0 systemd[1]: libpod-conmon-7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6.scope: Deactivated successfully.
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.870 186548 DEBUG nova.virt.libvirt.vif [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:41:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1691977367',display_name='tempest-TestNetworkBasicOps-server-1691977367',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1691977367',id=172,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEDKNpzsG1Q103jCUovNZ6lZtbmUMlhqfx7jyAfB5//Deh5bgYdssEW1JTiM/GKS/xaHPWQqeN0/2XjEbR3xYPKpNXOT8+3wHkA2J+IEkQZTjE/GU5Hdc2pANDTG8O2dww==',key_name='tempest-TestNetworkBasicOps-1130046090',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:41:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-jrtky288',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:41:22Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=ef1b211e-ed2b-408e-9855-6fd78d883d8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.870 186548 DEBUG nova.network.os_vif_util [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "address": "fa:16:3e:c9:0b:14", "network": {"id": "b8d3a649-095f-4b94-af55-194302b82348", "bridge": "br-int", "label": "tempest-network-smoke--1450792415", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f079e8d-42", "ovs_interfaceid": "5f079e8d-4296-402c-b85e-0ed6435f72ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.871 186548 DEBUG nova.network.os_vif_util [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:0b:14,bridge_name='br-int',has_traffic_filtering=True,id=5f079e8d-4296-402c-b85e-0ed6435f72ec,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f079e8d-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.871 186548 DEBUG os_vif [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:0b:14,bridge_name='br-int',has_traffic_filtering=True,id=5f079e8d-4296-402c-b85e-0ed6435f72ec,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f079e8d-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.874 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.875 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f079e8d-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.877 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.879 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.881 186548 INFO os_vif [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:0b:14,bridge_name='br-int',has_traffic_filtering=True,id=5f079e8d-4296-402c-b85e-0ed6435f72ec,network=Network(b8d3a649-095f-4b94-af55-194302b82348),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f079e8d-42')
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.882 186548 INFO nova.virt.libvirt.driver [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Deleting instance files /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c_del
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.882 186548 INFO nova.virt.libvirt.driver [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Deletion of /var/lib/nova/instances/ef1b211e-ed2b-408e-9855-6fd78d883d8c_del complete
Nov 22 08:42:32 compute-0 podman[251735]: 2025-11-22 08:42:32.949333629 +0000 UTC m=+0.064713908 container remove 7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.955 186548 INFO nova.compute.manager [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.955 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ca38b0-c560-4a83-a12a-8ef984071c47]: (4, ('Sat Nov 22 08:42:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 (7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6)\n7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6\nSat Nov 22 08:42:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 (7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6)\n7e70b01b636d6e4a82ec06ca08dcb84f92e813e2e5d19f76058cfe7e8cbfd5d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.956 186548 DEBUG oslo.service.loopingcall [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.957 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f63f2ec6-2838-4362-a1a0-c7b2b188b987]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.958 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8d3a649-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.958 186548 DEBUG nova.compute.manager [-] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.959 186548 DEBUG nova.network.neutron [-] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:42:32 compute-0 kernel: tapb8d3a649-00: left promiscuous mode
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.961 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.974 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[76e9ff97-d478-46a4-a6ad-2647b0a4547f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:42:32 compute-0 nova_compute[186544]: 2025-11-22 08:42:32.974 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.990 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[98c6ff2f-cf60-4822-a928-051e5e6e19b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:42:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:32.991 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[15829ec9-49d5-4ed8-8866-9c732c0697cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:42:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:33.005 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e651200f-e4bd-4529-9f07-b32a4e1ebee3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 762158, 'reachable_time': 31243, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251750, 'error': None, 'target': 'ovnmeta-b8d3a649-095f-4b94-af55-194302b82348', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:42:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:33.007 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8d3a649-095f-4b94-af55-194302b82348 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:42:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:33.007 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[ffab89fb-eb2b-4f41-a27d-e3e913c01f0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:42:33 compute-0 systemd[1]: run-netns-ovnmeta\x2db8d3a649\x2d095f\x2d4b94\x2daf55\x2d194302b82348.mount: Deactivated successfully.
Nov 22 08:42:33 compute-0 nova_compute[186544]: 2025-11-22 08:42:33.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:33 compute-0 nova_compute[186544]: 2025-11-22 08:42:33.865 186548 DEBUG nova.compute.manager [req-c9d191f3-744b-433b-a07f-a5a96a4ff870 req-34ccef27-c3ed-4870-a6df-e1bfc1709b3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received event network-vif-unplugged-5f079e8d-4296-402c-b85e-0ed6435f72ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:42:33 compute-0 nova_compute[186544]: 2025-11-22 08:42:33.866 186548 DEBUG oslo_concurrency.lockutils [req-c9d191f3-744b-433b-a07f-a5a96a4ff870 req-34ccef27-c3ed-4870-a6df-e1bfc1709b3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:42:33 compute-0 nova_compute[186544]: 2025-11-22 08:42:33.866 186548 DEBUG oslo_concurrency.lockutils [req-c9d191f3-744b-433b-a07f-a5a96a4ff870 req-34ccef27-c3ed-4870-a6df-e1bfc1709b3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:42:33 compute-0 nova_compute[186544]: 2025-11-22 08:42:33.867 186548 DEBUG oslo_concurrency.lockutils [req-c9d191f3-744b-433b-a07f-a5a96a4ff870 req-34ccef27-c3ed-4870-a6df-e1bfc1709b3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:42:33 compute-0 nova_compute[186544]: 2025-11-22 08:42:33.867 186548 DEBUG nova.compute.manager [req-c9d191f3-744b-433b-a07f-a5a96a4ff870 req-34ccef27-c3ed-4870-a6df-e1bfc1709b3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] No waiting events found dispatching network-vif-unplugged-5f079e8d-4296-402c-b85e-0ed6435f72ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:42:33 compute-0 nova_compute[186544]: 2025-11-22 08:42:33.867 186548 DEBUG nova.compute.manager [req-c9d191f3-744b-433b-a07f-a5a96a4ff870 req-34ccef27-c3ed-4870-a6df-e1bfc1709b3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received event network-vif-unplugged-5f079e8d-4296-402c-b85e-0ed6435f72ec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:42:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:34.714 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:42:34 compute-0 nova_compute[186544]: 2025-11-22 08:42:34.714 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:34.716 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:42:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:34.717 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:42:34 compute-0 nova_compute[186544]: 2025-11-22 08:42:34.882 186548 DEBUG nova.network.neutron [-] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.167 186548 INFO nova.compute.manager [-] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Took 2.21 seconds to deallocate network for instance.
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.279 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.279 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.326 186548 DEBUG nova.compute.manager [req-205632e6-6313-4b79-8155-6a801e0508e7 req-30ac0258-3801-4832-a29a-daf5654d0c46 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received event network-vif-deleted-5f079e8d-4296-402c-b85e-0ed6435f72ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.418 186548 DEBUG nova.compute.provider_tree [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.438 186548 DEBUG nova.scheduler.client.report [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.485 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.537 186548 INFO nova.scheduler.client.report [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance ef1b211e-ed2b-408e-9855-6fd78d883d8c
Nov 22 08:42:35 compute-0 nova_compute[186544]: 2025-11-22 08:42:35.643 186548 DEBUG oslo_concurrency.lockutils [None req-e0870a4b-eb07-4217-8441-644f533d8800 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:42:36 compute-0 nova_compute[186544]: 2025-11-22 08:42:36.004 186548 DEBUG nova.compute.manager [req-8de288e5-6f11-4835-ba58-532aeaaf3c29 req-c236e96f-bd8b-492e-99f4-68a94ccaee05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received event network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:42:36 compute-0 nova_compute[186544]: 2025-11-22 08:42:36.005 186548 DEBUG oslo_concurrency.lockutils [req-8de288e5-6f11-4835-ba58-532aeaaf3c29 req-c236e96f-bd8b-492e-99f4-68a94ccaee05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:42:36 compute-0 nova_compute[186544]: 2025-11-22 08:42:36.005 186548 DEBUG oslo_concurrency.lockutils [req-8de288e5-6f11-4835-ba58-532aeaaf3c29 req-c236e96f-bd8b-492e-99f4-68a94ccaee05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:42:36 compute-0 nova_compute[186544]: 2025-11-22 08:42:36.005 186548 DEBUG oslo_concurrency.lockutils [req-8de288e5-6f11-4835-ba58-532aeaaf3c29 req-c236e96f-bd8b-492e-99f4-68a94ccaee05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "ef1b211e-ed2b-408e-9855-6fd78d883d8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:42:36 compute-0 nova_compute[186544]: 2025-11-22 08:42:36.005 186548 DEBUG nova.compute.manager [req-8de288e5-6f11-4835-ba58-532aeaaf3c29 req-c236e96f-bd8b-492e-99f4-68a94ccaee05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] No waiting events found dispatching network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:42:36 compute-0 nova_compute[186544]: 2025-11-22 08:42:36.006 186548 WARNING nova.compute.manager [req-8de288e5-6f11-4835-ba58-532aeaaf3c29 req-c236e96f-bd8b-492e-99f4-68a94ccaee05 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Received unexpected event network-vif-plugged-5f079e8d-4296-402c-b85e-0ed6435f72ec for instance with vm_state deleted and task_state None.
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:42:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:42:36 compute-0 nova_compute[186544]: 2025-11-22 08:42:36.653 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:37 compute-0 nova_compute[186544]: 2025-11-22 08:42:37.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:37 compute-0 nova_compute[186544]: 2025-11-22 08:42:37.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:37.373 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:42:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:37.374 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:42:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:42:37.374 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:42:37 compute-0 nova_compute[186544]: 2025-11-22 08:42:37.879 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:39 compute-0 podman[251753]: 2025-11-22 08:42:39.424218556 +0000 UTC m=+0.063896777 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:42:39 compute-0 podman[251751]: 2025-11-22 08:42:39.432302156 +0000 UTC m=+0.079074552 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:42:39 compute-0 podman[251752]: 2025-11-22 08:42:39.448363813 +0000 UTC m=+0.090491505 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 22 08:42:39 compute-0 podman[251756]: 2025-11-22 08:42:39.480985987 +0000 UTC m=+0.116299290 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:42:41 compute-0 nova_compute[186544]: 2025-11-22 08:42:41.582 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:41 compute-0 nova_compute[186544]: 2025-11-22 08:42:41.653 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:42 compute-0 nova_compute[186544]: 2025-11-22 08:42:42.883 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:46 compute-0 nova_compute[186544]: 2025-11-22 08:42:46.655 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:47 compute-0 nova_compute[186544]: 2025-11-22 08:42:47.857 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763800952.8552327, ef1b211e-ed2b-408e-9855-6fd78d883d8c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:42:47 compute-0 nova_compute[186544]: 2025-11-22 08:42:47.858 186548 INFO nova.compute.manager [-] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] VM Stopped (Lifecycle Event)
Nov 22 08:42:47 compute-0 nova_compute[186544]: 2025-11-22 08:42:47.880 186548 DEBUG nova.compute.manager [None req-e18c833f-3712-4d80-9853-cdae6874e6a0 - - - - - -] [instance: ef1b211e-ed2b-408e-9855-6fd78d883d8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:42:47 compute-0 nova_compute[186544]: 2025-11-22 08:42:47.886 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:50 compute-0 podman[251837]: 2025-11-22 08:42:50.402870912 +0000 UTC m=+0.050083097 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:42:50 compute-0 podman[251839]: 2025-11-22 08:42:50.428880243 +0000 UTC m=+0.070756896 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:42:50 compute-0 podman[251838]: 2025-11-22 08:42:50.431870107 +0000 UTC m=+0.077102583 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 22 08:42:51 compute-0 nova_compute[186544]: 2025-11-22 08:42:51.160 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:42:51 compute-0 nova_compute[186544]: 2025-11-22 08:42:51.656 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:52 compute-0 nova_compute[186544]: 2025-11-22 08:42:52.889 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:56 compute-0 nova_compute[186544]: 2025-11-22 08:42:56.658 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:42:57 compute-0 nova_compute[186544]: 2025-11-22 08:42:57.893 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:01 compute-0 nova_compute[186544]: 2025-11-22 08:43:01.660 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:02 compute-0 nova_compute[186544]: 2025-11-22 08:43:02.896 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:06 compute-0 nova_compute[186544]: 2025-11-22 08:43:06.662 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:07 compute-0 nova_compute[186544]: 2025-11-22 08:43:07.899 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:10 compute-0 podman[251899]: 2025-11-22 08:43:10.407563172 +0000 UTC m=+0.058444503 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 08:43:10 compute-0 podman[251900]: 2025-11-22 08:43:10.409960402 +0000 UTC m=+0.057017678 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:43:10 compute-0 podman[251901]: 2025-11-22 08:43:10.411387027 +0000 UTC m=+0.055145531 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:43:10 compute-0 podman[251902]: 2025-11-22 08:43:10.449343124 +0000 UTC m=+0.089835608 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 08:43:11 compute-0 nova_compute[186544]: 2025-11-22 08:43:11.665 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:12 compute-0 nova_compute[186544]: 2025-11-22 08:43:12.903 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:16 compute-0 nova_compute[186544]: 2025-11-22 08:43:16.667 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:17 compute-0 nova_compute[186544]: 2025-11-22 08:43:17.906 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:21 compute-0 podman[251984]: 2025-11-22 08:43:21.410344216 +0000 UTC m=+0.058600728 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:43:21 compute-0 podman[251983]: 2025-11-22 08:43:21.410870618 +0000 UTC m=+0.061202811 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:43:21 compute-0 podman[251985]: 2025-11-22 08:43:21.426179186 +0000 UTC m=+0.067233050 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:43:21 compute-0 nova_compute[186544]: 2025-11-22 08:43:21.668 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:22 compute-0 nova_compute[186544]: 2025-11-22 08:43:22.911 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:24 compute-0 nova_compute[186544]: 2025-11-22 08:43:24.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:24 compute-0 nova_compute[186544]: 2025-11-22 08:43:24.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:43:24 compute-0 nova_compute[186544]: 2025-11-22 08:43:24.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:43:24 compute-0 nova_compute[186544]: 2025-11-22 08:43:24.185 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:43:24 compute-0 nova_compute[186544]: 2025-11-22 08:43:24.186 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:24 compute-0 nova_compute[186544]: 2025-11-22 08:43:24.186 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:43:26 compute-0 nova_compute[186544]: 2025-11-22 08:43:26.669 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:27 compute-0 nova_compute[186544]: 2025-11-22 08:43:27.914 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:29 compute-0 nova_compute[186544]: 2025-11-22 08:43:29.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.202 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.202 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.203 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.203 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.358 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.360 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5714MB free_disk=73.13214492797852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.360 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.361 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.430 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.430 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.459 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.474 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.543 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:43:30 compute-0 nova_compute[186544]: 2025-11-22 08:43:30.543 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:43:31 compute-0 nova_compute[186544]: 2025-11-22 08:43:31.672 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:32 compute-0 nova_compute[186544]: 2025-11-22 08:43:32.915 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:33 compute-0 nova_compute[186544]: 2025-11-22 08:43:33.538 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:33 compute-0 nova_compute[186544]: 2025-11-22 08:43:33.539 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:33 compute-0 ovn_controller[94843]: 2025-11-22T08:43:33Z|00851|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 22 08:43:35 compute-0 nova_compute[186544]: 2025-11-22 08:43:35.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:43:35.006 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:43:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:43:35.007 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:43:36 compute-0 nova_compute[186544]: 2025-11-22 08:43:36.674 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:43:37.008 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:43:37 compute-0 nova_compute[186544]: 2025-11-22 08:43:37.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:43:37.373 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:43:37.374 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:43:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:43:37.374 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:43:37 compute-0 nova_compute[186544]: 2025-11-22 08:43:37.918 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:38 compute-0 nova_compute[186544]: 2025-11-22 08:43:38.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:38 compute-0 nova_compute[186544]: 2025-11-22 08:43:38.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:43:41 compute-0 podman[252049]: 2025-11-22 08:43:41.414986394 +0000 UTC m=+0.061438227 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 08:43:41 compute-0 podman[252050]: 2025-11-22 08:43:41.417941596 +0000 UTC m=+0.059970830 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:43:41 compute-0 podman[252051]: 2025-11-22 08:43:41.427429511 +0000 UTC m=+0.067200279 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:43:41 compute-0 podman[252052]: 2025-11-22 08:43:41.452343296 +0000 UTC m=+0.088963177 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 08:43:41 compute-0 nova_compute[186544]: 2025-11-22 08:43:41.675 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:42 compute-0 nova_compute[186544]: 2025-11-22 08:43:42.922 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:46 compute-0 nova_compute[186544]: 2025-11-22 08:43:46.677 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:47 compute-0 nova_compute[186544]: 2025-11-22 08:43:47.924 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:51 compute-0 nova_compute[186544]: 2025-11-22 08:43:51.678 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:52 compute-0 podman[252139]: 2025-11-22 08:43:52.402647292 +0000 UTC m=+0.051393420 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:43:52 compute-0 podman[252141]: 2025-11-22 08:43:52.413995612 +0000 UTC m=+0.058745551 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:43:52 compute-0 podman[252140]: 2025-11-22 08:43:52.414425442 +0000 UTC m=+0.061221332 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Nov 22 08:43:52 compute-0 nova_compute[186544]: 2025-11-22 08:43:52.927 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:56 compute-0 nova_compute[186544]: 2025-11-22 08:43:56.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:43:57 compute-0 nova_compute[186544]: 2025-11-22 08:43:57.931 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:01 compute-0 nova_compute[186544]: 2025-11-22 08:44:01.680 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:02 compute-0 nova_compute[186544]: 2025-11-22 08:44:02.933 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:06 compute-0 nova_compute[186544]: 2025-11-22 08:44:06.682 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:07 compute-0 nova_compute[186544]: 2025-11-22 08:44:07.936 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:11 compute-0 nova_compute[186544]: 2025-11-22 08:44:11.684 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:12 compute-0 podman[252204]: 2025-11-22 08:44:12.447478164 +0000 UTC m=+0.078500888 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 22 08:44:12 compute-0 podman[252205]: 2025-11-22 08:44:12.451732179 +0000 UTC m=+0.088725590 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:44:12 compute-0 podman[252203]: 2025-11-22 08:44:12.454956778 +0000 UTC m=+0.097039155 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=edpm, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:44:12 compute-0 podman[252206]: 2025-11-22 08:44:12.48096131 +0000 UTC m=+0.104222003 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:44:12 compute-0 nova_compute[186544]: 2025-11-22 08:44:12.937 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:16 compute-0 nova_compute[186544]: 2025-11-22 08:44:16.687 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:17 compute-0 nova_compute[186544]: 2025-11-22 08:44:17.940 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:21 compute-0 nova_compute[186544]: 2025-11-22 08:44:21.688 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:22 compute-0 nova_compute[186544]: 2025-11-22 08:44:22.943 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:23 compute-0 podman[252292]: 2025-11-22 08:44:23.403535962 +0000 UTC m=+0.054213589 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:44:23 compute-0 podman[252294]: 2025-11-22 08:44:23.416579763 +0000 UTC m=+0.057110360 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:44:23 compute-0 podman[252293]: 2025-11-22 08:44:23.42496581 +0000 UTC m=+0.062415300 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:44:25 compute-0 nova_compute[186544]: 2025-11-22 08:44:25.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:25 compute-0 nova_compute[186544]: 2025-11-22 08:44:25.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:44:25 compute-0 nova_compute[186544]: 2025-11-22 08:44:25.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:44:25 compute-0 nova_compute[186544]: 2025-11-22 08:44:25.174 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:44:25 compute-0 nova_compute[186544]: 2025-11-22 08:44:25.174 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:25 compute-0 nova_compute[186544]: 2025-11-22 08:44:25.175 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:44:26 compute-0 nova_compute[186544]: 2025-11-22 08:44:26.690 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:27 compute-0 nova_compute[186544]: 2025-11-22 08:44:27.946 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:29 compute-0 nova_compute[186544]: 2025-11-22 08:44:29.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.197 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.198 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.198 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.198 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.367 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.368 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5719MB free_disk=73.13214492797852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.368 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.368 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.430 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.431 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.462 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.476 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.478 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.479 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:44:31 compute-0 nova_compute[186544]: 2025-11-22 08:44:31.692 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:32 compute-0 nova_compute[186544]: 2025-11-22 08:44:32.948 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:33 compute-0 nova_compute[186544]: 2025-11-22 08:44:33.479 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:35 compute-0 nova_compute[186544]: 2025-11-22 08:44:35.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:44:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:44:36 compute-0 nova_compute[186544]: 2025-11-22 08:44:36.694 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:44:37.375 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:44:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:44:37.375 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:44:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:44:37.375 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:44:37 compute-0 nova_compute[186544]: 2025-11-22 08:44:37.950 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:38 compute-0 nova_compute[186544]: 2025-11-22 08:44:38.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:40 compute-0 nova_compute[186544]: 2025-11-22 08:44:40.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:40 compute-0 nova_compute[186544]: 2025-11-22 08:44:40.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:41 compute-0 nova_compute[186544]: 2025-11-22 08:44:41.698 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:42 compute-0 nova_compute[186544]: 2025-11-22 08:44:42.953 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:43 compute-0 podman[252357]: 2025-11-22 08:44:43.413107064 +0000 UTC m=+0.056409192 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 08:44:43 compute-0 podman[252358]: 2025-11-22 08:44:43.422182199 +0000 UTC m=+0.061085999 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:44:43 compute-0 podman[252356]: 2025-11-22 08:44:43.428050453 +0000 UTC m=+0.072967911 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 22 08:44:43 compute-0 podman[252359]: 2025-11-22 08:44:43.461438307 +0000 UTC m=+0.097017505 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:44:46 compute-0 nova_compute[186544]: 2025-11-22 08:44:46.699 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:47 compute-0 nova_compute[186544]: 2025-11-22 08:44:47.956 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:51 compute-0 nova_compute[186544]: 2025-11-22 08:44:51.701 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:52 compute-0 nova_compute[186544]: 2025-11-22 08:44:52.959 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:53 compute-0 nova_compute[186544]: 2025-11-22 08:44:53.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:54 compute-0 podman[252441]: 2025-11-22 08:44:54.405345115 +0000 UTC m=+0.053441230 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:44:54 compute-0 podman[252442]: 2025-11-22 08:44:54.412096262 +0000 UTC m=+0.054560508 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350)
Nov 22 08:44:54 compute-0 podman[252443]: 2025-11-22 08:44:54.433181322 +0000 UTC m=+0.067704892 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:44:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:44:55.817 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:44:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:44:55.818 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:44:55 compute-0 nova_compute[186544]: 2025-11-22 08:44:55.819 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:56 compute-0 nova_compute[186544]: 2025-11-22 08:44:56.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:44:56 compute-0 nova_compute[186544]: 2025-11-22 08:44:56.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:44:56 compute-0 nova_compute[186544]: 2025-11-22 08:44:56.332 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:44:56 compute-0 nova_compute[186544]: 2025-11-22 08:44:56.703 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:57 compute-0 nova_compute[186544]: 2025-11-22 08:44:57.961 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:44:58 compute-0 nova_compute[186544]: 2025-11-22 08:44:58.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:01 compute-0 nova_compute[186544]: 2025-11-22 08:45:01.704 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:02 compute-0 nova_compute[186544]: 2025-11-22 08:45:02.964 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:03 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:03.819 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:45:06 compute-0 nova_compute[186544]: 2025-11-22 08:45:06.705 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:07 compute-0 nova_compute[186544]: 2025-11-22 08:45:07.967 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:11 compute-0 nova_compute[186544]: 2025-11-22 08:45:11.176 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:11 compute-0 nova_compute[186544]: 2025-11-22 08:45:11.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:45:11 compute-0 nova_compute[186544]: 2025-11-22 08:45:11.706 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:12 compute-0 nova_compute[186544]: 2025-11-22 08:45:12.969 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:14 compute-0 podman[252506]: 2025-11-22 08:45:14.399853554 +0000 UTC m=+0.047195716 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 08:45:14 compute-0 podman[252507]: 2025-11-22 08:45:14.414202748 +0000 UTC m=+0.060914544 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:45:14 compute-0 podman[252505]: 2025-11-22 08:45:14.429397333 +0000 UTC m=+0.083979033 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2)
Nov 22 08:45:14 compute-0 podman[252508]: 2025-11-22 08:45:14.440158279 +0000 UTC m=+0.083527583 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 22 08:45:16 compute-0 nova_compute[186544]: 2025-11-22 08:45:16.707 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:17 compute-0 nova_compute[186544]: 2025-11-22 08:45:17.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:18 compute-0 nova_compute[186544]: 2025-11-22 08:45:18.944 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:18 compute-0 nova_compute[186544]: 2025-11-22 08:45:18.944 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:18 compute-0 nova_compute[186544]: 2025-11-22 08:45:18.958 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.049 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.049 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.059 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.059 186548 INFO nova.compute.claims [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.168 186548 DEBUG nova.compute.provider_tree [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.184 186548 DEBUG nova.scheduler.client.report [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.201 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.203 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.244 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.245 186548 DEBUG nova.network.neutron [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.260 186548 INFO nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.277 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.367 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.368 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.369 186548 INFO nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Creating image(s)
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.369 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.370 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.370 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.383 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.442 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.443 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.444 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.457 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.519 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.520 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.572 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.574 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.574 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.599 186548 DEBUG nova.policy [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.636 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.637 186548 DEBUG nova.virt.disk.api [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.637 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.700 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.701 186548 DEBUG nova.virt.disk.api [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.701 186548 DEBUG nova.objects.instance [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid a2a084a1-24f5-46c7-a76c-25bfe7b4a286 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.719 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.719 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Ensure instance console log exists: /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.720 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.720 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:19 compute-0 nova_compute[186544]: 2025-11-22 08:45:19.720 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:20 compute-0 nova_compute[186544]: 2025-11-22 08:45:20.207 186548 DEBUG nova.network.neutron [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Successfully created port: a18bcf41-9655-4103-a948-8f1276716654 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.125 186548 DEBUG nova.network.neutron [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Successfully updated port: a18bcf41-9655-4103-a948-8f1276716654 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.141 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.142 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.142 186548 DEBUG nova.network.neutron [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.231 186548 DEBUG nova.compute.manager [req-968d716d-f7f0-41af-a5ff-5e5055c08d8e req-8c07b19a-aca2-4cd0-897e-85013a6e053f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-changed-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.231 186548 DEBUG nova.compute.manager [req-968d716d-f7f0-41af-a5ff-5e5055c08d8e req-8c07b19a-aca2-4cd0-897e-85013a6e053f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing instance network info cache due to event network-changed-a18bcf41-9655-4103-a948-8f1276716654. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.232 186548 DEBUG oslo_concurrency.lockutils [req-968d716d-f7f0-41af-a5ff-5e5055c08d8e req-8c07b19a-aca2-4cd0-897e-85013a6e053f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.293 186548 DEBUG nova.network.neutron [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:45:21 compute-0 nova_compute[186544]: 2025-11-22 08:45:21.709 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.074 186548 DEBUG nova.network.neutron [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.089 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.090 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Instance network_info: |[{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.090 186548 DEBUG oslo_concurrency.lockutils [req-968d716d-f7f0-41af-a5ff-5e5055c08d8e req-8c07b19a-aca2-4cd0-897e-85013a6e053f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.090 186548 DEBUG nova.network.neutron [req-968d716d-f7f0-41af-a5ff-5e5055c08d8e req-8c07b19a-aca2-4cd0-897e-85013a6e053f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing network info cache for port a18bcf41-9655-4103-a948-8f1276716654 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.093 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Start _get_guest_xml network_info=[{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.097 186548 WARNING nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.105 186548 DEBUG nova.virt.libvirt.host [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.105 186548 DEBUG nova.virt.libvirt.host [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.108 186548 DEBUG nova.virt.libvirt.host [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.109 186548 DEBUG nova.virt.libvirt.host [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.110 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.110 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.111 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.111 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.111 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.111 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.111 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.112 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.112 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.112 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.112 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.113 186548 DEBUG nova.virt.hardware [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.116 186548 DEBUG nova.virt.libvirt.vif [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:45:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2118287641',display_name='tempest-TestNetworkBasicOps-server-2118287641',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2118287641',id=176,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzR+9MOqXw8TBBpsaw/IGWRtEptID1j8riDZ0v3IyOTb6fraFLij0X6Vh1uCEygecc6geFg0ggERXTcmILgqRqG2INhfMi/D9TMR2Ak3+WIDqG4qsd4hRi6d7c9daFBIg==',key_name='tempest-TestNetworkBasicOps-1133952864',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-i7rx9jbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:45:19Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=a2a084a1-24f5-46c7-a76c-25bfe7b4a286,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.117 186548 DEBUG nova.network.os_vif_util [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.117 186548 DEBUG nova.network.os_vif_util [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:49:f4,bridge_name='br-int',has_traffic_filtering=True,id=a18bcf41-9655-4103-a948-8f1276716654,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa18bcf41-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.118 186548 DEBUG nova.objects.instance [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid a2a084a1-24f5-46c7-a76c-25bfe7b4a286 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.130 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <uuid>a2a084a1-24f5-46c7-a76c-25bfe7b4a286</uuid>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <name>instance-000000b0</name>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkBasicOps-server-2118287641</nova:name>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:45:22</nova:creationTime>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:45:22 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:45:22 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:45:22 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:45:22 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:45:22 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:45:22 compute-0 nova_compute[186544]:         <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:45:22 compute-0 nova_compute[186544]:         <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:45:22 compute-0 nova_compute[186544]:         <nova:port uuid="a18bcf41-9655-4103-a948-8f1276716654">
Nov 22 08:45:22 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <system>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <entry name="serial">a2a084a1-24f5-46c7-a76c-25bfe7b4a286</entry>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <entry name="uuid">a2a084a1-24f5-46c7-a76c-25bfe7b4a286</entry>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </system>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <os>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   </os>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <features>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   </features>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk.config"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:da:49:f4"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <target dev="tapa18bcf41-96"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/console.log" append="off"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <video>
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </video>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:45:22 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:45:22 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:45:22 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:45:22 compute-0 nova_compute[186544]: </domain>
Nov 22 08:45:22 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.131 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Preparing to wait for external event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.132 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.132 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.132 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.133 186548 DEBUG nova.virt.libvirt.vif [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:45:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2118287641',display_name='tempest-TestNetworkBasicOps-server-2118287641',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2118287641',id=176,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzR+9MOqXw8TBBpsaw/IGWRtEptID1j8riDZ0v3IyOTb6fraFLij0X6Vh1uCEygecc6geFg0ggERXTcmILgqRqG2INhfMi/D9TMR2Ak3+WIDqG4qsd4hRi6d7c9daFBIg==',key_name='tempest-TestNetworkBasicOps-1133952864',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-i7rx9jbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:45:19Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=a2a084a1-24f5-46c7-a76c-25bfe7b4a286,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.133 186548 DEBUG nova.network.os_vif_util [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.134 186548 DEBUG nova.network.os_vif_util [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:49:f4,bridge_name='br-int',has_traffic_filtering=True,id=a18bcf41-9655-4103-a948-8f1276716654,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa18bcf41-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.134 186548 DEBUG os_vif [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:49:f4,bridge_name='br-int',has_traffic_filtering=True,id=a18bcf41-9655-4103-a948-8f1276716654,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa18bcf41-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.135 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.135 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.138 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.138 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa18bcf41-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.139 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa18bcf41-96, col_values=(('external_ids', {'iface-id': 'a18bcf41-9655-4103-a948-8f1276716654', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:49:f4', 'vm-uuid': 'a2a084a1-24f5-46c7-a76c-25bfe7b4a286'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 NetworkManager[55036]: <info>  [1763801122.1414] manager: (tapa18bcf41-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/398)
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.145 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.147 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.147 186548 INFO os_vif [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:49:f4,bridge_name='br-int',has_traffic_filtering=True,id=a18bcf41-9655-4103-a948-8f1276716654,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa18bcf41-96')
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.185 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.187 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.187 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:da:49:f4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.187 186548 INFO nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Using config drive
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.591 186548 INFO nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Creating config drive at /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk.config
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.595 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptxg3c2g7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.719 186548 DEBUG oslo_concurrency.processutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptxg3c2g7" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:45:22 compute-0 kernel: tapa18bcf41-96: entered promiscuous mode
Nov 22 08:45:22 compute-0 NetworkManager[55036]: <info>  [1763801122.7794] manager: (tapa18bcf41-96): new Tun device (/org/freedesktop/NetworkManager/Devices/399)
Nov 22 08:45:22 compute-0 ovn_controller[94843]: 2025-11-22T08:45:22Z|00852|binding|INFO|Claiming lport a18bcf41-9655-4103-a948-8f1276716654 for this chassis.
Nov 22 08:45:22 compute-0 ovn_controller[94843]: 2025-11-22T08:45:22Z|00853|binding|INFO|a18bcf41-9655-4103-a948-8f1276716654: Claiming fa:16:3e:da:49:f4 10.100.0.13
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.780 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.783 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.788 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.804 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:49:f4 10.100.0.13'], port_security=['fa:16:3e:da:49:f4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a2a084a1-24f5-46c7-a76c-25bfe7b4a286', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75a459da-4098-4237-9a69-6ce91c909b9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': '74b59ad3-9ee8-4065-9258-54596f8680d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=233785d2-7366-479b-b956-6331b9bfdb2d, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a18bcf41-9655-4103-a948-8f1276716654) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.805 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a18bcf41-9655-4103-a948-8f1276716654 in datapath 75a459da-4098-4237-9a69-6ce91c909b9c bound to our chassis
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.806 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 75a459da-4098-4237-9a69-6ce91c909b9c
Nov 22 08:45:22 compute-0 systemd-udevd[252622]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:45:22 compute-0 systemd-machined[152872]: New machine qemu-95-instance-000000b0.
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.817 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f302eb-ddb7-4dcd-ba0c-f0744637ca7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.818 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap75a459da-41 in ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.820 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap75a459da-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.820 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d62af5-27ce-46ea-8463-b2b6c82a902d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.821 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c24b7b55-1fb6-4b1b-a1df-094ddbc17f05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 NetworkManager[55036]: <info>  [1763801122.8257] device (tapa18bcf41-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:45:22 compute-0 NetworkManager[55036]: <info>  [1763801122.8267] device (tapa18bcf41-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.831 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[77b36c99-c84b-42ab-be35-1a1c86f44697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 systemd[1]: Started Virtual Machine qemu-95-instance-000000b0.
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.846 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 ovn_controller[94843]: 2025-11-22T08:45:22Z|00854|binding|INFO|Setting lport a18bcf41-9655-4103-a948-8f1276716654 ovn-installed in OVS
Nov 22 08:45:22 compute-0 ovn_controller[94843]: 2025-11-22T08:45:22Z|00855|binding|INFO|Setting lport a18bcf41-9655-4103-a948-8f1276716654 up in Southbound
Nov 22 08:45:22 compute-0 nova_compute[186544]: 2025-11-22 08:45:22.850 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.857 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[73a9489b-dfbf-4f1d-a08f-6fface18b332]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.886 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4d05f05b-6f57-4532-861b-1099dca6cb95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 systemd-udevd[252627]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.893 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dd314392-5adb-4734-9c70-10e086db381f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 NetworkManager[55036]: <info>  [1763801122.8953] manager: (tap75a459da-40): new Veth device (/org/freedesktop/NetworkManager/Devices/400)
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.926 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[e524f549-6f55-413d-9cda-ddc247ce27c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.930 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[1a719308-0a9c-4bcc-9ac0-f02bdd5b9208]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 NetworkManager[55036]: <info>  [1763801122.9529] device (tap75a459da-40): carrier: link connected
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.957 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfb7dce-22ec-49e2-a181-40a4654264a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.975 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7f19f801-d902-407d-87bd-d149b9bf6582]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75a459da-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:7c:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 255], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786359, 'reachable_time': 42630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252656, 'error': None, 'target': 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:22 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:22.990 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7df65980-5e6c-4393-92be-b1c17cf11551]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed3:7c24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 786359, 'tstamp': 786359}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252657, 'error': None, 'target': 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.005 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e4678e35-a868-4dc6-9d8d-360b4209e945]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap75a459da-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:7c:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 255], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786359, 'reachable_time': 42630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252658, 'error': None, 'target': 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.033 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3aff50bb-000e-4a79-abf4-fdf64127c484]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.092 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[453d31cc-62bb-4357-8f29-f14447d6595c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.094 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75a459da-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.094 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.095 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75a459da-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:45:23 compute-0 NetworkManager[55036]: <info>  [1763801123.0976] manager: (tap75a459da-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.097 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:23 compute-0 kernel: tap75a459da-40: entered promiscuous mode
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.100 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap75a459da-40, col_values=(('external_ids', {'iface-id': '39800cc0-ce78-44f5-a846-1b0efde3902d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:45:23 compute-0 ovn_controller[94843]: 2025-11-22T08:45:23Z|00856|binding|INFO|Releasing lport 39800cc0-ce78-44f5-a846-1b0efde3902d from this chassis (sb_readonly=0)
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.101 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.103 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.103 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/75a459da-4098-4237-9a69-6ce91c909b9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/75a459da-4098-4237-9a69-6ce91c909b9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.105 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[37e3f1bd-5052-40aa-be3b-4747d2b9a2fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.106 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-75a459da-4098-4237-9a69-6ce91c909b9c
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/75a459da-4098-4237-9a69-6ce91c909b9c.pid.haproxy
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 75a459da-4098-4237-9a69-6ce91c909b9c
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:45:23 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:23.107 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'env', 'PROCESS_TAG=haproxy-75a459da-4098-4237-9a69-6ce91c909b9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/75a459da-4098-4237-9a69-6ce91c909b9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.125 186548 DEBUG nova.network.neutron [req-968d716d-f7f0-41af-a5ff-5e5055c08d8e req-8c07b19a-aca2-4cd0-897e-85013a6e053f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updated VIF entry in instance network info cache for port a18bcf41-9655-4103-a948-8f1276716654. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.125 186548 DEBUG nova.network.neutron [req-968d716d-f7f0-41af-a5ff-5e5055c08d8e req-8c07b19a-aca2-4cd0-897e-85013a6e053f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.141 186548 DEBUG oslo_concurrency.lockutils [req-968d716d-f7f0-41af-a5ff-5e5055c08d8e req-8c07b19a-aca2-4cd0-897e-85013a6e053f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.349 186548 DEBUG nova.compute.manager [req-106b6cc7-2c2c-4481-98ea-6961c7593221 req-8711fbf5-60b9-4212-821d-d38088bd98f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.349 186548 DEBUG oslo_concurrency.lockutils [req-106b6cc7-2c2c-4481-98ea-6961c7593221 req-8711fbf5-60b9-4212-821d-d38088bd98f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.349 186548 DEBUG oslo_concurrency.lockutils [req-106b6cc7-2c2c-4481-98ea-6961c7593221 req-8711fbf5-60b9-4212-821d-d38088bd98f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.350 186548 DEBUG oslo_concurrency.lockutils [req-106b6cc7-2c2c-4481-98ea-6961c7593221 req-8711fbf5-60b9-4212-821d-d38088bd98f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.350 186548 DEBUG nova.compute.manager [req-106b6cc7-2c2c-4481-98ea-6961c7593221 req-8711fbf5-60b9-4212-821d-d38088bd98f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Processing event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:45:23 compute-0 podman[252693]: 2025-11-22 08:45:23.503863175 +0000 UTC m=+0.052393833 container create 015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.529 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801123.5282452, a2a084a1-24f5-46c7-a76c-25bfe7b4a286 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.529 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] VM Started (Lifecycle Event)
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.531 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.536 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:45:23 compute-0 systemd[1]: Started libpod-conmon-015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7.scope.
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.540 186548 INFO nova.virt.libvirt.driver [-] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Instance spawned successfully.
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.541 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.550 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.556 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.560 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.560 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.560 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.561 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.561 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.562 186548 DEBUG nova.virt.libvirt.driver [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:45:23 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:45:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23c88672e80d463ae2bf28997f85a4d73fb5b9c92614655aec9838a597db95a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:45:23 compute-0 podman[252693]: 2025-11-22 08:45:23.476394118 +0000 UTC m=+0.024924706 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.582 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.582 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801123.5288384, a2a084a1-24f5-46c7-a76c-25bfe7b4a286 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.582 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] VM Paused (Lifecycle Event)
Nov 22 08:45:23 compute-0 podman[252693]: 2025-11-22 08:45:23.583102361 +0000 UTC m=+0.131632939 container init 015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:45:23 compute-0 podman[252693]: 2025-11-22 08:45:23.588695229 +0000 UTC m=+0.137225797 container start 015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.608 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:45:23 compute-0 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[252710]: [NOTICE]   (252714) : New worker (252716) forked
Nov 22 08:45:23 compute-0 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[252710]: [NOTICE]   (252714) : Loading success.
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.624 186548 INFO nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Took 4.26 seconds to spawn the instance on the hypervisor.
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.625 186548 DEBUG nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.627 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801123.5357244, a2a084a1-24f5-46c7-a76c-25bfe7b4a286 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.627 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] VM Resumed (Lifecycle Event)
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.662 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.667 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.692 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.726 186548 INFO nova.compute.manager [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Took 4.72 seconds to build instance.
Nov 22 08:45:23 compute-0 nova_compute[186544]: 2025-11-22 08:45:23.746 186548 DEBUG oslo_concurrency.lockutils [None req-3c80d1a2-fe64-478b-b6b2-294659153d9c 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:25 compute-0 podman[252726]: 2025-11-22 08:45:25.412138201 +0000 UTC m=+0.060108724 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Nov 22 08:45:25 compute-0 podman[252727]: 2025-11-22 08:45:25.420343954 +0000 UTC m=+0.064095253 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:45:25 compute-0 podman[252725]: 2025-11-22 08:45:25.437434175 +0000 UTC m=+0.087030348 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:45:25 compute-0 nova_compute[186544]: 2025-11-22 08:45:25.458 186548 DEBUG nova.compute.manager [req-7bb96631-c7c0-47f2-8a83-53824af65748 req-47cdbf43-7153-4e0a-84e4-4e06e07dd668 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:45:25 compute-0 nova_compute[186544]: 2025-11-22 08:45:25.459 186548 DEBUG oslo_concurrency.lockutils [req-7bb96631-c7c0-47f2-8a83-53824af65748 req-47cdbf43-7153-4e0a-84e4-4e06e07dd668 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:25 compute-0 nova_compute[186544]: 2025-11-22 08:45:25.460 186548 DEBUG oslo_concurrency.lockutils [req-7bb96631-c7c0-47f2-8a83-53824af65748 req-47cdbf43-7153-4e0a-84e4-4e06e07dd668 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:25 compute-0 nova_compute[186544]: 2025-11-22 08:45:25.460 186548 DEBUG oslo_concurrency.lockutils [req-7bb96631-c7c0-47f2-8a83-53824af65748 req-47cdbf43-7153-4e0a-84e4-4e06e07dd668 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:25 compute-0 nova_compute[186544]: 2025-11-22 08:45:25.460 186548 DEBUG nova.compute.manager [req-7bb96631-c7c0-47f2-8a83-53824af65748 req-47cdbf43-7153-4e0a-84e4-4e06e07dd668 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] No waiting events found dispatching network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:45:25 compute-0 nova_compute[186544]: 2025-11-22 08:45:25.461 186548 WARNING nova.compute.manager [req-7bb96631-c7c0-47f2-8a83-53824af65748 req-47cdbf43-7153-4e0a-84e4-4e06e07dd668 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received unexpected event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 for instance with vm_state active and task_state None.
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.173 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.173 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.173 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.332 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.333 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.333 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.333 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid a2a084a1-24f5-46c7-a76c-25bfe7b4a286 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:45:26 compute-0 ovn_controller[94843]: 2025-11-22T08:45:26Z|00857|binding|INFO|Releasing lport 39800cc0-ce78-44f5-a846-1b0efde3902d from this chassis (sb_readonly=0)
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.632 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:26 compute-0 NetworkManager[55036]: <info>  [1763801126.6329] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/402)
Nov 22 08:45:26 compute-0 NetworkManager[55036]: <info>  [1763801126.6339] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/403)
Nov 22 08:45:26 compute-0 ovn_controller[94843]: 2025-11-22T08:45:26Z|00858|binding|INFO|Releasing lport 39800cc0-ce78-44f5-a846-1b0efde3902d from this chassis (sb_readonly=0)
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.664 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.668 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:26 compute-0 nova_compute[186544]: 2025-11-22 08:45:26.711 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.335 186548 DEBUG nova.compute.manager [req-aa44438b-150f-4a65-b62b-871f98ed7a7d req-3e445c32-08fd-4947-8fa3-2764fec195ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-changed-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.336 186548 DEBUG nova.compute.manager [req-aa44438b-150f-4a65-b62b-871f98ed7a7d req-3e445c32-08fd-4947-8fa3-2764fec195ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing instance network info cache due to event network-changed-a18bcf41-9655-4103-a948-8f1276716654. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.336 186548 DEBUG oslo_concurrency.lockutils [req-aa44438b-150f-4a65-b62b-871f98ed7a7d req-3e445c32-08fd-4947-8fa3-2764fec195ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.576 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.690 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.691 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.691 186548 DEBUG oslo_concurrency.lockutils [req-aa44438b-150f-4a65-b62b-871f98ed7a7d req-3e445c32-08fd-4947-8fa3-2764fec195ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.691 186548 DEBUG nova.network.neutron [req-aa44438b-150f-4a65-b62b-871f98ed7a7d req-3e445c32-08fd-4947-8fa3-2764fec195ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing network info cache for port a18bcf41-9655-4103-a948-8f1276716654 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.692 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:27 compute-0 nova_compute[186544]: 2025-11-22 08:45:27.692 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:45:28 compute-0 nova_compute[186544]: 2025-11-22 08:45:28.760 186548 DEBUG nova.network.neutron [req-aa44438b-150f-4a65-b62b-871f98ed7a7d req-3e445c32-08fd-4947-8fa3-2764fec195ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updated VIF entry in instance network info cache for port a18bcf41-9655-4103-a948-8f1276716654. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:45:28 compute-0 nova_compute[186544]: 2025-11-22 08:45:28.761 186548 DEBUG nova.network.neutron [req-aa44438b-150f-4a65-b62b-871f98ed7a7d req-3e445c32-08fd-4947-8fa3-2764fec195ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:45:28 compute-0 nova_compute[186544]: 2025-11-22 08:45:28.775 186548 DEBUG oslo_concurrency.lockutils [req-aa44438b-150f-4a65-b62b-871f98ed7a7d req-3e445c32-08fd-4947-8fa3-2764fec195ba 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:45:29 compute-0 nova_compute[186544]: 2025-11-22 08:45:29.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.191 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.257 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.315 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.316 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.370 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.509 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.510 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5564MB free_disk=73.13112258911133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.510 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.511 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.624 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance a2a084a1-24f5-46c7-a76c-25bfe7b4a286 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.625 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.625 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.689 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.708 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.713 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.734 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:45:31 compute-0 nova_compute[186544]: 2025-11-22 08:45:31.735 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:32 compute-0 nova_compute[186544]: 2025-11-22 08:45:32.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:34 compute-0 nova_compute[186544]: 2025-11-22 08:45:34.735 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:35 compute-0 nova_compute[186544]: 2025-11-22 08:45:35.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:36 compute-0 nova_compute[186544]: 2025-11-22 08:45:36.715 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:37 compute-0 nova_compute[186544]: 2025-11-22 08:45:37.143 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:37.375 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:37.376 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:45:37.376 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:37 compute-0 ovn_controller[94843]: 2025-11-22T08:45:37Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:da:49:f4 10.100.0.13
Nov 22 08:45:37 compute-0 ovn_controller[94843]: 2025-11-22T08:45:37Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:da:49:f4 10.100.0.13
Nov 22 08:45:39 compute-0 nova_compute[186544]: 2025-11-22 08:45:39.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:41 compute-0 nova_compute[186544]: 2025-11-22 08:45:41.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:41 compute-0 nova_compute[186544]: 2025-11-22 08:45:41.718 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:42 compute-0 nova_compute[186544]: 2025-11-22 08:45:42.146 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:42 compute-0 nova_compute[186544]: 2025-11-22 08:45:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:44 compute-0 nova_compute[186544]: 2025-11-22 08:45:44.143 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:45:44 compute-0 nova_compute[186544]: 2025-11-22 08:45:44.160 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Triggering sync for uuid a2a084a1-24f5-46c7-a76c-25bfe7b4a286 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 22 08:45:44 compute-0 nova_compute[186544]: 2025-11-22 08:45:44.160 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:45:44 compute-0 nova_compute[186544]: 2025-11-22 08:45:44.161 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:45:44 compute-0 nova_compute[186544]: 2025-11-22 08:45:44.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:45:45 compute-0 podman[252817]: 2025-11-22 08:45:45.419087928 +0000 UTC m=+0.054860636 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:45:45 compute-0 podman[252816]: 2025-11-22 08:45:45.419063767 +0000 UTC m=+0.056778692 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:45:45 compute-0 podman[252815]: 2025-11-22 08:45:45.441671785 +0000 UTC m=+0.087125342 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:45:45 compute-0 podman[252823]: 2025-11-22 08:45:45.455052074 +0000 UTC m=+0.083346997 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 22 08:45:46 compute-0 nova_compute[186544]: 2025-11-22 08:45:46.720 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:47 compute-0 nova_compute[186544]: 2025-11-22 08:45:47.148 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:51 compute-0 nova_compute[186544]: 2025-11-22 08:45:51.724 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:52 compute-0 nova_compute[186544]: 2025-11-22 08:45:52.149 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:56 compute-0 podman[252900]: 2025-11-22 08:45:56.405902803 +0000 UTC m=+0.048144758 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:45:56 compute-0 podman[252902]: 2025-11-22 08:45:56.41509054 +0000 UTC m=+0.051351158 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:45:56 compute-0 podman[252901]: 2025-11-22 08:45:56.42118607 +0000 UTC m=+0.060124254 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 08:45:56 compute-0 nova_compute[186544]: 2025-11-22 08:45:56.726 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:45:57 compute-0 nova_compute[186544]: 2025-11-22 08:45:57.151 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:01 compute-0 nova_compute[186544]: 2025-11-22 08:46:01.728 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:02 compute-0 nova_compute[186544]: 2025-11-22 08:46:02.152 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:06 compute-0 nova_compute[186544]: 2025-11-22 08:46:06.730 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:07 compute-0 nova_compute[186544]: 2025-11-22 08:46:07.154 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:11 compute-0 nova_compute[186544]: 2025-11-22 08:46:11.733 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:11 compute-0 ovn_controller[94843]: 2025-11-22T08:46:11Z|00859|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 22 08:46:12 compute-0 nova_compute[186544]: 2025-11-22 08:46:12.156 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:14 compute-0 nova_compute[186544]: 2025-11-22 08:46:14.899 186548 INFO nova.compute.manager [None req-2e7a7d59-c827-4cd5-9d0a-7557bc8e8e62 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Get console output
Nov 22 08:46:14 compute-0 nova_compute[186544]: 2025-11-22 08:46:14.904 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:46:16 compute-0 nova_compute[186544]: 2025-11-22 08:46:16.210 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:16.210 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:46:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:16.211 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:46:16 compute-0 podman[252962]: 2025-11-22 08:46:16.410017671 +0000 UTC m=+0.054251170 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 08:46:16 compute-0 podman[252963]: 2025-11-22 08:46:16.41037115 +0000 UTC m=+0.052359073 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:46:16 compute-0 podman[252964]: 2025-11-22 08:46:16.421016493 +0000 UTC m=+0.059699485 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:46:16 compute-0 podman[252965]: 2025-11-22 08:46:16.443803655 +0000 UTC m=+0.077303539 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:46:16 compute-0 nova_compute[186544]: 2025-11-22 08:46:16.735 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.095 186548 DEBUG nova.compute.manager [req-861ab372-8ae4-4def-9376-8c89f2e793e3 req-14e68a39-f65a-4d61-8071-153deac54224 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-changed-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.095 186548 DEBUG nova.compute.manager [req-861ab372-8ae4-4def-9376-8c89f2e793e3 req-14e68a39-f65a-4d61-8071-153deac54224 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing instance network info cache due to event network-changed-a18bcf41-9655-4103-a948-8f1276716654. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.095 186548 DEBUG oslo_concurrency.lockutils [req-861ab372-8ae4-4def-9376-8c89f2e793e3 req-14e68a39-f65a-4d61-8071-153deac54224 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.095 186548 DEBUG oslo_concurrency.lockutils [req-861ab372-8ae4-4def-9376-8c89f2e793e3 req-14e68a39-f65a-4d61-8071-153deac54224 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.096 186548 DEBUG nova.network.neutron [req-861ab372-8ae4-4def-9376-8c89f2e793e3 req-14e68a39-f65a-4d61-8071-153deac54224 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing network info cache for port a18bcf41-9655-4103-a948-8f1276716654 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.158 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.195 186548 DEBUG nova.compute.manager [req-9e667bb0-a653-4cea-b05c-f40d8bc244ca req-c869603c-8a2c-4d6a-90df-321217b2996e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-unplugged-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.196 186548 DEBUG oslo_concurrency.lockutils [req-9e667bb0-a653-4cea-b05c-f40d8bc244ca req-c869603c-8a2c-4d6a-90df-321217b2996e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.196 186548 DEBUG oslo_concurrency.lockutils [req-9e667bb0-a653-4cea-b05c-f40d8bc244ca req-c869603c-8a2c-4d6a-90df-321217b2996e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.197 186548 DEBUG oslo_concurrency.lockutils [req-9e667bb0-a653-4cea-b05c-f40d8bc244ca req-c869603c-8a2c-4d6a-90df-321217b2996e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.197 186548 DEBUG nova.compute.manager [req-9e667bb0-a653-4cea-b05c-f40d8bc244ca req-c869603c-8a2c-4d6a-90df-321217b2996e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] No waiting events found dispatching network-vif-unplugged-a18bcf41-9655-4103-a948-8f1276716654 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.197 186548 WARNING nova.compute.manager [req-9e667bb0-a653-4cea-b05c-f40d8bc244ca req-c869603c-8a2c-4d6a-90df-321217b2996e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received unexpected event network-vif-unplugged-a18bcf41-9655-4103-a948-8f1276716654 for instance with vm_state active and task_state None.
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.884 186548 INFO nova.compute.manager [None req-90aeac4c-ae70-4a9a-a6f0-ee0f9f269055 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Get console output
Nov 22 08:46:17 compute-0 nova_compute[186544]: 2025-11-22 08:46:17.889 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:46:18 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:18.213 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:46:18 compute-0 nova_compute[186544]: 2025-11-22 08:46:18.879 186548 DEBUG nova.network.neutron [req-861ab372-8ae4-4def-9376-8c89f2e793e3 req-14e68a39-f65a-4d61-8071-153deac54224 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updated VIF entry in instance network info cache for port a18bcf41-9655-4103-a948-8f1276716654. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:46:18 compute-0 nova_compute[186544]: 2025-11-22 08:46:18.880 186548 DEBUG nova.network.neutron [req-861ab372-8ae4-4def-9376-8c89f2e793e3 req-14e68a39-f65a-4d61-8071-153deac54224 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:46:18 compute-0 nova_compute[186544]: 2025-11-22 08:46:18.894 186548 DEBUG oslo_concurrency.lockutils [req-861ab372-8ae4-4def-9376-8c89f2e793e3 req-14e68a39-f65a-4d61-8071-153deac54224 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.287 186548 DEBUG nova.compute.manager [req-0e8e1de1-503a-4bbb-b5c0-7f02592f729c req-ea05f494-ed44-488c-92a0-471bb3bf2f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.288 186548 DEBUG oslo_concurrency.lockutils [req-0e8e1de1-503a-4bbb-b5c0-7f02592f729c req-ea05f494-ed44-488c-92a0-471bb3bf2f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.288 186548 DEBUG oslo_concurrency.lockutils [req-0e8e1de1-503a-4bbb-b5c0-7f02592f729c req-ea05f494-ed44-488c-92a0-471bb3bf2f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.288 186548 DEBUG oslo_concurrency.lockutils [req-0e8e1de1-503a-4bbb-b5c0-7f02592f729c req-ea05f494-ed44-488c-92a0-471bb3bf2f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.288 186548 DEBUG nova.compute.manager [req-0e8e1de1-503a-4bbb-b5c0-7f02592f729c req-ea05f494-ed44-488c-92a0-471bb3bf2f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] No waiting events found dispatching network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.289 186548 WARNING nova.compute.manager [req-0e8e1de1-503a-4bbb-b5c0-7f02592f729c req-ea05f494-ed44-488c-92a0-471bb3bf2f3a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received unexpected event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 for instance with vm_state active and task_state None.
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.509 186548 DEBUG nova.compute.manager [req-9e6c9b01-2db9-4467-9109-e3816d24ad10 req-aab9e1ce-f531-4ae4-a7bb-33aa8055f72a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-changed-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.509 186548 DEBUG nova.compute.manager [req-9e6c9b01-2db9-4467-9109-e3816d24ad10 req-aab9e1ce-f531-4ae4-a7bb-33aa8055f72a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing instance network info cache due to event network-changed-a18bcf41-9655-4103-a948-8f1276716654. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.509 186548 DEBUG oslo_concurrency.lockutils [req-9e6c9b01-2db9-4467-9109-e3816d24ad10 req-aab9e1ce-f531-4ae4-a7bb-33aa8055f72a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.510 186548 DEBUG oslo_concurrency.lockutils [req-9e6c9b01-2db9-4467-9109-e3816d24ad10 req-aab9e1ce-f531-4ae4-a7bb-33aa8055f72a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:46:19 compute-0 nova_compute[186544]: 2025-11-22 08:46:19.510 186548 DEBUG nova.network.neutron [req-9e6c9b01-2db9-4467-9109-e3816d24ad10 req-aab9e1ce-f531-4ae4-a7bb-33aa8055f72a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing network info cache for port a18bcf41-9655-4103-a948-8f1276716654 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:46:20 compute-0 nova_compute[186544]: 2025-11-22 08:46:20.215 186548 INFO nova.compute.manager [None req-2a080f61-5131-41f2-934a-9e2e9bffb120 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Get console output
Nov 22 08:46:20 compute-0 nova_compute[186544]: 2025-11-22 08:46:20.223 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.436 186548 DEBUG nova.compute.manager [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.436 186548 DEBUG oslo_concurrency.lockutils [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.437 186548 DEBUG oslo_concurrency.lockutils [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.437 186548 DEBUG oslo_concurrency.lockutils [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.437 186548 DEBUG nova.compute.manager [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] No waiting events found dispatching network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.437 186548 WARNING nova.compute.manager [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received unexpected event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 for instance with vm_state active and task_state None.
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.438 186548 DEBUG nova.compute.manager [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.438 186548 DEBUG oslo_concurrency.lockutils [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.438 186548 DEBUG oslo_concurrency.lockutils [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.438 186548 DEBUG oslo_concurrency.lockutils [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.439 186548 DEBUG nova.compute.manager [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] No waiting events found dispatching network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.439 186548 WARNING nova.compute.manager [req-0fc21440-6049-44bc-a3c6-372f15d547d6 req-7ce5ee34-336f-4f88-8116-c7c00eb27c1b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received unexpected event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 for instance with vm_state active and task_state None.
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.700 186548 DEBUG nova.network.neutron [req-9e6c9b01-2db9-4467-9109-e3816d24ad10 req-aab9e1ce-f531-4ae4-a7bb-33aa8055f72a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updated VIF entry in instance network info cache for port a18bcf41-9655-4103-a948-8f1276716654. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.701 186548 DEBUG nova.network.neutron [req-9e6c9b01-2db9-4467-9109-e3816d24ad10 req-aab9e1ce-f531-4ae4-a7bb-33aa8055f72a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.716 186548 DEBUG oslo_concurrency.lockutils [req-9e6c9b01-2db9-4467-9109-e3816d24ad10 req-aab9e1ce-f531-4ae4-a7bb-33aa8055f72a 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:46:21 compute-0 nova_compute[186544]: 2025-11-22 08:46:21.737 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:22 compute-0 nova_compute[186544]: 2025-11-22 08:46:22.160 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.179 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.316 186548 DEBUG nova.compute.manager [req-ff8bd66d-f8bb-44e0-8ca3-8e0960d0ac8d req-c626dcd1-9ebb-443e-b59a-01c9f4a65600 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-changed-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.317 186548 DEBUG nova.compute.manager [req-ff8bd66d-f8bb-44e0-8ca3-8e0960d0ac8d req-c626dcd1-9ebb-443e-b59a-01c9f4a65600 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing instance network info cache due to event network-changed-a18bcf41-9655-4103-a948-8f1276716654. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.317 186548 DEBUG oslo_concurrency.lockutils [req-ff8bd66d-f8bb-44e0-8ca3-8e0960d0ac8d req-c626dcd1-9ebb-443e-b59a-01c9f4a65600 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.317 186548 DEBUG oslo_concurrency.lockutils [req-ff8bd66d-f8bb-44e0-8ca3-8e0960d0ac8d req-c626dcd1-9ebb-443e-b59a-01c9f4a65600 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.317 186548 DEBUG nova.network.neutron [req-ff8bd66d-f8bb-44e0-8ca3-8e0960d0ac8d req-c626dcd1-9ebb-443e-b59a-01c9f4a65600 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Refreshing network info cache for port a18bcf41-9655-4103-a948-8f1276716654 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.372 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.372 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.373 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.373 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.373 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.384 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.385 186548 INFO nova.compute.manager [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Terminating instance
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.392 186548 DEBUG nova.compute.manager [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:46:26 compute-0 kernel: tapa18bcf41-96 (unregistering): left promiscuous mode
Nov 22 08:46:26 compute-0 NetworkManager[55036]: <info>  [1763801186.4153] device (tapa18bcf41-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00860|binding|INFO|Releasing lport a18bcf41-9655-4103-a948-8f1276716654 from this chassis (sb_readonly=0)
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00861|binding|INFO|Setting lport a18bcf41-9655-4103-a948-8f1276716654 down in Southbound
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.425 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00862|binding|INFO|Removing iface tapa18bcf41-96 ovn-installed in OVS
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.427 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.443 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.463 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:49:f4 10.100.0.13'], port_security=['fa:16:3e:da:49:f4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a2a084a1-24f5-46c7-a76c-25bfe7b4a286', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75a459da-4098-4237-9a69-6ce91c909b9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '8', 'neutron:security_group_ids': '74b59ad3-9ee8-4065-9258-54596f8680d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=233785d2-7366-479b-b956-6331b9bfdb2d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a18bcf41-9655-4103-a948-8f1276716654) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.464 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a18bcf41-9655-4103-a948-8f1276716654 in datapath 75a459da-4098-4237-9a69-6ce91c909b9c unbound from our chassis
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.465 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75a459da-4098-4237-9a69-6ce91c909b9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.466 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[949690b0-d1dc-49f8-8985-4f7fe0fc8ea7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.467 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c namespace which is not needed anymore
Nov 22 08:46:26 compute-0 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Nov 22 08:46:26 compute-0 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000b0.scope: Consumed 15.937s CPU time.
Nov 22 08:46:26 compute-0 systemd-machined[152872]: Machine qemu-95-instance-000000b0 terminated.
Nov 22 08:46:26 compute-0 podman[253044]: 2025-11-22 08:46:26.50774147 +0000 UTC m=+0.066988044 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:46:26 compute-0 podman[253048]: 2025-11-22 08:46:26.511115243 +0000 UTC m=+0.061201001 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:46:26 compute-0 podman[253047]: 2025-11-22 08:46:26.536627472 +0000 UTC m=+0.067331092 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 08:46:26 compute-0 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[252710]: [NOTICE]   (252714) : haproxy version is 2.8.14-c23fe91
Nov 22 08:46:26 compute-0 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[252710]: [NOTICE]   (252714) : path to executable is /usr/sbin/haproxy
Nov 22 08:46:26 compute-0 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[252710]: [WARNING]  (252714) : Exiting Master process...
Nov 22 08:46:26 compute-0 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[252710]: [ALERT]    (252714) : Current worker (252716) exited with code 143 (Terminated)
Nov 22 08:46:26 compute-0 neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c[252710]: [WARNING]  (252714) : All workers exited. Exiting... (0)
Nov 22 08:46:26 compute-0 systemd[1]: libpod-015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7.scope: Deactivated successfully.
Nov 22 08:46:26 compute-0 kernel: tapa18bcf41-96: entered promiscuous mode
Nov 22 08:46:26 compute-0 NetworkManager[55036]: <info>  [1763801186.6173] manager: (tapa18bcf41-96): new Tun device (/org/freedesktop/NetworkManager/Devices/404)
Nov 22 08:46:26 compute-0 kernel: tapa18bcf41-96 (unregistering): left promiscuous mode
Nov 22 08:46:26 compute-0 podman[253122]: 2025-11-22 08:46:26.617723944 +0000 UTC m=+0.056844803 container died 015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00863|binding|INFO|Claiming lport a18bcf41-9655-4103-a948-8f1276716654 for this chassis.
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00864|binding|INFO|a18bcf41-9655-4103-a948-8f1276716654: Claiming fa:16:3e:da:49:f4 10.100.0.13
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.621 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.631 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:49:f4 10.100.0.13'], port_security=['fa:16:3e:da:49:f4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a2a084a1-24f5-46c7-a76c-25bfe7b4a286', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75a459da-4098-4237-9a69-6ce91c909b9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '8', 'neutron:security_group_ids': '74b59ad3-9ee8-4065-9258-54596f8680d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=233785d2-7366-479b-b956-6331b9bfdb2d, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a18bcf41-9655-4103-a948-8f1276716654) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00865|binding|INFO|Setting lport a18bcf41-9655-4103-a948-8f1276716654 ovn-installed in OVS
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00866|binding|INFO|Setting lport a18bcf41-9655-4103-a948-8f1276716654 up in Southbound
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.639 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.640 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00867|binding|INFO|Releasing lport a18bcf41-9655-4103-a948-8f1276716654 from this chassis (sb_readonly=1)
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00868|if_status|INFO|Dropped 5 log messages in last 1423 seconds (most recently, 1423 seconds ago) due to excessive rate
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00869|if_status|INFO|Not setting lport a18bcf41-9655-4103-a948-8f1276716654 down as sb is readonly
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00870|binding|INFO|Removing iface tapa18bcf41-96 ovn-installed in OVS
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.641 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00871|binding|INFO|Releasing lport a18bcf41-9655-4103-a948-8f1276716654 from this chassis (sb_readonly=1)
Nov 22 08:46:26 compute-0 ovn_controller[94843]: 2025-11-22T08:46:26Z|00872|binding|INFO|Setting lport a18bcf41-9655-4103-a948-8f1276716654 down in Southbound
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.654 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.656 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:49:f4 10.100.0.13'], port_security=['fa:16:3e:da:49:f4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a2a084a1-24f5-46c7-a76c-25bfe7b4a286', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75a459da-4098-4237-9a69-6ce91c909b9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '8', 'neutron:security_group_ids': '74b59ad3-9ee8-4065-9258-54596f8680d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=233785d2-7366-479b-b956-6331b9bfdb2d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a18bcf41-9655-4103-a948-8f1276716654) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.674 186548 INFO nova.virt.libvirt.driver [-] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Instance destroyed successfully.
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.675 186548 DEBUG nova.objects.instance [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid a2a084a1-24f5-46c7-a76c-25bfe7b4a286 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:46:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7-userdata-shm.mount: Deactivated successfully.
Nov 22 08:46:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-23c88672e80d463ae2bf28997f85a4d73fb5b9c92614655aec9838a597db95a7-merged.mount: Deactivated successfully.
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.688 186548 DEBUG nova.virt.libvirt.vif [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:45:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2118287641',display_name='tempest-TestNetworkBasicOps-server-2118287641',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2118287641',id=176,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCzR+9MOqXw8TBBpsaw/IGWRtEptID1j8riDZ0v3IyOTb6fraFLij0X6Vh1uCEygecc6geFg0ggERXTcmILgqRqG2INhfMi/D9TMR2Ak3+WIDqG4qsd4hRi6d7c9daFBIg==',key_name='tempest-TestNetworkBasicOps-1133952864',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:45:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-i7rx9jbw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:45:23Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=a2a084a1-24f5-46c7-a76c-25bfe7b4a286,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.689 186548 DEBUG nova.network.os_vif_util [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.689 186548 DEBUG nova.network.os_vif_util [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:da:49:f4,bridge_name='br-int',has_traffic_filtering=True,id=a18bcf41-9655-4103-a948-8f1276716654,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa18bcf41-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.689 186548 DEBUG os_vif [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:49:f4,bridge_name='br-int',has_traffic_filtering=True,id=a18bcf41-9655-4103-a948-8f1276716654,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa18bcf41-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.691 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.691 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa18bcf41-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.694 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.696 186548 INFO os_vif [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:49:f4,bridge_name='br-int',has_traffic_filtering=True,id=a18bcf41-9655-4103-a948-8f1276716654,network=Network(75a459da-4098-4237-9a69-6ce91c909b9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa18bcf41-96')
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.696 186548 INFO nova.virt.libvirt.driver [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Deleting instance files /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286_del
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.697 186548 INFO nova.virt.libvirt.driver [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Deletion of /var/lib/nova/instances/a2a084a1-24f5-46c7-a76c-25bfe7b4a286_del complete
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.705 186548 DEBUG nova.compute.manager [req-d56a8e1c-93dc-4952-be61-5bf495319236 req-aff457b2-2d1f-404c-8979-bea5734d5b17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-unplugged-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.705 186548 DEBUG oslo_concurrency.lockutils [req-d56a8e1c-93dc-4952-be61-5bf495319236 req-aff457b2-2d1f-404c-8979-bea5734d5b17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.705 186548 DEBUG oslo_concurrency.lockutils [req-d56a8e1c-93dc-4952-be61-5bf495319236 req-aff457b2-2d1f-404c-8979-bea5734d5b17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.706 186548 DEBUG oslo_concurrency.lockutils [req-d56a8e1c-93dc-4952-be61-5bf495319236 req-aff457b2-2d1f-404c-8979-bea5734d5b17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.706 186548 DEBUG nova.compute.manager [req-d56a8e1c-93dc-4952-be61-5bf495319236 req-aff457b2-2d1f-404c-8979-bea5734d5b17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] No waiting events found dispatching network-vif-unplugged-a18bcf41-9655-4103-a948-8f1276716654 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.706 186548 DEBUG nova.compute.manager [req-d56a8e1c-93dc-4952-be61-5bf495319236 req-aff457b2-2d1f-404c-8979-bea5734d5b17 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-unplugged-a18bcf41-9655-4103-a948-8f1276716654 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:46:26 compute-0 podman[253122]: 2025-11-22 08:46:26.713301422 +0000 UTC m=+0.152422281 container cleanup 015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:46:26 compute-0 systemd[1]: libpod-conmon-015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7.scope: Deactivated successfully.
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.739 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.768 186548 INFO nova.compute.manager [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.769 186548 DEBUG oslo.service.loopingcall [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.769 186548 DEBUG nova.compute.manager [-] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.769 186548 DEBUG nova.network.neutron [-] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:46:26 compute-0 podman[253163]: 2025-11-22 08:46:26.802914503 +0000 UTC m=+0.067472506 container remove 015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.808 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d8fed808-1364-4de9-8f5b-6acc7739ebd9]: (4, ('Sat Nov 22 08:46:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c (015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7)\n015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7\nSat Nov 22 08:46:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c (015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7)\n015adda13752d7ef0bb4721c9a9fa4809f0ce8bab8ce1e40d2e96cd19ffb4ba7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.810 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9a168c07-4eea-4d5d-a1c9-dbc99a84742e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.811 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75a459da-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.813 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 kernel: tap75a459da-40: left promiscuous mode
Nov 22 08:46:26 compute-0 nova_compute[186544]: 2025-11-22 08:46:26.826 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.828 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[28fd23b3-62fc-4f01-9eaf-e4695e763972]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.843 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[33db7cd3-2033-4a05-a2a3-eeb3996de2f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.844 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8af90210-48d2-4a16-b558-c58418416391]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.859 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ea39b727-d3e3-4bd3-8691-849aaedf0550]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786352, 'reachable_time': 43855, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253178, 'error': None, 'target': 'ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d75a459da\x2d4098\x2d4237\x2d9a69\x2d6ce91c909b9c.mount: Deactivated successfully.
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.863 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-75a459da-4098-4237-9a69-6ce91c909b9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.863 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e692e1-ff16-441f-9d21-a25bb7a06689]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.865 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a18bcf41-9655-4103-a948-8f1276716654 in datapath 75a459da-4098-4237-9a69-6ce91c909b9c unbound from our chassis
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.865 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75a459da-4098-4237-9a69-6ce91c909b9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.866 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6940856e-3664-4581-963b-ae9f619401b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.867 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a18bcf41-9655-4103-a948-8f1276716654 in datapath 75a459da-4098-4237-9a69-6ce91c909b9c unbound from our chassis
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.868 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75a459da-4098-4237-9a69-6ce91c909b9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:46:26 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:26.868 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d413f5a4-848c-4caa-8f91-c9fa54e7bb02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:46:28 compute-0 nova_compute[186544]: 2025-11-22 08:46:28.944 186548 DEBUG nova.compute.manager [req-75437118-9303-4b23-8ac7-0d64c3734436 req-4a00a99d-faab-485f-aa6e-72deb9e853ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:28 compute-0 nova_compute[186544]: 2025-11-22 08:46:28.945 186548 DEBUG oslo_concurrency.lockutils [req-75437118-9303-4b23-8ac7-0d64c3734436 req-4a00a99d-faab-485f-aa6e-72deb9e853ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:28 compute-0 nova_compute[186544]: 2025-11-22 08:46:28.946 186548 DEBUG oslo_concurrency.lockutils [req-75437118-9303-4b23-8ac7-0d64c3734436 req-4a00a99d-faab-485f-aa6e-72deb9e853ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:28 compute-0 nova_compute[186544]: 2025-11-22 08:46:28.946 186548 DEBUG oslo_concurrency.lockutils [req-75437118-9303-4b23-8ac7-0d64c3734436 req-4a00a99d-faab-485f-aa6e-72deb9e853ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:28 compute-0 nova_compute[186544]: 2025-11-22 08:46:28.946 186548 DEBUG nova.compute.manager [req-75437118-9303-4b23-8ac7-0d64c3734436 req-4a00a99d-faab-485f-aa6e-72deb9e853ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] No waiting events found dispatching network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:46:28 compute-0 nova_compute[186544]: 2025-11-22 08:46:28.947 186548 WARNING nova.compute.manager [req-75437118-9303-4b23-8ac7-0d64c3734436 req-4a00a99d-faab-485f-aa6e-72deb9e853ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received unexpected event network-vif-plugged-a18bcf41-9655-4103-a948-8f1276716654 for instance with vm_state active and task_state deleting.
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.035 186548 DEBUG nova.network.neutron [-] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.061 186548 INFO nova.compute.manager [-] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Took 4.29 seconds to deallocate network for instance.
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.092 186548 DEBUG nova.compute.manager [req-f322d555-f418-44d1-a458-fc327c98db19 req-61003db0-0b53-40f1-ac6e-d1c586220a99 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Received event network-vif-deleted-a18bcf41-9655-4103-a948-8f1276716654 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.126 186548 DEBUG nova.network.neutron [req-ff8bd66d-f8bb-44e0-8ca3-8e0960d0ac8d req-c626dcd1-9ebb-443e-b59a-01c9f4a65600 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updated VIF entry in instance network info cache for port a18bcf41-9655-4103-a948-8f1276716654. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.127 186548 DEBUG nova.network.neutron [req-ff8bd66d-f8bb-44e0-8ca3-8e0960d0ac8d req-c626dcd1-9ebb-443e-b59a-01c9f4a65600 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [{"id": "a18bcf41-9655-4103-a948-8f1276716654", "address": "fa:16:3e:da:49:f4", "network": {"id": "75a459da-4098-4237-9a69-6ce91c909b9c", "bridge": "br-int", "label": "tempest-network-smoke--86102444", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa18bcf41-96", "ovs_interfaceid": "a18bcf41-9655-4103-a948-8f1276716654", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.325 186548 DEBUG oslo_concurrency.lockutils [req-ff8bd66d-f8bb-44e0-8ca3-8e0960d0ac8d req-c626dcd1-9ebb-443e-b59a-01c9f4a65600 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.325 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.326 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.326 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid a2a084a1-24f5-46c7-a76c-25bfe7b4a286 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.404 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.405 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.436 186548 DEBUG nova.scheduler.client.report [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.454 186548 DEBUG nova.scheduler.client.report [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.455 186548 DEBUG nova.compute.provider_tree [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.490 186548 DEBUG nova.scheduler.client.report [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.520 186548 DEBUG nova.scheduler.client.report [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.577 186548 DEBUG nova.compute.provider_tree [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.596 186548 DEBUG nova.scheduler.client.report [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.622 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.669 186548 INFO nova.scheduler.client.report [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance a2a084a1-24f5-46c7-a76c-25bfe7b4a286
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.693 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.740 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:31 compute-0 nova_compute[186544]: 2025-11-22 08:46:31.809 186548 DEBUG oslo_concurrency.lockutils [None req-a8ed4812-c7c0-484d-b2c0-6a2b27f4ac9f 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a2a084a1-24f5-46c7-a76c-25bfe7b4a286" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.655 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.893 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-a2a084a1-24f5-46c7-a76c-25bfe7b4a286" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.894 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.894 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.894 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.894 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.895 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.916 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.916 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.917 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:32 compute-0 nova_compute[186544]: 2025-11-22 08:46:32.917 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.054 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.055 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5710MB free_disk=73.13188934326172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.055 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.055 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.819 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.820 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.849 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.862 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.888 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:46:33 compute-0 nova_compute[186544]: 2025-11-22 08:46:33.888 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:46:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:46:36 compute-0 nova_compute[186544]: 2025-11-22 08:46:36.696 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:36 compute-0 nova_compute[186544]: 2025-11-22 08:46:36.742 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:37 compute-0 nova_compute[186544]: 2025-11-22 08:46:37.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:37 compute-0 nova_compute[186544]: 2025-11-22 08:46:37.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:37.376 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:46:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:37.377 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:46:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:46:37.377 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:46:38 compute-0 nova_compute[186544]: 2025-11-22 08:46:38.429 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:38 compute-0 nova_compute[186544]: 2025-11-22 08:46:38.496 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:39 compute-0 nova_compute[186544]: 2025-11-22 08:46:39.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:41 compute-0 nova_compute[186544]: 2025-11-22 08:46:41.672 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801186.6716495, a2a084a1-24f5-46c7-a76c-25bfe7b4a286 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:46:41 compute-0 nova_compute[186544]: 2025-11-22 08:46:41.673 186548 INFO nova.compute.manager [-] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] VM Stopped (Lifecycle Event)
Nov 22 08:46:41 compute-0 nova_compute[186544]: 2025-11-22 08:46:41.698 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:41 compute-0 nova_compute[186544]: 2025-11-22 08:46:41.745 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:41 compute-0 nova_compute[186544]: 2025-11-22 08:46:41.876 186548 DEBUG nova.compute.manager [None req-56d2283b-5738-4eeb-83ca-1cf4694885df - - - - - -] [instance: a2a084a1-24f5-46c7-a76c-25bfe7b4a286] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:46:42 compute-0 nova_compute[186544]: 2025-11-22 08:46:42.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:44 compute-0 nova_compute[186544]: 2025-11-22 08:46:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:46 compute-0 nova_compute[186544]: 2025-11-22 08:46:46.700 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:46 compute-0 nova_compute[186544]: 2025-11-22 08:46:46.747 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:47 compute-0 podman[253182]: 2025-11-22 08:46:47.422253703 +0000 UTC m=+0.072813718 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:46:47 compute-0 podman[253184]: 2025-11-22 08:46:47.43634049 +0000 UTC m=+0.068985773 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:46:47 compute-0 podman[253183]: 2025-11-22 08:46:47.438412932 +0000 UTC m=+0.085636285 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:46:47 compute-0 podman[253185]: 2025-11-22 08:46:47.458114568 +0000 UTC m=+0.096357089 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:46:51 compute-0 nova_compute[186544]: 2025-11-22 08:46:51.702 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:51 compute-0 nova_compute[186544]: 2025-11-22 08:46:51.748 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:53 compute-0 nova_compute[186544]: 2025-11-22 08:46:53.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:46:56 compute-0 nova_compute[186544]: 2025-11-22 08:46:56.704 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:56 compute-0 nova_compute[186544]: 2025-11-22 08:46:56.749 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:46:57 compute-0 podman[253265]: 2025-11-22 08:46:57.409729801 +0000 UTC m=+0.057369437 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, vcs-type=git, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 08:46:57 compute-0 podman[253266]: 2025-11-22 08:46:57.41497847 +0000 UTC m=+0.059733335 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 08:46:57 compute-0 podman[253264]: 2025-11-22 08:46:57.441330081 +0000 UTC m=+0.086077146 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:47:00 compute-0 nova_compute[186544]: 2025-11-22 08:47:00.907 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:00 compute-0 nova_compute[186544]: 2025-11-22 08:47:00.908 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:00 compute-0 nova_compute[186544]: 2025-11-22 08:47:00.948 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.193 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.194 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.314 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.315 186548 INFO nova.compute.claims [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.667 186548 DEBUG nova.compute.provider_tree [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.678 186548 DEBUG nova.scheduler.client.report [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.706 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.751 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.760 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.760 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.944 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:47:01 compute-0 nova_compute[186544]: 2025-11-22 08:47:01.945 186548 DEBUG nova.network.neutron [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.169 186548 INFO nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.283 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.316 186548 DEBUG nova.policy [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '033a5e424a0a42afa21b67c28d79d1f4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '12f63a6d87a947758ab928c0d625ff06', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.651 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.652 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.653 186548 INFO nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Creating image(s)
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.653 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "/var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.654 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.654 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "/var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.666 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.737 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.738 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.738 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.749 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.803 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:47:02 compute-0 nova_compute[186544]: 2025-11-22 08:47:02.804 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.097 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk 1073741824" returned: 0 in 0.293s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.098 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.099 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.154 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.156 186548 DEBUG nova.virt.disk.api [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Checking if we can resize image /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.156 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.212 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.213 186548 DEBUG nova.virt.disk.api [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Cannot resize image /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.213 186548 DEBUG nova.objects.instance [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'migration_context' on Instance uuid a7cff8e0-bed1-44f0-bf56-3e2e92438f70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.232 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.233 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Ensure instance console log exists: /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.233 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.234 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:03 compute-0 nova_compute[186544]: 2025-11-22 08:47:03.234 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:05 compute-0 nova_compute[186544]: 2025-11-22 08:47:05.974 186548 DEBUG nova.network.neutron [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Successfully created port: d94b746b-db17-4e24-9b5b-3e9d317b172b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:47:06 compute-0 nova_compute[186544]: 2025-11-22 08:47:06.708 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:06 compute-0 nova_compute[186544]: 2025-11-22 08:47:06.753 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:09 compute-0 nova_compute[186544]: 2025-11-22 08:47:09.359 186548 DEBUG nova.network.neutron [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Successfully updated port: d94b746b-db17-4e24-9b5b-3e9d317b172b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:47:09 compute-0 nova_compute[186544]: 2025-11-22 08:47:09.400 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:47:09 compute-0 nova_compute[186544]: 2025-11-22 08:47:09.400 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquired lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:47:09 compute-0 nova_compute[186544]: 2025-11-22 08:47:09.400 186548 DEBUG nova.network.neutron [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:47:09 compute-0 nova_compute[186544]: 2025-11-22 08:47:09.495 186548 DEBUG nova.compute.manager [req-143fdb93-2b40-46cd-a14c-473a104775a5 req-6e074044-d66b-45ba-b9d4-9a848db2159f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-changed-d94b746b-db17-4e24-9b5b-3e9d317b172b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:47:09 compute-0 nova_compute[186544]: 2025-11-22 08:47:09.496 186548 DEBUG nova.compute.manager [req-143fdb93-2b40-46cd-a14c-473a104775a5 req-6e074044-d66b-45ba-b9d4-9a848db2159f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Refreshing instance network info cache due to event network-changed-d94b746b-db17-4e24-9b5b-3e9d317b172b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:47:09 compute-0 nova_compute[186544]: 2025-11-22 08:47:09.496 186548 DEBUG oslo_concurrency.lockutils [req-143fdb93-2b40-46cd-a14c-473a104775a5 req-6e074044-d66b-45ba-b9d4-9a848db2159f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:47:10 compute-0 nova_compute[186544]: 2025-11-22 08:47:10.301 186548 DEBUG nova.network.neutron [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.311 186548 DEBUG nova.network.neutron [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updating instance_info_cache with network_info: [{"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.434 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Releasing lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.435 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Instance network_info: |[{"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.435 186548 DEBUG oslo_concurrency.lockutils [req-143fdb93-2b40-46cd-a14c-473a104775a5 req-6e074044-d66b-45ba-b9d4-9a848db2159f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.435 186548 DEBUG nova.network.neutron [req-143fdb93-2b40-46cd-a14c-473a104775a5 req-6e074044-d66b-45ba-b9d4-9a848db2159f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Refreshing network info cache for port d94b746b-db17-4e24-9b5b-3e9d317b172b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.439 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Start _get_guest_xml network_info=[{"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.444 186548 WARNING nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.450 186548 DEBUG nova.virt.libvirt.host [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.450 186548 DEBUG nova.virt.libvirt.host [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.457 186548 DEBUG nova.virt.libvirt.host [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.458 186548 DEBUG nova.virt.libvirt.host [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.459 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.459 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.460 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.460 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.460 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.460 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.460 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.461 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.461 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.461 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.461 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.462 186548 DEBUG nova.virt.hardware [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.466 186548 DEBUG nova.virt.libvirt.vif [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:46:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1531729469',display_name='tempest-TestNetworkBasicOps-server-1531729469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1531729469',id=178,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFuuTOBsJaGrb6rpI9waYP65xLrhDiEtReh8kT/AeVg5nZ440ZcMMmIJ/yEjGe10IlohnCLDoGgmoVEg5ALAcah0VswlQqgcIpiL6XrMw/7xeXQGcq95c25y6jb0TwcVtg==',key_name='tempest-TestNetworkBasicOps-1584846425',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ag0hiige',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:47:02Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=a7cff8e0-bed1-44f0-bf56-3e2e92438f70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.466 186548 DEBUG nova.network.os_vif_util [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.467 186548 DEBUG nova.network.os_vif_util [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:f5,bridge_name='br-int',has_traffic_filtering=True,id=d94b746b-db17-4e24-9b5b-3e9d317b172b,network=Network(57cf6a8a-ab79-4033-be9d-b010af3d5245),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd94b746b-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.469 186548 DEBUG nova.objects.instance [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7cff8e0-bed1-44f0-bf56-3e2e92438f70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.502 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <uuid>a7cff8e0-bed1-44f0-bf56-3e2e92438f70</uuid>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <name>instance-000000b2</name>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <nova:name>tempest-TestNetworkBasicOps-server-1531729469</nova:name>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:47:11</nova:creationTime>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:47:11 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:47:11 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:47:11 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:47:11 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:47:11 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:47:11 compute-0 nova_compute[186544]:         <nova:user uuid="033a5e424a0a42afa21b67c28d79d1f4">tempest-TestNetworkBasicOps-1998778518-project-member</nova:user>
Nov 22 08:47:11 compute-0 nova_compute[186544]:         <nova:project uuid="12f63a6d87a947758ab928c0d625ff06">tempest-TestNetworkBasicOps-1998778518</nova:project>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:47:11 compute-0 nova_compute[186544]:         <nova:port uuid="d94b746b-db17-4e24-9b5b-3e9d317b172b">
Nov 22 08:47:11 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <system>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <entry name="serial">a7cff8e0-bed1-44f0-bf56-3e2e92438f70</entry>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <entry name="uuid">a7cff8e0-bed1-44f0-bf56-3e2e92438f70</entry>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </system>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <os>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   </os>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <features>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   </features>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk.config"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:57:91:f5"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <target dev="tapd94b746b-db"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/console.log" append="off"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <video>
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </video>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:47:11 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:47:11 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:47:11 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:47:11 compute-0 nova_compute[186544]: </domain>
Nov 22 08:47:11 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.503 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Preparing to wait for external event network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.504 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.504 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.505 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.506 186548 DEBUG nova.virt.libvirt.vif [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:46:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1531729469',display_name='tempest-TestNetworkBasicOps-server-1531729469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1531729469',id=178,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFuuTOBsJaGrb6rpI9waYP65xLrhDiEtReh8kT/AeVg5nZ440ZcMMmIJ/yEjGe10IlohnCLDoGgmoVEg5ALAcah0VswlQqgcIpiL6XrMw/7xeXQGcq95c25y6jb0TwcVtg==',key_name='tempest-TestNetworkBasicOps-1584846425',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ag0hiige',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:47:02Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=a7cff8e0-bed1-44f0-bf56-3e2e92438f70,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.506 186548 DEBUG nova.network.os_vif_util [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.507 186548 DEBUG nova.network.os_vif_util [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:f5,bridge_name='br-int',has_traffic_filtering=True,id=d94b746b-db17-4e24-9b5b-3e9d317b172b,network=Network(57cf6a8a-ab79-4033-be9d-b010af3d5245),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd94b746b-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.507 186548 DEBUG os_vif [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:f5,bridge_name='br-int',has_traffic_filtering=True,id=d94b746b-db17-4e24-9b5b-3e9d317b172b,network=Network(57cf6a8a-ab79-4033-be9d-b010af3d5245),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd94b746b-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.507 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.508 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.508 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.512 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.512 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd94b746b-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.513 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd94b746b-db, col_values=(('external_ids', {'iface-id': 'd94b746b-db17-4e24-9b5b-3e9d317b172b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:91:f5', 'vm-uuid': 'a7cff8e0-bed1-44f0-bf56-3e2e92438f70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.514 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:11 compute-0 NetworkManager[55036]: <info>  [1763801231.5155] manager: (tapd94b746b-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/405)
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.517 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.521 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.522 186548 INFO os_vif [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:f5,bridge_name='br-int',has_traffic_filtering=True,id=d94b746b-db17-4e24-9b5b-3e9d317b172b,network=Network(57cf6a8a-ab79-4033-be9d-b010af3d5245),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd94b746b-db')
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.755 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.815 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.816 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.816 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] No VIF found with MAC fa:16:3e:57:91:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:47:11 compute-0 nova_compute[186544]: 2025-11-22 08:47:11.816 186548 INFO nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Using config drive
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.431 186548 INFO nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Creating config drive at /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk.config
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.435 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp004ihrrt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.559 186548 DEBUG oslo_concurrency.processutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp004ihrrt" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:47:12 compute-0 kernel: tapd94b746b-db: entered promiscuous mode
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.630 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:12 compute-0 NetworkManager[55036]: <info>  [1763801232.6308] manager: (tapd94b746b-db): new Tun device (/org/freedesktop/NetworkManager/Devices/406)
Nov 22 08:47:12 compute-0 ovn_controller[94843]: 2025-11-22T08:47:12Z|00873|binding|INFO|Claiming lport d94b746b-db17-4e24-9b5b-3e9d317b172b for this chassis.
Nov 22 08:47:12 compute-0 ovn_controller[94843]: 2025-11-22T08:47:12Z|00874|binding|INFO|d94b746b-db17-4e24-9b5b-3e9d317b172b: Claiming fa:16:3e:57:91:f5 10.100.0.7
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.637 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.638 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.670 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:91:f5 10.100.0.7'], port_security=['fa:16:3e:57:91:f5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7cff8e0-bed1-44f0-bf56-3e2e92438f70', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57cf6a8a-ab79-4033-be9d-b010af3d5245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a846c07c-5040-4e4f-85a2-5806a9804a7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62a0fd5f-18f3-409d-ad4e-2a99061b21ac, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d94b746b-db17-4e24-9b5b-3e9d317b172b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.671 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d94b746b-db17-4e24-9b5b-3e9d317b172b in datapath 57cf6a8a-ab79-4033-be9d-b010af3d5245 bound to our chassis
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.672 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 57cf6a8a-ab79-4033-be9d-b010af3d5245
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.683 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b8d92d-9121-4a43-8be3-e0b1be39db7d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.684 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap57cf6a8a-a1 in ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.686 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap57cf6a8a-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.686 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f76bf5e1-b8cf-485c-84c5-2afea04471b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 systemd-machined[152872]: New machine qemu-96-instance-000000b2.
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.688 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1c5d816d-9b83-4994-b008-0e2376d26b8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 systemd-udevd[253361]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.700 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.699 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[d6984405-d0ed-4e12-8dd4-db4da06da643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_controller[94843]: 2025-11-22T08:47:12Z|00875|binding|INFO|Setting lport d94b746b-db17-4e24-9b5b-3e9d317b172b ovn-installed in OVS
Nov 22 08:47:12 compute-0 ovn_controller[94843]: 2025-11-22T08:47:12Z|00876|binding|INFO|Setting lport d94b746b-db17-4e24-9b5b-3e9d317b172b up in Southbound
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.704 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:12 compute-0 systemd[1]: Started Virtual Machine qemu-96-instance-000000b2.
Nov 22 08:47:12 compute-0 NetworkManager[55036]: <info>  [1763801232.7172] device (tapd94b746b-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:47:12 compute-0 NetworkManager[55036]: <info>  [1763801232.7179] device (tapd94b746b-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.724 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[647c482d-fe69-4192-b055-f64a90369fff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.748 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9aabbbc8-61db-4efc-bfca-e621836dc8b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.752 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e029bb7e-7c40-49a0-a254-529bb5bfa3e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 systemd-udevd[253365]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:47:12 compute-0 NetworkManager[55036]: <info>  [1763801232.7553] manager: (tap57cf6a8a-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/407)
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.788 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a9deb708-d6da-4d62-9805-d082485e4949]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.792 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b5b93b-9657-4c9e-a93c-f132713d0311]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 NetworkManager[55036]: <info>  [1763801232.8164] device (tap57cf6a8a-a0): carrier: link connected
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.822 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb456b4-78cf-4451-8deb-5146bed085e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.837 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d7882c92-6633-4b3d-94fd-f5407d7db5d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57cf6a8a-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:82:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 258], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797345, 'reachable_time': 38968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253393, 'error': None, 'target': 'ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.860 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7a2c9836-6d94-4707-b74f-c5d1ee77e631]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:8268'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 797345, 'tstamp': 797345}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253394, 'error': None, 'target': 'ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.880 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0ffd514b-c055-4d2e-b2f9-6caff0278667]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57cf6a8a-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:82:68'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 258], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797345, 'reachable_time': 38968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253395, 'error': None, 'target': 'ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.913 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c95d3902-45b1-4fd1-b27a-1d9c75ddd105]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.993 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9bfcc2f6-fbf1-40a5-86fb-75b1dd9d87c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.995 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57cf6a8a-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.996 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:47:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:12.996 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57cf6a8a-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:12 compute-0 nova_compute[186544]: 2025-11-22 08:47:12.999 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:12 compute-0 NetworkManager[55036]: <info>  [1763801232.9998] manager: (tap57cf6a8a-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Nov 22 08:47:13 compute-0 kernel: tap57cf6a8a-a0: entered promiscuous mode
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:13.002 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap57cf6a8a-a0, col_values=(('external_ids', {'iface-id': '141b2dcd-1d35-41a6-8a2d-d4f7196888b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:13 compute-0 ovn_controller[94843]: 2025-11-22T08:47:13Z|00877|binding|INFO|Releasing lport 141b2dcd-1d35-41a6-8a2d-d4f7196888b4 from this chassis (sb_readonly=0)
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.003 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.014 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:13.015 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/57cf6a8a-ab79-4033-be9d-b010af3d5245.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/57cf6a8a-ab79-4033-be9d-b010af3d5245.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:13.017 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d75c0863-76da-4bf2-9c4b-b46196b8a00a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:13.018 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-57cf6a8a-ab79-4033-be9d-b010af3d5245
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/57cf6a8a-ab79-4033-be9d-b010af3d5245.pid.haproxy
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 57cf6a8a-ab79-4033-be9d-b010af3d5245
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:47:13 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:13.019 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245', 'env', 'PROCESS_TAG=haproxy-57cf6a8a-ab79-4033-be9d-b010af3d5245', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/57cf6a8a-ab79-4033-be9d-b010af3d5245.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.118 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801233.1173463, a7cff8e0-bed1-44f0-bf56-3e2e92438f70 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.118 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] VM Started (Lifecycle Event)
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.145 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.150 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801233.1174998, a7cff8e0-bed1-44f0-bf56-3e2e92438f70 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.150 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] VM Paused (Lifecycle Event)
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.173 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.177 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.198 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:47:13 compute-0 podman[253434]: 2025-11-22 08:47:13.366952281 +0000 UTC m=+0.028533525 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.552 186548 DEBUG nova.compute.manager [req-e4f4a5c5-ddda-4cfd-beec-3e99e16c8a6e req-bfa4971f-0dfd-4a28-bd6e-dfd10fef4857 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.553 186548 DEBUG oslo_concurrency.lockutils [req-e4f4a5c5-ddda-4cfd-beec-3e99e16c8a6e req-bfa4971f-0dfd-4a28-bd6e-dfd10fef4857 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.553 186548 DEBUG oslo_concurrency.lockutils [req-e4f4a5c5-ddda-4cfd-beec-3e99e16c8a6e req-bfa4971f-0dfd-4a28-bd6e-dfd10fef4857 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.553 186548 DEBUG oslo_concurrency.lockutils [req-e4f4a5c5-ddda-4cfd-beec-3e99e16c8a6e req-bfa4971f-0dfd-4a28-bd6e-dfd10fef4857 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.553 186548 DEBUG nova.compute.manager [req-e4f4a5c5-ddda-4cfd-beec-3e99e16c8a6e req-bfa4971f-0dfd-4a28-bd6e-dfd10fef4857 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Processing event network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.554 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.559 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801233.5586202, a7cff8e0-bed1-44f0-bf56-3e2e92438f70 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.559 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] VM Resumed (Lifecycle Event)
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.561 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.568 186548 INFO nova.virt.libvirt.driver [-] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Instance spawned successfully.
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.568 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.590 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.596 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.597 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.597 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.598 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.598 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.599 186548 DEBUG nova.virt.libvirt.driver [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.603 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.628 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.841 186548 INFO nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Took 11.19 seconds to spawn the instance on the hypervisor.
Nov 22 08:47:13 compute-0 nova_compute[186544]: 2025-11-22 08:47:13.842 186548 DEBUG nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:47:14 compute-0 nova_compute[186544]: 2025-11-22 08:47:14.071 186548 INFO nova.compute.manager [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Took 13.02 seconds to build instance.
Nov 22 08:47:14 compute-0 podman[253434]: 2025-11-22 08:47:14.10896907 +0000 UTC m=+0.770550294 container create 0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:47:14 compute-0 nova_compute[186544]: 2025-11-22 08:47:14.123 186548 DEBUG oslo_concurrency.lockutils [None req-a552566d-ae90-40df-9203-f6e422fa7bdf 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:14 compute-0 systemd[1]: Started libpod-conmon-0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672.scope.
Nov 22 08:47:14 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:47:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa3268928c9460e4b1a4e5c1bb4f57e38241276cf1bab03c89e92f50772e5f2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:47:14 compute-0 nova_compute[186544]: 2025-11-22 08:47:14.360 186548 DEBUG nova.network.neutron [req-143fdb93-2b40-46cd-a14c-473a104775a5 req-6e074044-d66b-45ba-b9d4-9a848db2159f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updated VIF entry in instance network info cache for port d94b746b-db17-4e24-9b5b-3e9d317b172b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:47:14 compute-0 nova_compute[186544]: 2025-11-22 08:47:14.361 186548 DEBUG nova.network.neutron [req-143fdb93-2b40-46cd-a14c-473a104775a5 req-6e074044-d66b-45ba-b9d4-9a848db2159f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updating instance_info_cache with network_info: [{"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:47:14 compute-0 nova_compute[186544]: 2025-11-22 08:47:14.375 186548 DEBUG oslo_concurrency.lockutils [req-143fdb93-2b40-46cd-a14c-473a104775a5 req-6e074044-d66b-45ba-b9d4-9a848db2159f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:47:14 compute-0 podman[253434]: 2025-11-22 08:47:14.493128838 +0000 UTC m=+1.154710092 container init 0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:47:14 compute-0 podman[253434]: 2025-11-22 08:47:14.50089258 +0000 UTC m=+1.162473804 container start 0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:47:14 compute-0 neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245[253449]: [NOTICE]   (253453) : New worker (253455) forked
Nov 22 08:47:14 compute-0 neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245[253449]: [NOTICE]   (253453) : Loading success.
Nov 22 08:47:15 compute-0 nova_compute[186544]: 2025-11-22 08:47:15.689 186548 DEBUG nova.compute.manager [req-d306b34a-5c3f-4f59-a53c-7d862d426a2c req-e1ce7a96-fedb-4179-aab2-53c39f3f04c6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:47:15 compute-0 nova_compute[186544]: 2025-11-22 08:47:15.690 186548 DEBUG oslo_concurrency.lockutils [req-d306b34a-5c3f-4f59-a53c-7d862d426a2c req-e1ce7a96-fedb-4179-aab2-53c39f3f04c6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:15 compute-0 nova_compute[186544]: 2025-11-22 08:47:15.690 186548 DEBUG oslo_concurrency.lockutils [req-d306b34a-5c3f-4f59-a53c-7d862d426a2c req-e1ce7a96-fedb-4179-aab2-53c39f3f04c6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:15 compute-0 nova_compute[186544]: 2025-11-22 08:47:15.691 186548 DEBUG oslo_concurrency.lockutils [req-d306b34a-5c3f-4f59-a53c-7d862d426a2c req-e1ce7a96-fedb-4179-aab2-53c39f3f04c6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:15 compute-0 nova_compute[186544]: 2025-11-22 08:47:15.691 186548 DEBUG nova.compute.manager [req-d306b34a-5c3f-4f59-a53c-7d862d426a2c req-e1ce7a96-fedb-4179-aab2-53c39f3f04c6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] No waiting events found dispatching network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:47:15 compute-0 nova_compute[186544]: 2025-11-22 08:47:15.691 186548 WARNING nova.compute.manager [req-d306b34a-5c3f-4f59-a53c-7d862d426a2c req-e1ce7a96-fedb-4179-aab2-53c39f3f04c6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received unexpected event network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b for instance with vm_state active and task_state None.
Nov 22 08:47:16 compute-0 nova_compute[186544]: 2025-11-22 08:47:16.515 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:16 compute-0 nova_compute[186544]: 2025-11-22 08:47:16.759 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:18 compute-0 podman[253464]: 2025-11-22 08:47:18.41908251 +0000 UTC m=+0.065245810 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:47:18 compute-0 podman[253466]: 2025-11-22 08:47:18.426880693 +0000 UTC m=+0.062628797 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:47:18 compute-0 podman[253465]: 2025-11-22 08:47:18.437177767 +0000 UTC m=+0.082313092 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:47:18 compute-0 podman[253467]: 2025-11-22 08:47:18.464113742 +0000 UTC m=+0.096911763 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 22 08:47:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:20.371 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:47:20 compute-0 nova_compute[186544]: 2025-11-22 08:47:20.372 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:20.373 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.371 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:21 compute-0 NetworkManager[55036]: <info>  [1763801241.3728] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/409)
Nov 22 08:47:21 compute-0 NetworkManager[55036]: <info>  [1763801241.3748] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/410)
Nov 22 08:47:21 compute-0 ovn_controller[94843]: 2025-11-22T08:47:21Z|00878|binding|INFO|Releasing lport 141b2dcd-1d35-41a6-8a2d-d4f7196888b4 from this chassis (sb_readonly=0)
Nov 22 08:47:21 compute-0 ovn_controller[94843]: 2025-11-22T08:47:21Z|00879|binding|INFO|Releasing lport 141b2dcd-1d35-41a6-8a2d-d4f7196888b4 from this chassis (sb_readonly=0)
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.400 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.405 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.517 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.761 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.823 186548 DEBUG nova.compute.manager [req-d8ff62dd-5708-4568-94c3-78ed18c97ed7 req-7eeaa607-ec57-4d8f-8d10-65fde4274f0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-changed-d94b746b-db17-4e24-9b5b-3e9d317b172b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.824 186548 DEBUG nova.compute.manager [req-d8ff62dd-5708-4568-94c3-78ed18c97ed7 req-7eeaa607-ec57-4d8f-8d10-65fde4274f0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Refreshing instance network info cache due to event network-changed-d94b746b-db17-4e24-9b5b-3e9d317b172b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.824 186548 DEBUG oslo_concurrency.lockutils [req-d8ff62dd-5708-4568-94c3-78ed18c97ed7 req-7eeaa607-ec57-4d8f-8d10-65fde4274f0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.824 186548 DEBUG oslo_concurrency.lockutils [req-d8ff62dd-5708-4568-94c3-78ed18c97ed7 req-7eeaa607-ec57-4d8f-8d10-65fde4274f0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:47:21 compute-0 nova_compute[186544]: 2025-11-22 08:47:21.825 186548 DEBUG nova.network.neutron [req-d8ff62dd-5708-4568-94c3-78ed18c97ed7 req-7eeaa607-ec57-4d8f-8d10-65fde4274f0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Refreshing network info cache for port d94b746b-db17-4e24-9b5b-3e9d317b172b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:47:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:24.376 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:24 compute-0 nova_compute[186544]: 2025-11-22 08:47:24.612 186548 DEBUG nova.network.neutron [req-d8ff62dd-5708-4568-94c3-78ed18c97ed7 req-7eeaa607-ec57-4d8f-8d10-65fde4274f0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updated VIF entry in instance network info cache for port d94b746b-db17-4e24-9b5b-3e9d317b172b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:47:24 compute-0 nova_compute[186544]: 2025-11-22 08:47:24.613 186548 DEBUG nova.network.neutron [req-d8ff62dd-5708-4568-94c3-78ed18c97ed7 req-7eeaa607-ec57-4d8f-8d10-65fde4274f0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updating instance_info_cache with network_info: [{"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:47:24 compute-0 nova_compute[186544]: 2025-11-22 08:47:24.658 186548 DEBUG oslo_concurrency.lockutils [req-d8ff62dd-5708-4568-94c3-78ed18c97ed7 req-7eeaa607-ec57-4d8f-8d10-65fde4274f0e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:47:26 compute-0 nova_compute[186544]: 2025-11-22 08:47:26.519 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:26 compute-0 nova_compute[186544]: 2025-11-22 08:47:26.760 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:27 compute-0 nova_compute[186544]: 2025-11-22 08:47:27.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:27 compute-0 nova_compute[186544]: 2025-11-22 08:47:27.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:47:28 compute-0 nova_compute[186544]: 2025-11-22 08:47:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:28 compute-0 nova_compute[186544]: 2025-11-22 08:47:28.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:47:28 compute-0 nova_compute[186544]: 2025-11-22 08:47:28.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:47:28 compute-0 podman[253564]: 2025-11-22 08:47:28.399568857 +0000 UTC m=+0.051331658 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:47:28 compute-0 podman[253565]: 2025-11-22 08:47:28.414015533 +0000 UTC m=+0.062523684 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Nov 22 08:47:28 compute-0 podman[253566]: 2025-11-22 08:47:28.414124056 +0000 UTC m=+0.060725149 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:47:29 compute-0 ovn_controller[94843]: 2025-11-22T08:47:29Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:91:f5 10.100.0.7
Nov 22 08:47:29 compute-0 ovn_controller[94843]: 2025-11-22T08:47:29Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:91:f5 10.100.0.7
Nov 22 08:47:30 compute-0 nova_compute[186544]: 2025-11-22 08:47:30.297 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:47:30 compute-0 nova_compute[186544]: 2025-11-22 08:47:30.297 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:47:30 compute-0 nova_compute[186544]: 2025-11-22 08:47:30.297 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:47:30 compute-0 nova_compute[186544]: 2025-11-22 08:47:30.297 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid a7cff8e0-bed1-44f0-bf56-3e2e92438f70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:47:31 compute-0 nova_compute[186544]: 2025-11-22 08:47:31.521 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:31 compute-0 nova_compute[186544]: 2025-11-22 08:47:31.763 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.739 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updating instance_info_cache with network_info: [{"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.754 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.754 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.755 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.755 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.781 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.782 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.782 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.782 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.855 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.912 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.913 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:47:32 compute-0 nova_compute[186544]: 2025-11-22 08:47:32.971 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.121 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.123 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5509MB free_disk=73.10273742675781GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.123 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.124 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.281 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance a7cff8e0-bed1-44f0-bf56-3e2e92438f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.282 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.282 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.538 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.560 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.604 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:47:33 compute-0 nova_compute[186544]: 2025-11-22 08:47:33.605 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:36 compute-0 nova_compute[186544]: 2025-11-22 08:47:36.523 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:36 compute-0 nova_compute[186544]: 2025-11-22 08:47:36.726 186548 INFO nova.compute.manager [None req-723b04f8-cec5-4367-b06c-d02937676758 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Get console output
Nov 22 08:47:36 compute-0 nova_compute[186544]: 2025-11-22 08:47:36.733 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:47:36 compute-0 nova_compute[186544]: 2025-11-22 08:47:36.766 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:37.377 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:37.378 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:37.378 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:37 compute-0 ovn_controller[94843]: 2025-11-22T08:47:37Z|00880|binding|INFO|Releasing lport 141b2dcd-1d35-41a6-8a2d-d4f7196888b4 from this chassis (sb_readonly=0)
Nov 22 08:47:37 compute-0 nova_compute[186544]: 2025-11-22 08:47:37.733 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:37 compute-0 ovn_controller[94843]: 2025-11-22T08:47:37Z|00881|binding|INFO|Releasing lport 141b2dcd-1d35-41a6-8a2d-d4f7196888b4 from this chassis (sb_readonly=0)
Nov 22 08:47:37 compute-0 nova_compute[186544]: 2025-11-22 08:47:37.783 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:38 compute-0 nova_compute[186544]: 2025-11-22 08:47:38.985 186548 INFO nova.compute.manager [None req-004042bb-12fc-4ed9-ae1f-e48f300d0661 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Get console output
Nov 22 08:47:38 compute-0 nova_compute[186544]: 2025-11-22 08:47:38.991 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:47:39 compute-0 nova_compute[186544]: 2025-11-22 08:47:39.012 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:39 compute-0 nova_compute[186544]: 2025-11-22 08:47:39.013 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:39 compute-0 nova_compute[186544]: 2025-11-22 08:47:39.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:40 compute-0 nova_compute[186544]: 2025-11-22 08:47:40.301 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:40 compute-0 NetworkManager[55036]: <info>  [1763801260.3021] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/411)
Nov 22 08:47:40 compute-0 NetworkManager[55036]: <info>  [1763801260.3026] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/412)
Nov 22 08:47:40 compute-0 ovn_controller[94843]: 2025-11-22T08:47:40Z|00882|binding|INFO|Releasing lport 141b2dcd-1d35-41a6-8a2d-d4f7196888b4 from this chassis (sb_readonly=0)
Nov 22 08:47:40 compute-0 nova_compute[186544]: 2025-11-22 08:47:40.348 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:40 compute-0 nova_compute[186544]: 2025-11-22 08:47:40.358 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:40 compute-0 nova_compute[186544]: 2025-11-22 08:47:40.855 186548 INFO nova.compute.manager [None req-37c02f59-eb12-4177-a6f9-f29b0eda1341 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Get console output
Nov 22 08:47:40 compute-0 nova_compute[186544]: 2025-11-22 08:47:40.860 213059 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 22 08:47:41 compute-0 nova_compute[186544]: 2025-11-22 08:47:41.525 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:41 compute-0 nova_compute[186544]: 2025-11-22 08:47:41.768 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.025 186548 DEBUG nova.compute.manager [req-67553f9e-dc7b-4298-9203-b846a5e113a2 req-91dbefb2-8657-45a9-aa56-2d5d1e0255c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-changed-d94b746b-db17-4e24-9b5b-3e9d317b172b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.025 186548 DEBUG nova.compute.manager [req-67553f9e-dc7b-4298-9203-b846a5e113a2 req-91dbefb2-8657-45a9-aa56-2d5d1e0255c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Refreshing instance network info cache due to event network-changed-d94b746b-db17-4e24-9b5b-3e9d317b172b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.026 186548 DEBUG oslo_concurrency.lockutils [req-67553f9e-dc7b-4298-9203-b846a5e113a2 req-91dbefb2-8657-45a9-aa56-2d5d1e0255c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.026 186548 DEBUG oslo_concurrency.lockutils [req-67553f9e-dc7b-4298-9203-b846a5e113a2 req-91dbefb2-8657-45a9-aa56-2d5d1e0255c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.026 186548 DEBUG nova.network.neutron [req-67553f9e-dc7b-4298-9203-b846a5e113a2 req-91dbefb2-8657-45a9-aa56-2d5d1e0255c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Refreshing network info cache for port d94b746b-db17-4e24-9b5b-3e9d317b172b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.119 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.119 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.120 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.120 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.120 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.128 186548 INFO nova.compute.manager [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Terminating instance
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.134 186548 DEBUG nova.compute.manager [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:47:42 compute-0 kernel: tapd94b746b-db (unregistering): left promiscuous mode
Nov 22 08:47:42 compute-0 NetworkManager[55036]: <info>  [1763801262.1529] device (tapd94b746b-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 ovn_controller[94843]: 2025-11-22T08:47:42Z|00883|binding|INFO|Releasing lport d94b746b-db17-4e24-9b5b-3e9d317b172b from this chassis (sb_readonly=0)
Nov 22 08:47:42 compute-0 ovn_controller[94843]: 2025-11-22T08:47:42Z|00884|binding|INFO|Setting lport d94b746b-db17-4e24-9b5b-3e9d317b172b down in Southbound
Nov 22 08:47:42 compute-0 ovn_controller[94843]: 2025-11-22T08:47:42Z|00885|binding|INFO|Removing iface tapd94b746b-db ovn-installed in OVS
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.166 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.179 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:91:f5 10.100.0.7'], port_security=['fa:16:3e:57:91:f5 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'a7cff8e0-bed1-44f0-bf56-3e2e92438f70', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57cf6a8a-ab79-4033-be9d-b010af3d5245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '12f63a6d87a947758ab928c0d625ff06', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a846c07c-5040-4e4f-85a2-5806a9804a7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62a0fd5f-18f3-409d-ad4e-2a99061b21ac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d94b746b-db17-4e24-9b5b-3e9d317b172b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.180 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d94b746b-db17-4e24-9b5b-3e9d317b172b in datapath 57cf6a8a-ab79-4033-be9d-b010af3d5245 unbound from our chassis
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.181 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 57cf6a8a-ab79-4033-be9d-b010af3d5245, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.183 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab35041-fc2a-419a-8558-65e69e6a1354]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.183 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245 namespace which is not needed anymore
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.185 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000b2.scope: Deactivated successfully.
Nov 22 08:47:42 compute-0 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000b2.scope: Consumed 15.866s CPU time.
Nov 22 08:47:42 compute-0 systemd-machined[152872]: Machine qemu-96-instance-000000b2 terminated.
Nov 22 08:47:42 compute-0 neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245[253449]: [NOTICE]   (253453) : haproxy version is 2.8.14-c23fe91
Nov 22 08:47:42 compute-0 neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245[253449]: [NOTICE]   (253453) : path to executable is /usr/sbin/haproxy
Nov 22 08:47:42 compute-0 neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245[253449]: [WARNING]  (253453) : Exiting Master process...
Nov 22 08:47:42 compute-0 neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245[253449]: [ALERT]    (253453) : Current worker (253455) exited with code 143 (Terminated)
Nov 22 08:47:42 compute-0 neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245[253449]: [WARNING]  (253453) : All workers exited. Exiting... (0)
Nov 22 08:47:42 compute-0 systemd[1]: libpod-0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672.scope: Deactivated successfully.
Nov 22 08:47:42 compute-0 podman[253663]: 2025-11-22 08:47:42.318800811 +0000 UTC m=+0.051923562 container died 0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 08:47:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672-userdata-shm.mount: Deactivated successfully.
Nov 22 08:47:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-fa3268928c9460e4b1a4e5c1bb4f57e38241276cf1bab03c89e92f50772e5f2c-merged.mount: Deactivated successfully.
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.359 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.364 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 podman[253663]: 2025-11-22 08:47:42.401087982 +0000 UTC m=+0.134210743 container cleanup 0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.403 186548 INFO nova.virt.libvirt.driver [-] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Instance destroyed successfully.
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.404 186548 DEBUG nova.objects.instance [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lazy-loading 'resources' on Instance uuid a7cff8e0-bed1-44f0-bf56-3e2e92438f70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:47:42 compute-0 systemd[1]: libpod-conmon-0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672.scope: Deactivated successfully.
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.431 186548 DEBUG nova.virt.libvirt.vif [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:46:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1531729469',display_name='tempest-TestNetworkBasicOps-server-1531729469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1531729469',id=178,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFuuTOBsJaGrb6rpI9waYP65xLrhDiEtReh8kT/AeVg5nZ440ZcMMmIJ/yEjGe10IlohnCLDoGgmoVEg5ALAcah0VswlQqgcIpiL6XrMw/7xeXQGcq95c25y6jb0TwcVtg==',key_name='tempest-TestNetworkBasicOps-1584846425',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:47:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='12f63a6d87a947758ab928c0d625ff06',ramdisk_id='',reservation_id='r-ag0hiige',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1998778518',owner_user_name='tempest-TestNetworkBasicOps-1998778518-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:47:13Z,user_data=None,user_id='033a5e424a0a42afa21b67c28d79d1f4',uuid=a7cff8e0-bed1-44f0-bf56-3e2e92438f70,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.432 186548 DEBUG nova.network.os_vif_util [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converting VIF {"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.432 186548 DEBUG nova.network.os_vif_util [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:91:f5,bridge_name='br-int',has_traffic_filtering=True,id=d94b746b-db17-4e24-9b5b-3e9d317b172b,network=Network(57cf6a8a-ab79-4033-be9d-b010af3d5245),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd94b746b-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.433 186548 DEBUG os_vif [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:91:f5,bridge_name='br-int',has_traffic_filtering=True,id=d94b746b-db17-4e24-9b5b-3e9d317b172b,network=Network(57cf6a8a-ab79-4033-be9d-b010af3d5245),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd94b746b-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.434 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.435 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94b746b-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.436 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.438 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.438 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.440 186548 INFO os_vif [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:91:f5,bridge_name='br-int',has_traffic_filtering=True,id=d94b746b-db17-4e24-9b5b-3e9d317b172b,network=Network(57cf6a8a-ab79-4033-be9d-b010af3d5245),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd94b746b-db')
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.441 186548 INFO nova.virt.libvirt.driver [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Deleting instance files /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70_del
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.441 186548 INFO nova.virt.libvirt.driver [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Deletion of /var/lib/nova/instances/a7cff8e0-bed1-44f0-bf56-3e2e92438f70_del complete
Nov 22 08:47:42 compute-0 podman[253706]: 2025-11-22 08:47:42.482601493 +0000 UTC m=+0.053970113 container remove 0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.488 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8888d7e1-c300-4140-a289-20bc1ddc10c6]: (4, ('Sat Nov 22 08:47:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245 (0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672)\n0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672\nSat Nov 22 08:47:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245 (0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672)\n0028252800f9cfa48821e6d24711b6544a376bc5724281af32b34e14ab223672\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.490 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c421dd16-5844-434f-836f-39ad368bd66c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.491 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57cf6a8a-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.492 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 kernel: tap57cf6a8a-a0: left promiscuous mode
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.504 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.506 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2a83f57c-0208-4fef-b704-1cd34d5be384]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.521 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b082d9af-5001-4efc-9cc8-6f89b2eefd5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.523 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[df6b8080-4b8d-4ded-82d2-b676b70c7d0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.536 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2c914bc8-1bf8-4d35-b6ff-63916bd2c610]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797338, 'reachable_time': 21733, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253721, 'error': None, 'target': 'ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.539 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-57cf6a8a-ab79-4033-be9d-b010af3d5245 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:47:42 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:42.539 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[77398710-53bf-4359-b44a-c05919ad477c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:47:42 compute-0 systemd[1]: run-netns-ovnmeta\x2d57cf6a8a\x2dab79\x2d4033\x2dbe9d\x2db010af3d5245.mount: Deactivated successfully.
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.546 186548 INFO nova.compute.manager [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.547 186548 DEBUG oslo.service.loopingcall [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.547 186548 DEBUG nova.compute.manager [-] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:47:42 compute-0 nova_compute[186544]: 2025-11-22 08:47:42.547 186548 DEBUG nova.network.neutron [-] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:47:43 compute-0 nova_compute[186544]: 2025-11-22 08:47:43.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:45 compute-0 nova_compute[186544]: 2025-11-22 08:47:45.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:47:45 compute-0 nova_compute[186544]: 2025-11-22 08:47:45.495 186548 DEBUG nova.compute.manager [req-65171afc-bb28-49c0-96a5-862095cf32e2 req-a46ac536-eba5-4cc0-bb3c-4d1677bfe02b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-vif-unplugged-d94b746b-db17-4e24-9b5b-3e9d317b172b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:47:45 compute-0 nova_compute[186544]: 2025-11-22 08:47:45.495 186548 DEBUG oslo_concurrency.lockutils [req-65171afc-bb28-49c0-96a5-862095cf32e2 req-a46ac536-eba5-4cc0-bb3c-4d1677bfe02b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:45 compute-0 nova_compute[186544]: 2025-11-22 08:47:45.496 186548 DEBUG oslo_concurrency.lockutils [req-65171afc-bb28-49c0-96a5-862095cf32e2 req-a46ac536-eba5-4cc0-bb3c-4d1677bfe02b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:45 compute-0 nova_compute[186544]: 2025-11-22 08:47:45.497 186548 DEBUG oslo_concurrency.lockutils [req-65171afc-bb28-49c0-96a5-862095cf32e2 req-a46ac536-eba5-4cc0-bb3c-4d1677bfe02b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:45 compute-0 nova_compute[186544]: 2025-11-22 08:47:45.497 186548 DEBUG nova.compute.manager [req-65171afc-bb28-49c0-96a5-862095cf32e2 req-a46ac536-eba5-4cc0-bb3c-4d1677bfe02b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] No waiting events found dispatching network-vif-unplugged-d94b746b-db17-4e24-9b5b-3e9d317b172b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:47:45 compute-0 nova_compute[186544]: 2025-11-22 08:47:45.497 186548 DEBUG nova.compute.manager [req-65171afc-bb28-49c0-96a5-862095cf32e2 req-a46ac536-eba5-4cc0-bb3c-4d1677bfe02b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-vif-unplugged-d94b746b-db17-4e24-9b5b-3e9d317b172b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:47:46 compute-0 nova_compute[186544]: 2025-11-22 08:47:46.771 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:47 compute-0 nova_compute[186544]: 2025-11-22 08:47:47.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.481 186548 DEBUG nova.network.neutron [req-67553f9e-dc7b-4298-9203-b846a5e113a2 req-91dbefb2-8657-45a9-aa56-2d5d1e0255c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updated VIF entry in instance network info cache for port d94b746b-db17-4e24-9b5b-3e9d317b172b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.482 186548 DEBUG nova.network.neutron [req-67553f9e-dc7b-4298-9203-b846a5e113a2 req-91dbefb2-8657-45a9-aa56-2d5d1e0255c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updating instance_info_cache with network_info: [{"id": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "address": "fa:16:3e:57:91:f5", "network": {"id": "57cf6a8a-ab79-4033-be9d-b010af3d5245", "bridge": "br-int", "label": "tempest-network-smoke--430375971", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "12f63a6d87a947758ab928c0d625ff06", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd94b746b-db", "ovs_interfaceid": "d94b746b-db17-4e24-9b5b-3e9d317b172b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.537 186548 DEBUG oslo_concurrency.lockutils [req-67553f9e-dc7b-4298-9203-b846a5e113a2 req-91dbefb2-8657-45a9-aa56-2d5d1e0255c3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-a7cff8e0-bed1-44f0-bf56-3e2e92438f70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.626 186548 DEBUG nova.network.neutron [-] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.654 186548 INFO nova.compute.manager [-] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Took 6.11 seconds to deallocate network for instance.
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.782 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.782 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.797 186548 DEBUG nova.compute.manager [req-bdab2e69-a584-48c3-85cf-58b2d2c2a493 req-74a2a4cf-19c4-44de-adf1-8cde8ee05809 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.798 186548 DEBUG oslo_concurrency.lockutils [req-bdab2e69-a584-48c3-85cf-58b2d2c2a493 req-74a2a4cf-19c4-44de-adf1-8cde8ee05809 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.798 186548 DEBUG oslo_concurrency.lockutils [req-bdab2e69-a584-48c3-85cf-58b2d2c2a493 req-74a2a4cf-19c4-44de-adf1-8cde8ee05809 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.798 186548 DEBUG oslo_concurrency.lockutils [req-bdab2e69-a584-48c3-85cf-58b2d2c2a493 req-74a2a4cf-19c4-44de-adf1-8cde8ee05809 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.798 186548 DEBUG nova.compute.manager [req-bdab2e69-a584-48c3-85cf-58b2d2c2a493 req-74a2a4cf-19c4-44de-adf1-8cde8ee05809 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] No waiting events found dispatching network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.799 186548 WARNING nova.compute.manager [req-bdab2e69-a584-48c3-85cf-58b2d2c2a493 req-74a2a4cf-19c4-44de-adf1-8cde8ee05809 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received unexpected event network-vif-plugged-d94b746b-db17-4e24-9b5b-3e9d317b172b for instance with vm_state deleted and task_state None.
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.875 186548 DEBUG nova.compute.provider_tree [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.892 186548 DEBUG nova.scheduler.client.report [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.920 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:48 compute-0 nova_compute[186544]: 2025-11-22 08:47:48.997 186548 INFO nova.scheduler.client.report [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Deleted allocations for instance a7cff8e0-bed1-44f0-bf56-3e2e92438f70
Nov 22 08:47:49 compute-0 nova_compute[186544]: 2025-11-22 08:47:49.109 186548 DEBUG oslo_concurrency.lockutils [None req-53a12f8f-7c61-4208-b70a-7099e3023072 033a5e424a0a42afa21b67c28d79d1f4 12f63a6d87a947758ab928c0d625ff06 - - default default] Lock "a7cff8e0-bed1-44f0-bf56-3e2e92438f70" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:47:49 compute-0 podman[253722]: 2025-11-22 08:47:49.971224242 +0000 UTC m=+0.063477477 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 08:47:49 compute-0 podman[253724]: 2025-11-22 08:47:49.983851384 +0000 UTC m=+0.065663610 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:47:50 compute-0 podman[253723]: 2025-11-22 08:47:50.008318608 +0000 UTC m=+0.087018048 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:47:50 compute-0 podman[253725]: 2025-11-22 08:47:50.008456391 +0000 UTC m=+0.089759305 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:47:51 compute-0 nova_compute[186544]: 2025-11-22 08:47:51.020 186548 DEBUG nova.compute.manager [req-fd3a18cd-a923-4011-935f-357684e01cda req-1e024710-b6cc-46cc-b357-9eeb3035bc9f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Received event network-vif-deleted-d94b746b-db17-4e24-9b5b-3e9d317b172b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:47:51 compute-0 nova_compute[186544]: 2025-11-22 08:47:51.772 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:52 compute-0 nova_compute[186544]: 2025-11-22 08:47:52.439 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:56 compute-0 nova_compute[186544]: 2025-11-22 08:47:56.774 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:57 compute-0 nova_compute[186544]: 2025-11-22 08:47:57.399 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801262.396774, a7cff8e0-bed1-44f0-bf56-3e2e92438f70 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:47:57 compute-0 nova_compute[186544]: 2025-11-22 08:47:57.399 186548 INFO nova.compute.manager [-] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] VM Stopped (Lifecycle Event)
Nov 22 08:47:57 compute-0 nova_compute[186544]: 2025-11-22 08:47:57.421 186548 DEBUG nova.compute.manager [None req-b13a7fc3-6e11-48b6-947c-0038167a044a - - - - - -] [instance: a7cff8e0-bed1-44f0-bf56-3e2e92438f70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:47:57 compute-0 nova_compute[186544]: 2025-11-22 08:47:57.441 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:59 compute-0 podman[253811]: 2025-11-22 08:47:59.405051512 +0000 UTC m=+0.054771393 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:47:59 compute-0 podman[253813]: 2025-11-22 08:47:59.420049092 +0000 UTC m=+0.061427177 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 22 08:47:59 compute-0 podman[253812]: 2025-11-22 08:47:59.434432947 +0000 UTC m=+0.081247316 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Nov 22 08:47:59 compute-0 nova_compute[186544]: 2025-11-22 08:47:59.469 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:47:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:59.470 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:47:59 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:47:59.471 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:48:01 compute-0 nova_compute[186544]: 2025-11-22 08:48:01.688 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:01 compute-0 nova_compute[186544]: 2025-11-22 08:48:01.754 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:01 compute-0 nova_compute[186544]: 2025-11-22 08:48:01.776 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:02 compute-0 nova_compute[186544]: 2025-11-22 08:48:02.443 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:05 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:48:05.474 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:48:06 compute-0 nova_compute[186544]: 2025-11-22 08:48:06.777 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:07 compute-0 nova_compute[186544]: 2025-11-22 08:48:07.446 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:11 compute-0 nova_compute[186544]: 2025-11-22 08:48:11.778 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:12 compute-0 nova_compute[186544]: 2025-11-22 08:48:12.448 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:16 compute-0 nova_compute[186544]: 2025-11-22 08:48:16.779 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:17 compute-0 nova_compute[186544]: 2025-11-22 08:48:17.450 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:20 compute-0 podman[253876]: 2025-11-22 08:48:20.451287122 +0000 UTC m=+0.097374984 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:48:20 compute-0 podman[253875]: 2025-11-22 08:48:20.457973906 +0000 UTC m=+0.105628417 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 22 08:48:20 compute-0 podman[253877]: 2025-11-22 08:48:20.466061586 +0000 UTC m=+0.105059393 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:48:20 compute-0 podman[253878]: 2025-11-22 08:48:20.496471906 +0000 UTC m=+0.135089904 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:48:21 compute-0 nova_compute[186544]: 2025-11-22 08:48:21.781 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:22 compute-0 nova_compute[186544]: 2025-11-22 08:48:22.451 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:26 compute-0 nova_compute[186544]: 2025-11-22 08:48:26.782 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:27 compute-0 nova_compute[186544]: 2025-11-22 08:48:27.453 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:28 compute-0 nova_compute[186544]: 2025-11-22 08:48:28.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:28 compute-0 nova_compute[186544]: 2025-11-22 08:48:28.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:48:29 compute-0 nova_compute[186544]: 2025-11-22 08:48:29.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:29 compute-0 nova_compute[186544]: 2025-11-22 08:48:29.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:48:29 compute-0 nova_compute[186544]: 2025-11-22 08:48:29.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:48:29 compute-0 nova_compute[186544]: 2025-11-22 08:48:29.215 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:48:30 compute-0 podman[253963]: 2025-11-22 08:48:30.393998996 +0000 UTC m=+0.044147861 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:48:30 compute-0 podman[253965]: 2025-11-22 08:48:30.408206876 +0000 UTC m=+0.051200045 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:48:30 compute-0 podman[253964]: 2025-11-22 08:48:30.430070846 +0000 UTC m=+0.076479408 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:48:31 compute-0 nova_compute[186544]: 2025-11-22 08:48:31.784 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.191 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.338 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.339 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.13182067871094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.339 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.340 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.424 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.424 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.456 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.477 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.525 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.560 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:48:32 compute-0 nova_compute[186544]: 2025-11-22 08:48:32.560 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:48:34 compute-0 nova_compute[186544]: 2025-11-22 08:48:34.561 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:48:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:48:36 compute-0 nova_compute[186544]: 2025-11-22 08:48:36.785 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:37 compute-0 nova_compute[186544]: 2025-11-22 08:48:37.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:48:37.379 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:48:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:48:37.380 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:48:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:48:37.380 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:48:37 compute-0 nova_compute[186544]: 2025-11-22 08:48:37.459 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:40 compute-0 nova_compute[186544]: 2025-11-22 08:48:40.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:40 compute-0 nova_compute[186544]: 2025-11-22 08:48:40.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:48:40.454 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:48:40 compute-0 nova_compute[186544]: 2025-11-22 08:48:40.455 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:40 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:48:40.455 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:48:41 compute-0 nova_compute[186544]: 2025-11-22 08:48:41.786 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:42 compute-0 nova_compute[186544]: 2025-11-22 08:48:42.461 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:43 compute-0 nova_compute[186544]: 2025-11-22 08:48:43.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:43 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:48:43.457 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:48:45 compute-0 nova_compute[186544]: 2025-11-22 08:48:45.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:46 compute-0 nova_compute[186544]: 2025-11-22 08:48:46.788 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:47 compute-0 nova_compute[186544]: 2025-11-22 08:48:47.463 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:51 compute-0 podman[254028]: 2025-11-22 08:48:51.437984351 +0000 UTC m=+0.082412674 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 08:48:51 compute-0 podman[254030]: 2025-11-22 08:48:51.440108223 +0000 UTC m=+0.057593462 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:48:51 compute-0 podman[254029]: 2025-11-22 08:48:51.465314195 +0000 UTC m=+0.086828573 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:48:51 compute-0 podman[254036]: 2025-11-22 08:48:51.487065522 +0000 UTC m=+0.100933891 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:48:51 compute-0 nova_compute[186544]: 2025-11-22 08:48:51.789 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:52 compute-0 nova_compute[186544]: 2025-11-22 08:48:52.465 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:54 compute-0 nova_compute[186544]: 2025-11-22 08:48:54.160 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:48:56 compute-0 nova_compute[186544]: 2025-11-22 08:48:56.791 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:48:57 compute-0 nova_compute[186544]: 2025-11-22 08:48:57.470 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:01 compute-0 podman[254112]: 2025-11-22 08:49:01.392067566 +0000 UTC m=+0.044470748 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:49:01 compute-0 podman[254114]: 2025-11-22 08:49:01.409418125 +0000 UTC m=+0.055630914 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:49:01 compute-0 podman[254113]: 2025-11-22 08:49:01.409855016 +0000 UTC m=+0.059195642 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Nov 22 08:49:01 compute-0 nova_compute[186544]: 2025-11-22 08:49:01.793 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:02 compute-0 nova_compute[186544]: 2025-11-22 08:49:02.473 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:06 compute-0 nova_compute[186544]: 2025-11-22 08:49:06.795 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:07 compute-0 nova_compute[186544]: 2025-11-22 08:49:07.475 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:10 compute-0 ovn_controller[94843]: 2025-11-22T08:49:10Z|00886|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 22 08:49:11 compute-0 nova_compute[186544]: 2025-11-22 08:49:11.798 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:12 compute-0 nova_compute[186544]: 2025-11-22 08:49:12.478 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:16 compute-0 nova_compute[186544]: 2025-11-22 08:49:16.800 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:17 compute-0 nova_compute[186544]: 2025-11-22 08:49:17.480 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:21 compute-0 nova_compute[186544]: 2025-11-22 08:49:21.800 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:22 compute-0 podman[254176]: 2025-11-22 08:49:22.410534763 +0000 UTC m=+0.055693926 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 08:49:22 compute-0 podman[254174]: 2025-11-22 08:49:22.417863643 +0000 UTC m=+0.068412299 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:49:22 compute-0 podman[254175]: 2025-11-22 08:49:22.432633588 +0000 UTC m=+0.081638236 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 08:49:22 compute-0 podman[254177]: 2025-11-22 08:49:22.445496605 +0000 UTC m=+0.086483015 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:49:22 compute-0 nova_compute[186544]: 2025-11-22 08:49:22.482 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:26 compute-0 nova_compute[186544]: 2025-11-22 08:49:26.802 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:27 compute-0 nova_compute[186544]: 2025-11-22 08:49:27.484 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:30 compute-0 nova_compute[186544]: 2025-11-22 08:49:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:30 compute-0 nova_compute[186544]: 2025-11-22 08:49:30.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:49:30 compute-0 nova_compute[186544]: 2025-11-22 08:49:30.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:49:30 compute-0 nova_compute[186544]: 2025-11-22 08:49:30.176 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:49:30 compute-0 nova_compute[186544]: 2025-11-22 08:49:30.176 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:30 compute-0 nova_compute[186544]: 2025-11-22 08:49:30.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:49:31 compute-0 nova_compute[186544]: 2025-11-22 08:49:31.804 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.197 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.349 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.350 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5713MB free_disk=73.13182067871094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.351 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.351 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:32 compute-0 podman[254256]: 2025-11-22 08:49:32.393567613 +0000 UTC m=+0.046535499 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.404 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.405 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:49:32 compute-0 podman[254257]: 2025-11-22 08:49:32.410413128 +0000 UTC m=+0.058893604 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, managed_by=edpm_ansible)
Nov 22 08:49:32 compute-0 podman[254258]: 2025-11-22 08:49:32.414655163 +0000 UTC m=+0.059116199 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.425 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.436 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.437 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.438 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:32 compute-0 nova_compute[186544]: 2025-11-22 08:49:32.487 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:34.409 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:49:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:34.410 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:49:34 compute-0 nova_compute[186544]: 2025-11-22 08:49:34.409 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:35.412 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:49:36 compute-0 nova_compute[186544]: 2025-11-22 08:49:36.438 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:36 compute-0 nova_compute[186544]: 2025-11-22 08:49:36.806 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:37.381 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:37.381 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:37.381 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:37 compute-0 nova_compute[186544]: 2025-11-22 08:49:37.489 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:38 compute-0 nova_compute[186544]: 2025-11-22 08:49:38.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:40 compute-0 nova_compute[186544]: 2025-11-22 08:49:40.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:40 compute-0 nova_compute[186544]: 2025-11-22 08:49:40.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:41 compute-0 nova_compute[186544]: 2025-11-22 08:49:41.809 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:42 compute-0 nova_compute[186544]: 2025-11-22 08:49:42.491 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.154 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "9a83695e-0d5f-4415-9a37-f3de81a77591" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.155 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.174 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.320 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.321 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.326 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.327 186548 INFO nova.compute.claims [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.417 186548 DEBUG nova.compute.provider_tree [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.430 186548 DEBUG nova.scheduler.client.report [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.453 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.453 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.528 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.529 186548 DEBUG nova.network.neutron [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.554 186548 INFO nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.607 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.713 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.714 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.714 186548 INFO nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Creating image(s)
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.715 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "/var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.715 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "/var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.716 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "/var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.729 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.801 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.802 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.802 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.814 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.868 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.869 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.929 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.930 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.930 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.988 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.989 186548 DEBUG nova.virt.disk.api [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Checking if we can resize image /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:49:44 compute-0 nova_compute[186544]: 2025-11-22 08:49:44.990 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.050 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.051 186548 DEBUG nova.virt.disk.api [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Cannot resize image /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.051 186548 DEBUG nova.objects.instance [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lazy-loading 'migration_context' on Instance uuid 9a83695e-0d5f-4415-9a37-f3de81a77591 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.059 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.060 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Ensure instance console log exists: /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.060 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.060 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.060 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:45 compute-0 nova_compute[186544]: 2025-11-22 08:49:45.433 186548 DEBUG nova.policy [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a89106859f3942e3ac8b97f7b8a093fe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f996d3c331cb4615815a7a9d118ecd66', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:49:46 compute-0 nova_compute[186544]: 2025-11-22 08:49:46.498 186548 DEBUG nova.network.neutron [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Successfully created port: 01ba516c-9f4e-405d-9048-83ab7c320a3a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:49:46 compute-0 nova_compute[186544]: 2025-11-22 08:49:46.811 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.173 186548 DEBUG nova.network.neutron [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Successfully updated port: 01ba516c-9f4e-405d-9048-83ab7c320a3a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.215 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.216 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquired lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.216 186548 DEBUG nova.network.neutron [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.263 186548 DEBUG nova.compute.manager [req-d45ad547-28f5-412f-96ee-34e83029a69b req-0abcb582-4dac-4141-a046-f6420299684c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-changed-01ba516c-9f4e-405d-9048-83ab7c320a3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.264 186548 DEBUG nova.compute.manager [req-d45ad547-28f5-412f-96ee-34e83029a69b req-0abcb582-4dac-4141-a046-f6420299684c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Refreshing instance network info cache due to event network-changed-01ba516c-9f4e-405d-9048-83ab7c320a3a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.264 186548 DEBUG oslo_concurrency.lockutils [req-d45ad547-28f5-412f-96ee-34e83029a69b req-0abcb582-4dac-4141-a046-f6420299684c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.336 186548 DEBUG nova.network.neutron [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:49:47 compute-0 nova_compute[186544]: 2025-11-22 08:49:47.494 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.642 186548 DEBUG nova.network.neutron [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Updating instance_info_cache with network_info: [{"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.656 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Releasing lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.656 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Instance network_info: |[{"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.657 186548 DEBUG oslo_concurrency.lockutils [req-d45ad547-28f5-412f-96ee-34e83029a69b req-0abcb582-4dac-4141-a046-f6420299684c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.657 186548 DEBUG nova.network.neutron [req-d45ad547-28f5-412f-96ee-34e83029a69b req-0abcb582-4dac-4141-a046-f6420299684c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Refreshing network info cache for port 01ba516c-9f4e-405d-9048-83ab7c320a3a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.660 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Start _get_guest_xml network_info=[{"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.663 186548 WARNING nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.667 186548 DEBUG nova.virt.libvirt.host [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.668 186548 DEBUG nova.virt.libvirt.host [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.673 186548 DEBUG nova.virt.libvirt.host [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.674 186548 DEBUG nova.virt.libvirt.host [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.675 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.675 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.676 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.676 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.676 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.676 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.677 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.677 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.677 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.677 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.678 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.678 186548 DEBUG nova.virt.hardware [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.683 186548 DEBUG nova.virt.libvirt.vif [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:49:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1476424987-access_point-1602015144',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1476424987-access_point-1602015144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1476424987-ac',id=180,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCr7NF5co5IMPbVlvNNXaWEcV/TMK4gRym1IU+xrFFYbHfyyuO2kK1tTqQYmQ/RTGQJ8JuSs83cgnhJoMhrhV2PVCH3lEoOzIowFbWiaRx38gwHps/13xxUrEJrxPx8nOg==',key_name='tempest-TestSecurityGroupsBasicOps-1968243366',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f996d3c331cb4615815a7a9d118ecd66',ramdisk_id='',reservation_id='r-osxdn1hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1476424987',owner_user_name='tempest-TestSecurityGroupsBasicOps-1476424987-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:49:44Z,user_data=None,user_id='a89106859f3942e3ac8b97f7b8a093fe',uuid=9a83695e-0d5f-4415-9a37-f3de81a77591,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.683 186548 DEBUG nova.network.os_vif_util [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Converting VIF {"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.684 186548 DEBUG nova.network.os_vif_util [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:b7:d0,bridge_name='br-int',has_traffic_filtering=True,id=01ba516c-9f4e-405d-9048-83ab7c320a3a,network=Network(97a92abf-e880-4df3-8778-f25bbdc3abc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ba516c-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.685 186548 DEBUG nova.objects.instance [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9a83695e-0d5f-4415-9a37-f3de81a77591 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.698 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <uuid>9a83695e-0d5f-4415-9a37-f3de81a77591</uuid>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <name>instance-000000b4</name>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1476424987-access_point-1602015144</nova:name>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:49:48</nova:creationTime>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:49:48 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:49:48 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:49:48 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:49:48 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:49:48 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:49:48 compute-0 nova_compute[186544]:         <nova:user uuid="a89106859f3942e3ac8b97f7b8a093fe">tempest-TestSecurityGroupsBasicOps-1476424987-project-member</nova:user>
Nov 22 08:49:48 compute-0 nova_compute[186544]:         <nova:project uuid="f996d3c331cb4615815a7a9d118ecd66">tempest-TestSecurityGroupsBasicOps-1476424987</nova:project>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:49:48 compute-0 nova_compute[186544]:         <nova:port uuid="01ba516c-9f4e-405d-9048-83ab7c320a3a">
Nov 22 08:49:48 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <system>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <entry name="serial">9a83695e-0d5f-4415-9a37-f3de81a77591</entry>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <entry name="uuid">9a83695e-0d5f-4415-9a37-f3de81a77591</entry>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </system>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <os>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   </os>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <features>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   </features>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk.config"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:b7:b7:d0"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <target dev="tap01ba516c-9f"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/console.log" append="off"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <video>
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </video>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:49:48 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:49:48 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:49:48 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:49:48 compute-0 nova_compute[186544]: </domain>
Nov 22 08:49:48 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.700 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Preparing to wait for external event network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.701 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.701 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.701 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.702 186548 DEBUG nova.virt.libvirt.vif [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:49:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1476424987-access_point-1602015144',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1476424987-access_point-1602015144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1476424987-ac',id=180,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCr7NF5co5IMPbVlvNNXaWEcV/TMK4gRym1IU+xrFFYbHfyyuO2kK1tTqQYmQ/RTGQJ8JuSs83cgnhJoMhrhV2PVCH3lEoOzIowFbWiaRx38gwHps/13xxUrEJrxPx8nOg==',key_name='tempest-TestSecurityGroupsBasicOps-1968243366',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f996d3c331cb4615815a7a9d118ecd66',ramdisk_id='',reservation_id='r-osxdn1hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1476424987',owner_user_name='tempest-TestSecurityGroupsBasicOps-1476424987-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:49:44Z,user_data=None,user_id='a89106859f3942e3ac8b97f7b8a093fe',uuid=9a83695e-0d5f-4415-9a37-f3de81a77591,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.702 186548 DEBUG nova.network.os_vif_util [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Converting VIF {"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.702 186548 DEBUG nova.network.os_vif_util [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:b7:d0,bridge_name='br-int',has_traffic_filtering=True,id=01ba516c-9f4e-405d-9048-83ab7c320a3a,network=Network(97a92abf-e880-4df3-8778-f25bbdc3abc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ba516c-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.703 186548 DEBUG os_vif [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:b7:d0,bridge_name='br-int',has_traffic_filtering=True,id=01ba516c-9f4e-405d-9048-83ab7c320a3a,network=Network(97a92abf-e880-4df3-8778-f25bbdc3abc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ba516c-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.703 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.704 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.704 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.706 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.706 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01ba516c-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.707 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01ba516c-9f, col_values=(('external_ids', {'iface-id': '01ba516c-9f4e-405d-9048-83ab7c320a3a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:b7:d0', 'vm-uuid': '9a83695e-0d5f-4415-9a37-f3de81a77591'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.708 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:48 compute-0 NetworkManager[55036]: <info>  [1763801388.7102] manager: (tap01ba516c-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/413)
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.710 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.715 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.715 186548 INFO os_vif [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:b7:d0,bridge_name='br-int',has_traffic_filtering=True,id=01ba516c-9f4e-405d-9048-83ab7c320a3a,network=Network(97a92abf-e880-4df3-8778-f25bbdc3abc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ba516c-9f')
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.921 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.921 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.922 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] No VIF found with MAC fa:16:3e:b7:b7:d0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:49:48 compute-0 nova_compute[186544]: 2025-11-22 08:49:48.922 186548 INFO nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Using config drive
Nov 22 08:49:49 compute-0 nova_compute[186544]: 2025-11-22 08:49:49.772 186548 INFO nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Creating config drive at /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk.config
Nov 22 08:49:49 compute-0 nova_compute[186544]: 2025-11-22 08:49:49.779 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb5diig18 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:49:49 compute-0 nova_compute[186544]: 2025-11-22 08:49:49.902 186548 DEBUG oslo_concurrency.processutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb5diig18" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:49:50 compute-0 kernel: tap01ba516c-9f: entered promiscuous mode
Nov 22 08:49:50 compute-0 NetworkManager[55036]: <info>  [1763801390.0702] manager: (tap01ba516c-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/414)
Nov 22 08:49:50 compute-0 ovn_controller[94843]: 2025-11-22T08:49:50Z|00887|binding|INFO|Claiming lport 01ba516c-9f4e-405d-9048-83ab7c320a3a for this chassis.
Nov 22 08:49:50 compute-0 ovn_controller[94843]: 2025-11-22T08:49:50Z|00888|binding|INFO|01ba516c-9f4e-405d-9048-83ab7c320a3a: Claiming fa:16:3e:b7:b7:d0 10.100.0.8
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.070 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.075 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.085 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:b7:d0 10.100.0.8'], port_security=['fa:16:3e:b7:b7:d0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9a83695e-0d5f-4415-9a37-f3de81a77591', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-97a92abf-e880-4df3-8778-f25bbdc3abc6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f996d3c331cb4615815a7a9d118ecd66', 'neutron:revision_number': '2', 'neutron:security_group_ids': '98fd1964-80f7-4f30-acda-ff57179dc6aa ca0720a8-6649-4ad4-a909-7598feb69085', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee00205b-c551-438c-bd0f-bc696d54eb50, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=01ba516c-9f4e-405d-9048-83ab7c320a3a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.086 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 01ba516c-9f4e-405d-9048-83ab7c320a3a in datapath 97a92abf-e880-4df3-8778-f25bbdc3abc6 bound to our chassis
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.087 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 97a92abf-e880-4df3-8778-f25bbdc3abc6
Nov 22 08:49:50 compute-0 systemd-udevd[254353]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.113 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4a2c98e7-f874-428f-94c2-93ba956167b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.114 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap97a92abf-e1 in ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:49:50 compute-0 NetworkManager[55036]: <info>  [1763801390.1204] device (tap01ba516c-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:49:50 compute-0 NetworkManager[55036]: <info>  [1763801390.1212] device (tap01ba516c-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.119 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap97a92abf-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.119 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0304b964-ef36-4f2b-9869-857f4d361ca3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.120 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[10d03a40-7da1-4a4f-8fe4-cb82685140a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.131 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[4001abd1-668b-47aa-8818-347064ba6711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 systemd-machined[152872]: New machine qemu-97-instance-000000b4.
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.136 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:50 compute-0 ovn_controller[94843]: 2025-11-22T08:49:50Z|00889|binding|INFO|Setting lport 01ba516c-9f4e-405d-9048-83ab7c320a3a ovn-installed in OVS
Nov 22 08:49:50 compute-0 ovn_controller[94843]: 2025-11-22T08:49:50Z|00890|binding|INFO|Setting lport 01ba516c-9f4e-405d-9048-83ab7c320a3a up in Southbound
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.142 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.143 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3559e9ce-205f-470f-8f31-c25380154e53]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 systemd[1]: Started Virtual Machine qemu-97-instance-000000b4.
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.169 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[20b1e87e-90a1-427e-a305-de008f8c43c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.173 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1f7d23-4f5e-44b7-958f-9e53be81909d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 systemd-udevd[254358]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:49:50 compute-0 NetworkManager[55036]: <info>  [1763801390.1746] manager: (tap97a92abf-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/415)
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.199 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2a031076-6fb2-4611-ba80-0c844cdc6fa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.202 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[d82e5cea-5e0f-40f0-99b5-0d9987b7a1d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 NetworkManager[55036]: <info>  [1763801390.2190] device (tap97a92abf-e0): carrier: link connected
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.222 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[56b4795e-0768-497d-8d1d-a7f57d53b789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.234 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d94f9f1d-11b3-4d7c-b51a-5a73c352a52b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap97a92abf-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:b2:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 261], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 813085, 'reachable_time': 44260, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254389, 'error': None, 'target': 'ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.246 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6d887851-a353-4c31-84bf-cdef0a44501c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:b2ff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 813085, 'tstamp': 813085}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254390, 'error': None, 'target': 'ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.258 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[58a1d6fc-1f9c-4203-b022-2dcc142b6472]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap97a92abf-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:b2:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 261], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 813085, 'reachable_time': 44260, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254391, 'error': None, 'target': 'ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.281 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b45d3f0f-d02e-48b5-a354-30345aed9230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.326 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d28edc-4cb4-432f-a16a-359c1aee6e89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.327 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97a92abf-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.327 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.327 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap97a92abf-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.329 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:50 compute-0 NetworkManager[55036]: <info>  [1763801390.3297] manager: (tap97a92abf-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Nov 22 08:49:50 compute-0 kernel: tap97a92abf-e0: entered promiscuous mode
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.332 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.332 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap97a92abf-e0, col_values=(('external_ids', {'iface-id': 'de6fe623-0eee-4ec3-8793-a02118dbb427'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:50 compute-0 ovn_controller[94843]: 2025-11-22T08:49:50Z|00891|binding|INFO|Releasing lport de6fe623-0eee-4ec3-8793-a02118dbb427 from this chassis (sb_readonly=0)
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.345 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.347 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/97a92abf-e880-4df3-8778-f25bbdc3abc6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/97a92abf-e880-4df3-8778-f25bbdc3abc6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.349 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[2501ea18-f634-4126-9114-cc66d952db81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.350 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-97a92abf-e880-4df3-8778-f25bbdc3abc6
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/97a92abf-e880-4df3-8778-f25bbdc3abc6.pid.haproxy
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 97a92abf-e880-4df3-8778-f25bbdc3abc6
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:49:50 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:49:50.351 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6', 'env', 'PROCESS_TAG=haproxy-97a92abf-e880-4df3-8778-f25bbdc3abc6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/97a92abf-e880-4df3-8778-f25bbdc3abc6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.605 186548 DEBUG nova.compute.manager [req-c92c10ef-92d4-4312-a993-d2e72406ba02 req-c4fbb403-9f6c-4013-8637-64b800483ed8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.606 186548 DEBUG oslo_concurrency.lockutils [req-c92c10ef-92d4-4312-a993-d2e72406ba02 req-c4fbb403-9f6c-4013-8637-64b800483ed8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.606 186548 DEBUG oslo_concurrency.lockutils [req-c92c10ef-92d4-4312-a993-d2e72406ba02 req-c4fbb403-9f6c-4013-8637-64b800483ed8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.606 186548 DEBUG oslo_concurrency.lockutils [req-c92c10ef-92d4-4312-a993-d2e72406ba02 req-c4fbb403-9f6c-4013-8637-64b800483ed8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.606 186548 DEBUG nova.compute.manager [req-c92c10ef-92d4-4312-a993-d2e72406ba02 req-c4fbb403-9f6c-4013-8637-64b800483ed8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Processing event network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.723 186548 DEBUG nova.network.neutron [req-d45ad547-28f5-412f-96ee-34e83029a69b req-0abcb582-4dac-4141-a046-f6420299684c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Updated VIF entry in instance network info cache for port 01ba516c-9f4e-405d-9048-83ab7c320a3a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.724 186548 DEBUG nova.network.neutron [req-d45ad547-28f5-412f-96ee-34e83029a69b req-0abcb582-4dac-4141-a046-f6420299684c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Updating instance_info_cache with network_info: [{"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:49:50 compute-0 nova_compute[186544]: 2025-11-22 08:49:50.740 186548 DEBUG oslo_concurrency.lockutils [req-d45ad547-28f5-412f-96ee-34e83029a69b req-0abcb582-4dac-4141-a046-f6420299684c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:49:50 compute-0 podman[254423]: 2025-11-22 08:49:50.666932723 +0000 UTC m=+0.019332489 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:49:50 compute-0 podman[254423]: 2025-11-22 08:49:50.906592906 +0000 UTC m=+0.258992642 container create 0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:49:51 compute-0 systemd[1]: Started libpod-conmon-0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c.scope.
Nov 22 08:49:51 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:49:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a03c75d75cc5ec41e7eb3fb999fdb15582669ecd6b139923c5b87e311aa534fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:49:51 compute-0 podman[254423]: 2025-11-22 08:49:51.313752033 +0000 UTC m=+0.666151799 container init 0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 08:49:51 compute-0 podman[254423]: 2025-11-22 08:49:51.319975146 +0000 UTC m=+0.672374882 container start 0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 22 08:49:51 compute-0 neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6[254439]: [NOTICE]   (254449) : New worker (254452) forked
Nov 22 08:49:51 compute-0 neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6[254439]: [NOTICE]   (254449) : Loading success.
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.375 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801391.375074, 9a83695e-0d5f-4415-9a37-f3de81a77591 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.376 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] VM Started (Lifecycle Event)
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.377 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.381 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.384 186548 INFO nova.virt.libvirt.driver [-] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Instance spawned successfully.
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.384 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.399 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.404 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.407 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.408 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.408 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.408 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.409 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.409 186548 DEBUG nova.virt.libvirt.driver [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.433 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.433 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801391.3752515, 9a83695e-0d5f-4415-9a37-f3de81a77591 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.433 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] VM Paused (Lifecycle Event)
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.454 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.457 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801391.380382, 9a83695e-0d5f-4415-9a37-f3de81a77591 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.457 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] VM Resumed (Lifecycle Event)
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.493 186548 INFO nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Took 6.78 seconds to spawn the instance on the hypervisor.
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.494 186548 DEBUG nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.495 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.502 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.534 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.570 186548 INFO nova.compute.manager [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Took 7.34 seconds to build instance.
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.584 186548 DEBUG oslo_concurrency.lockutils [None req-216e9565-a1bb-4fb6-a619-8fb72748d40b a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:51 compute-0 nova_compute[186544]: 2025-11-22 08:49:51.814 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:52 compute-0 nova_compute[186544]: 2025-11-22 08:49:52.676 186548 DEBUG nova.compute.manager [req-44286775-73e1-4bb7-af9c-aeb46340ca01 req-3b77a36f-5d27-4a7a-b9b8-d8e66a475e67 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:49:52 compute-0 nova_compute[186544]: 2025-11-22 08:49:52.677 186548 DEBUG oslo_concurrency.lockutils [req-44286775-73e1-4bb7-af9c-aeb46340ca01 req-3b77a36f-5d27-4a7a-b9b8-d8e66a475e67 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:49:52 compute-0 nova_compute[186544]: 2025-11-22 08:49:52.677 186548 DEBUG oslo_concurrency.lockutils [req-44286775-73e1-4bb7-af9c-aeb46340ca01 req-3b77a36f-5d27-4a7a-b9b8-d8e66a475e67 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:49:52 compute-0 nova_compute[186544]: 2025-11-22 08:49:52.677 186548 DEBUG oslo_concurrency.lockutils [req-44286775-73e1-4bb7-af9c-aeb46340ca01 req-3b77a36f-5d27-4a7a-b9b8-d8e66a475e67 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:49:52 compute-0 nova_compute[186544]: 2025-11-22 08:49:52.678 186548 DEBUG nova.compute.manager [req-44286775-73e1-4bb7-af9c-aeb46340ca01 req-3b77a36f-5d27-4a7a-b9b8-d8e66a475e67 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] No waiting events found dispatching network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:49:52 compute-0 nova_compute[186544]: 2025-11-22 08:49:52.678 186548 WARNING nova.compute.manager [req-44286775-73e1-4bb7-af9c-aeb46340ca01 req-3b77a36f-5d27-4a7a-b9b8-d8e66a475e67 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received unexpected event network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a for instance with vm_state active and task_state None.
Nov 22 08:49:53 compute-0 podman[254462]: 2025-11-22 08:49:53.409046483 +0000 UTC m=+0.057161861 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 08:49:53 compute-0 podman[254463]: 2025-11-22 08:49:53.412217012 +0000 UTC m=+0.056567927 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:49:53 compute-0 podman[254461]: 2025-11-22 08:49:53.412450848 +0000 UTC m=+0.061691844 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 22 08:49:53 compute-0 podman[254464]: 2025-11-22 08:49:53.436798288 +0000 UTC m=+0.078558469 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:49:53 compute-0 nova_compute[186544]: 2025-11-22 08:49:53.708 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:54 compute-0 NetworkManager[55036]: <info>  [1763801394.0847] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Nov 22 08:49:54 compute-0 NetworkManager[55036]: <info>  [1763801394.0852] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/418)
Nov 22 08:49:54 compute-0 nova_compute[186544]: 2025-11-22 08:49:54.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:54 compute-0 nova_compute[186544]: 2025-11-22 08:49:54.172 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:54 compute-0 ovn_controller[94843]: 2025-11-22T08:49:54Z|00892|binding|INFO|Releasing lport de6fe623-0eee-4ec3-8793-a02118dbb427 from this chassis (sb_readonly=0)
Nov 22 08:49:54 compute-0 nova_compute[186544]: 2025-11-22 08:49:54.190 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:55 compute-0 nova_compute[186544]: 2025-11-22 08:49:55.794 186548 DEBUG nova.compute.manager [req-aad75513-7505-4541-a658-4ca760f32a6b req-c43af2a9-d262-4dd0-abc2-5ed1bc0ec26e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-changed-01ba516c-9f4e-405d-9048-83ab7c320a3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:49:55 compute-0 nova_compute[186544]: 2025-11-22 08:49:55.794 186548 DEBUG nova.compute.manager [req-aad75513-7505-4541-a658-4ca760f32a6b req-c43af2a9-d262-4dd0-abc2-5ed1bc0ec26e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Refreshing instance network info cache due to event network-changed-01ba516c-9f4e-405d-9048-83ab7c320a3a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:49:55 compute-0 nova_compute[186544]: 2025-11-22 08:49:55.794 186548 DEBUG oslo_concurrency.lockutils [req-aad75513-7505-4541-a658-4ca760f32a6b req-c43af2a9-d262-4dd0-abc2-5ed1bc0ec26e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:49:55 compute-0 nova_compute[186544]: 2025-11-22 08:49:55.795 186548 DEBUG oslo_concurrency.lockutils [req-aad75513-7505-4541-a658-4ca760f32a6b req-c43af2a9-d262-4dd0-abc2-5ed1bc0ec26e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:49:55 compute-0 nova_compute[186544]: 2025-11-22 08:49:55.795 186548 DEBUG nova.network.neutron [req-aad75513-7505-4541-a658-4ca760f32a6b req-c43af2a9-d262-4dd0-abc2-5ed1bc0ec26e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Refreshing network info cache for port 01ba516c-9f4e-405d-9048-83ab7c320a3a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:49:56 compute-0 nova_compute[186544]: 2025-11-22 08:49:56.815 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:49:56 compute-0 nova_compute[186544]: 2025-11-22 08:49:56.915 186548 DEBUG nova.network.neutron [req-aad75513-7505-4541-a658-4ca760f32a6b req-c43af2a9-d262-4dd0-abc2-5ed1bc0ec26e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Updated VIF entry in instance network info cache for port 01ba516c-9f4e-405d-9048-83ab7c320a3a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:49:56 compute-0 nova_compute[186544]: 2025-11-22 08:49:56.915 186548 DEBUG nova.network.neutron [req-aad75513-7505-4541-a658-4ca760f32a6b req-c43af2a9-d262-4dd0-abc2-5ed1bc0ec26e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Updating instance_info_cache with network_info: [{"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:49:56 compute-0 nova_compute[186544]: 2025-11-22 08:49:56.933 186548 DEBUG oslo_concurrency.lockutils [req-aad75513-7505-4541-a658-4ca760f32a6b req-c43af2a9-d262-4dd0-abc2-5ed1bc0ec26e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:49:58 compute-0 nova_compute[186544]: 2025-11-22 08:49:58.165 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:49:58 compute-0 nova_compute[186544]: 2025-11-22 08:49:58.713 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:01 compute-0 nova_compute[186544]: 2025-11-22 08:50:01.818 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:03 compute-0 podman[254549]: 2025-11-22 08:50:03.402223583 +0000 UTC m=+0.050277102 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:50:03 compute-0 podman[254550]: 2025-11-22 08:50:03.445431838 +0000 UTC m=+0.090366930 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 22 08:50:03 compute-0 podman[254551]: 2025-11-22 08:50:03.459758082 +0000 UTC m=+0.094656327 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:50:03 compute-0 nova_compute[186544]: 2025-11-22 08:50:03.717 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:04 compute-0 ovn_controller[94843]: 2025-11-22T08:50:04Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b7:b7:d0 10.100.0.8
Nov 22 08:50:04 compute-0 ovn_controller[94843]: 2025-11-22T08:50:04Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b7:b7:d0 10.100.0.8
Nov 22 08:50:06 compute-0 nova_compute[186544]: 2025-11-22 08:50:06.821 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:08 compute-0 nova_compute[186544]: 2025-11-22 08:50:08.720 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:09 compute-0 nova_compute[186544]: 2025-11-22 08:50:09.178 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:09 compute-0 nova_compute[186544]: 2025-11-22 08:50:09.179 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:50:09 compute-0 nova_compute[186544]: 2025-11-22 08:50:09.212 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:50:11 compute-0 nova_compute[186544]: 2025-11-22 08:50:11.822 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:13 compute-0 nova_compute[186544]: 2025-11-22 08:50:13.724 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:16 compute-0 nova_compute[186544]: 2025-11-22 08:50:16.825 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:18 compute-0 nova_compute[186544]: 2025-11-22 08:50:18.727 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:19 compute-0 nova_compute[186544]: 2025-11-22 08:50:19.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:19 compute-0 nova_compute[186544]: 2025-11-22 08:50:19.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:50:21 compute-0 nova_compute[186544]: 2025-11-22 08:50:21.826 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:22 compute-0 nova_compute[186544]: 2025-11-22 08:50:22.750 186548 DEBUG nova.compute.manager [req-0a41bfca-9333-4c1c-9ac2-ab7508f470a3 req-a2ada838-c1f2-4df8-a715-bf54d239bedf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-changed-01ba516c-9f4e-405d-9048-83ab7c320a3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:50:22 compute-0 nova_compute[186544]: 2025-11-22 08:50:22.751 186548 DEBUG nova.compute.manager [req-0a41bfca-9333-4c1c-9ac2-ab7508f470a3 req-a2ada838-c1f2-4df8-a715-bf54d239bedf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Refreshing instance network info cache due to event network-changed-01ba516c-9f4e-405d-9048-83ab7c320a3a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:50:22 compute-0 nova_compute[186544]: 2025-11-22 08:50:22.751 186548 DEBUG oslo_concurrency.lockutils [req-0a41bfca-9333-4c1c-9ac2-ab7508f470a3 req-a2ada838-c1f2-4df8-a715-bf54d239bedf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:50:22 compute-0 nova_compute[186544]: 2025-11-22 08:50:22.752 186548 DEBUG oslo_concurrency.lockutils [req-0a41bfca-9333-4c1c-9ac2-ab7508f470a3 req-a2ada838-c1f2-4df8-a715-bf54d239bedf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:50:22 compute-0 nova_compute[186544]: 2025-11-22 08:50:22.753 186548 DEBUG nova.network.neutron [req-0a41bfca-9333-4c1c-9ac2-ab7508f470a3 req-a2ada838-c1f2-4df8-a715-bf54d239bedf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Refreshing network info cache for port 01ba516c-9f4e-405d-9048-83ab7c320a3a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.151 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "9a83695e-0d5f-4415-9a37-f3de81a77591" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.153 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.153 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.154 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.154 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.162 186548 INFO nova.compute.manager [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Terminating instance
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.169 186548 DEBUG nova.compute.manager [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.732 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.781 186548 DEBUG nova.network.neutron [req-0a41bfca-9333-4c1c-9ac2-ab7508f470a3 req-a2ada838-c1f2-4df8-a715-bf54d239bedf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Updated VIF entry in instance network info cache for port 01ba516c-9f4e-405d-9048-83ab7c320a3a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:50:23 compute-0 nova_compute[186544]: 2025-11-22 08:50:23.781 186548 DEBUG nova.network.neutron [req-0a41bfca-9333-4c1c-9ac2-ab7508f470a3 req-a2ada838-c1f2-4df8-a715-bf54d239bedf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Updating instance_info_cache with network_info: [{"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:50:24 compute-0 nova_compute[186544]: 2025-11-22 08:50:24.190 186548 DEBUG oslo_concurrency.lockutils [req-0a41bfca-9333-4c1c-9ac2-ab7508f470a3 req-a2ada838-c1f2-4df8-a715-bf54d239bedf 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-9a83695e-0d5f-4415-9a37-f3de81a77591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:50:24 compute-0 podman[254636]: 2025-11-22 08:50:24.422083572 +0000 UTC m=+0.063339283 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:50:24 compute-0 podman[254635]: 2025-11-22 08:50:24.430286895 +0000 UTC m=+0.073216467 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:50:24 compute-0 podman[254634]: 2025-11-22 08:50:24.451692523 +0000 UTC m=+0.086605777 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 08:50:24 compute-0 podman[254637]: 2025-11-22 08:50:24.484278618 +0000 UTC m=+0.108549320 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:50:26 compute-0 nova_compute[186544]: 2025-11-22 08:50:26.829 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:28 compute-0 nova_compute[186544]: 2025-11-22 08:50:28.736 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:29 compute-0 kernel: tap01ba516c-9f (unregistering): left promiscuous mode
Nov 22 08:50:29 compute-0 NetworkManager[55036]: <info>  [1763801429.2395] device (tap01ba516c-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.246 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:29 compute-0 ovn_controller[94843]: 2025-11-22T08:50:29Z|00893|binding|INFO|Releasing lport 01ba516c-9f4e-405d-9048-83ab7c320a3a from this chassis (sb_readonly=0)
Nov 22 08:50:29 compute-0 ovn_controller[94843]: 2025-11-22T08:50:29Z|00894|binding|INFO|Setting lport 01ba516c-9f4e-405d-9048-83ab7c320a3a down in Southbound
Nov 22 08:50:29 compute-0 ovn_controller[94843]: 2025-11-22T08:50:29Z|00895|binding|INFO|Removing iface tap01ba516c-9f ovn-installed in OVS
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.247 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.265 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:29 compute-0 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Nov 22 08:50:29 compute-0 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000b4.scope: Consumed 16.392s CPU time.
Nov 22 08:50:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:29.293 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:b7:d0 10.100.0.8'], port_security=['fa:16:3e:b7:b7:d0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9a83695e-0d5f-4415-9a37-f3de81a77591', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-97a92abf-e880-4df3-8778-f25bbdc3abc6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f996d3c331cb4615815a7a9d118ecd66', 'neutron:revision_number': '4', 'neutron:security_group_ids': '98fd1964-80f7-4f30-acda-ff57179dc6aa ca0720a8-6649-4ad4-a909-7598feb69085', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee00205b-c551-438c-bd0f-bc696d54eb50, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=01ba516c-9f4e-405d-9048-83ab7c320a3a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:50:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:29.295 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 01ba516c-9f4e-405d-9048-83ab7c320a3a in datapath 97a92abf-e880-4df3-8778-f25bbdc3abc6 unbound from our chassis
Nov 22 08:50:29 compute-0 systemd-machined[152872]: Machine qemu-97-instance-000000b4 terminated.
Nov 22 08:50:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:29.295 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 97a92abf-e880-4df3-8778-f25bbdc3abc6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:50:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:29.296 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[aeec7b29-57e7-43cf-8ea1-106bc2024d5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:50:29 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:29.297 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6 namespace which is not needed anymore
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.441 186548 INFO nova.virt.libvirt.driver [-] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Instance destroyed successfully.
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.442 186548 DEBUG nova.objects.instance [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lazy-loading 'resources' on Instance uuid 9a83695e-0d5f-4415-9a37-f3de81a77591 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.452 186548 DEBUG nova.virt.libvirt.vif [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:49:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1476424987-access_point-1602015144',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1476424987-access_point-1602015144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1476424987-ac',id=180,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCr7NF5co5IMPbVlvNNXaWEcV/TMK4gRym1IU+xrFFYbHfyyuO2kK1tTqQYmQ/RTGQJ8JuSs83cgnhJoMhrhV2PVCH3lEoOzIowFbWiaRx38gwHps/13xxUrEJrxPx8nOg==',key_name='tempest-TestSecurityGroupsBasicOps-1968243366',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:49:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f996d3c331cb4615815a7a9d118ecd66',ramdisk_id='',reservation_id='r-osxdn1hz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1476424987',owner_user_name='tempest-TestSecurityGroupsBasicOps-1476424987-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:49:51Z,user_data=None,user_id='a89106859f3942e3ac8b97f7b8a093fe',uuid=9a83695e-0d5f-4415-9a37-f3de81a77591,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.453 186548 DEBUG nova.network.os_vif_util [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Converting VIF {"id": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "address": "fa:16:3e:b7:b7:d0", "network": {"id": "97a92abf-e880-4df3-8778-f25bbdc3abc6", "bridge": "br-int", "label": "tempest-network-smoke--660043884", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f996d3c331cb4615815a7a9d118ecd66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01ba516c-9f", "ovs_interfaceid": "01ba516c-9f4e-405d-9048-83ab7c320a3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.453 186548 DEBUG nova.network.os_vif_util [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:b7:d0,bridge_name='br-int',has_traffic_filtering=True,id=01ba516c-9f4e-405d-9048-83ab7c320a3a,network=Network(97a92abf-e880-4df3-8778-f25bbdc3abc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ba516c-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.454 186548 DEBUG os_vif [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:b7:d0,bridge_name='br-int',has_traffic_filtering=True,id=01ba516c-9f4e-405d-9048-83ab7c320a3a,network=Network(97a92abf-e880-4df3-8778-f25bbdc3abc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ba516c-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.456 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.457 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01ba516c-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.459 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.468 186548 INFO os_vif [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:b7:d0,bridge_name='br-int',has_traffic_filtering=True,id=01ba516c-9f4e-405d-9048-83ab7c320a3a,network=Network(97a92abf-e880-4df3-8778-f25bbdc3abc6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01ba516c-9f')
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.469 186548 INFO nova.virt.libvirt.driver [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Deleting instance files /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591_del
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.469 186548 INFO nova.virt.libvirt.driver [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Deletion of /var/lib/nova/instances/9a83695e-0d5f-4415-9a37-f3de81a77591_del complete
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.554 186548 INFO nova.compute.manager [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Took 6.39 seconds to destroy the instance on the hypervisor.
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.555 186548 DEBUG oslo.service.loopingcall [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.555 186548 DEBUG nova.compute.manager [-] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.555 186548 DEBUG nova.network.neutron [-] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:50:29 compute-0 neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6[254439]: [NOTICE]   (254449) : haproxy version is 2.8.14-c23fe91
Nov 22 08:50:29 compute-0 neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6[254439]: [NOTICE]   (254449) : path to executable is /usr/sbin/haproxy
Nov 22 08:50:29 compute-0 neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6[254439]: [WARNING]  (254449) : Exiting Master process...
Nov 22 08:50:29 compute-0 neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6[254439]: [WARNING]  (254449) : Exiting Master process...
Nov 22 08:50:29 compute-0 neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6[254439]: [ALERT]    (254449) : Current worker (254452) exited with code 143 (Terminated)
Nov 22 08:50:29 compute-0 neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6[254439]: [WARNING]  (254449) : All workers exited. Exiting... (0)
Nov 22 08:50:29 compute-0 systemd[1]: libpod-0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c.scope: Deactivated successfully.
Nov 22 08:50:29 compute-0 podman[254742]: 2025-11-22 08:50:29.598585912 +0000 UTC m=+0.222976053 container died 0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.614 186548 DEBUG nova.compute.manager [req-7e32ce98-a4d1-4546-ab53-4ff9eeb5f3e2 req-f0aa822f-3500-43a7-a4dd-ece2e1bfe6ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-vif-unplugged-01ba516c-9f4e-405d-9048-83ab7c320a3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.615 186548 DEBUG oslo_concurrency.lockutils [req-7e32ce98-a4d1-4546-ab53-4ff9eeb5f3e2 req-f0aa822f-3500-43a7-a4dd-ece2e1bfe6ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.615 186548 DEBUG oslo_concurrency.lockutils [req-7e32ce98-a4d1-4546-ab53-4ff9eeb5f3e2 req-f0aa822f-3500-43a7-a4dd-ece2e1bfe6ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.615 186548 DEBUG oslo_concurrency.lockutils [req-7e32ce98-a4d1-4546-ab53-4ff9eeb5f3e2 req-f0aa822f-3500-43a7-a4dd-ece2e1bfe6ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.615 186548 DEBUG nova.compute.manager [req-7e32ce98-a4d1-4546-ab53-4ff9eeb5f3e2 req-f0aa822f-3500-43a7-a4dd-ece2e1bfe6ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] No waiting events found dispatching network-vif-unplugged-01ba516c-9f4e-405d-9048-83ab7c320a3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:50:29 compute-0 nova_compute[186544]: 2025-11-22 08:50:29.616 186548 DEBUG nova.compute.manager [req-7e32ce98-a4d1-4546-ab53-4ff9eeb5f3e2 req-f0aa822f-3500-43a7-a4dd-ece2e1bfe6ac 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-vif-unplugged-01ba516c-9f4e-405d-9048-83ab7c320a3a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:50:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c-userdata-shm.mount: Deactivated successfully.
Nov 22 08:50:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-a03c75d75cc5ec41e7eb3fb999fdb15582669ecd6b139923c5b87e311aa534fa-merged.mount: Deactivated successfully.
Nov 22 08:50:30 compute-0 nova_compute[186544]: 2025-11-22 08:50:30.174 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:30 compute-0 nova_compute[186544]: 2025-11-22 08:50:30.176 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:50:30 compute-0 nova_compute[186544]: 2025-11-22 08:50:30.176 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:50:30 compute-0 nova_compute[186544]: 2025-11-22 08:50:30.193 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Nov 22 08:50:30 compute-0 nova_compute[186544]: 2025-11-22 08:50:30.193 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:50:30 compute-0 podman[254742]: 2025-11-22 08:50:30.250329694 +0000 UTC m=+0.874719835 container cleanup 0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 08:50:30 compute-0 systemd[1]: libpod-conmon-0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c.scope: Deactivated successfully.
Nov 22 08:50:30 compute-0 nova_compute[186544]: 2025-11-22 08:50:30.875 186548 DEBUG nova.network.neutron [-] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:50:30 compute-0 nova_compute[186544]: 2025-11-22 08:50:30.923 186548 INFO nova.compute.manager [-] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Took 1.37 seconds to deallocate network for instance.
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.006 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.007 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:50:31 compute-0 podman[254786]: 2025-11-22 08:50:31.045023732 +0000 UTC m=+0.771792065 container remove 0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.050 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5151c2-f3a6-4d69-a080-86d72576b441]: (4, ('Sat Nov 22 08:50:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6 (0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c)\n0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c\nSat Nov 22 08:50:30 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6 (0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c)\n0fc87cad42a7cc2c3c8fc798a039ec9e2e6d857cab0e355728fd3a7e18507f8c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.051 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8c26bca8-8c2b-4a32-9b70-488ee2669181]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.052 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97a92abf-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.055 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:31 compute-0 kernel: tap97a92abf-e0: left promiscuous mode
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.063 186548 DEBUG nova.compute.provider_tree [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.067 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.068 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a7488453-af34-4c4b-8295-5c2b80707654]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.077 186548 DEBUG nova.scheduler.client.report [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.087 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[25940f8a-ba2f-4633-8273-89961c61aeb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.088 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6214c2-1352-4ef5-99be-cd5cb12b2bc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.102 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[26af02fb-c4a2-4b03-bcc3-74d1d853c76a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 813080, 'reachable_time': 34885, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254802, 'error': None, 'target': 'ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.104 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-97a92abf-e880-4df3-8778-f25bbdc3abc6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:50:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:31.104 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[511c2a85-5452-43a4-8087-39b356ba2a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:50:31 compute-0 systemd[1]: run-netns-ovnmeta\x2d97a92abf\x2de880\x2d4df3\x2d8778\x2df25bbdc3abc6.mount: Deactivated successfully.
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.121 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.162 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.180 186548 INFO nova.scheduler.client.report [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Deleted allocations for instance 9a83695e-0d5f-4415-9a37-f3de81a77591
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.246 186548 DEBUG oslo_concurrency.lockutils [None req-78aa7236-5274-41e9-b9d7-feede53c83b5 a89106859f3942e3ac8b97f7b8a093fe f996d3c331cb4615815a7a9d118ecd66 - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.831 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.882 186548 DEBUG nova.compute.manager [req-81428173-b945-45f8-b3c3-7de3fa05d227 req-a8f4178e-04d4-4321-a911-50d6675e33f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.883 186548 DEBUG oslo_concurrency.lockutils [req-81428173-b945-45f8-b3c3-7de3fa05d227 req-a8f4178e-04d4-4321-a911-50d6675e33f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.883 186548 DEBUG oslo_concurrency.lockutils [req-81428173-b945-45f8-b3c3-7de3fa05d227 req-a8f4178e-04d4-4321-a911-50d6675e33f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.883 186548 DEBUG oslo_concurrency.lockutils [req-81428173-b945-45f8-b3c3-7de3fa05d227 req-a8f4178e-04d4-4321-a911-50d6675e33f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "9a83695e-0d5f-4415-9a37-f3de81a77591-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.884 186548 DEBUG nova.compute.manager [req-81428173-b945-45f8-b3c3-7de3fa05d227 req-a8f4178e-04d4-4321-a911-50d6675e33f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] No waiting events found dispatching network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.884 186548 WARNING nova.compute.manager [req-81428173-b945-45f8-b3c3-7de3fa05d227 req-a8f4178e-04d4-4321-a911-50d6675e33f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received unexpected event network-vif-plugged-01ba516c-9f4e-405d-9048-83ab7c320a3a for instance with vm_state deleted and task_state None.
Nov 22 08:50:31 compute-0 nova_compute[186544]: 2025-11-22 08:50:31.884 186548 DEBUG nova.compute.manager [req-81428173-b945-45f8-b3c3-7de3fa05d227 req-a8f4178e-04d4-4321-a911-50d6675e33f5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Received event network-vif-deleted-01ba516c-9f4e-405d-9048-83ab7c320a3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.182 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.182 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.351 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.352 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5711MB free_disk=73.1318130493164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.352 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.352 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.397 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.397 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.417 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.428 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.481 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:50:32 compute-0 nova_compute[186544]: 2025-11-22 08:50:32.481 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:50:33 compute-0 nova_compute[186544]: 2025-11-22 08:50:33.500 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:33.501 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:50:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:33.502 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:50:34 compute-0 podman[254805]: 2025-11-22 08:50:34.418327938 +0000 UTC m=+0.058437413 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 22 08:50:34 compute-0 podman[254806]: 2025-11-22 08:50:34.423132736 +0000 UTC m=+0.059776326 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 08:50:34 compute-0 podman[254804]: 2025-11-22 08:50:34.441081879 +0000 UTC m=+0.082404824 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:50:34 compute-0 nova_compute[186544]: 2025-11-22 08:50:34.459 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:35 compute-0 nova_compute[186544]: 2025-11-22 08:50:35.401 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:35 compute-0 nova_compute[186544]: 2025-11-22 08:50:35.515 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:50:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:50:36 compute-0 nova_compute[186544]: 2025-11-22 08:50:36.833 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:37.382 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:50:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:37.383 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:50:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:37.383 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:50:37 compute-0 nova_compute[186544]: 2025-11-22 08:50:37.481 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:39 compute-0 nova_compute[186544]: 2025-11-22 08:50:39.462 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:39 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:50:39.504 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:50:40 compute-0 nova_compute[186544]: 2025-11-22 08:50:40.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:41 compute-0 nova_compute[186544]: 2025-11-22 08:50:41.835 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:42 compute-0 nova_compute[186544]: 2025-11-22 08:50:42.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:42 compute-0 nova_compute[186544]: 2025-11-22 08:50:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:44 compute-0 nova_compute[186544]: 2025-11-22 08:50:44.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:44 compute-0 nova_compute[186544]: 2025-11-22 08:50:44.440 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801429.4394832, 9a83695e-0d5f-4415-9a37-f3de81a77591 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:50:44 compute-0 nova_compute[186544]: 2025-11-22 08:50:44.441 186548 INFO nova.compute.manager [-] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] VM Stopped (Lifecycle Event)
Nov 22 08:50:44 compute-0 nova_compute[186544]: 2025-11-22 08:50:44.464 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:44 compute-0 nova_compute[186544]: 2025-11-22 08:50:44.499 186548 DEBUG nova.compute.manager [None req-9846e858-181a-481b-82c0-4e86c6c871dd - - - - - -] [instance: 9a83695e-0d5f-4415-9a37-f3de81a77591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:50:46 compute-0 nova_compute[186544]: 2025-11-22 08:50:46.836 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:47 compute-0 nova_compute[186544]: 2025-11-22 08:50:47.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:49 compute-0 nova_compute[186544]: 2025-11-22 08:50:49.466 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:51 compute-0 nova_compute[186544]: 2025-11-22 08:50:51.839 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:54 compute-0 nova_compute[186544]: 2025-11-22 08:50:54.467 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:55 compute-0 podman[254868]: 2025-11-22 08:50:55.414688251 +0000 UTC m=+0.054361442 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:50:55 compute-0 podman[254867]: 2025-11-22 08:50:55.422539584 +0000 UTC m=+0.064831130 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:50:55 compute-0 podman[254870]: 2025-11-22 08:50:55.448170537 +0000 UTC m=+0.079593735 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:50:55 compute-0 podman[254869]: 2025-11-22 08:50:55.450517765 +0000 UTC m=+0.085636124 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:50:56 compute-0 nova_compute[186544]: 2025-11-22 08:50:56.840 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:50:58 compute-0 nova_compute[186544]: 2025-11-22 08:50:58.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:50:59 compute-0 nova_compute[186544]: 2025-11-22 08:50:59.470 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:01 compute-0 nova_compute[186544]: 2025-11-22 08:51:01.842 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:04 compute-0 nova_compute[186544]: 2025-11-22 08:51:04.472 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:05 compute-0 podman[254954]: 2025-11-22 08:51:05.417320083 +0000 UTC m=+0.063832236 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:51:05 compute-0 podman[254955]: 2025-11-22 08:51:05.417342384 +0000 UTC m=+0.058666299 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc.)
Nov 22 08:51:05 compute-0 podman[254956]: 2025-11-22 08:51:05.427725879 +0000 UTC m=+0.064220985 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:51:06 compute-0 nova_compute[186544]: 2025-11-22 08:51:06.844 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:09 compute-0 nova_compute[186544]: 2025-11-22 08:51:09.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:11 compute-0 nova_compute[186544]: 2025-11-22 08:51:11.846 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:14 compute-0 nova_compute[186544]: 2025-11-22 08:51:14.476 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:15 compute-0 nova_compute[186544]: 2025-11-22 08:51:15.805 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:15 compute-0 nova_compute[186544]: 2025-11-22 08:51:15.805 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:15 compute-0 nova_compute[186544]: 2025-11-22 08:51:15.825 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:51:15 compute-0 nova_compute[186544]: 2025-11-22 08:51:15.924 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:15 compute-0 nova_compute[186544]: 2025-11-22 08:51:15.925 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:15 compute-0 nova_compute[186544]: 2025-11-22 08:51:15.932 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:51:15 compute-0 nova_compute[186544]: 2025-11-22 08:51:15.932 186548 INFO nova.compute.claims [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.038 186548 DEBUG nova.compute.provider_tree [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.050 186548 DEBUG nova.scheduler.client.report [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.072 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.073 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.119 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.119 186548 DEBUG nova.network.neutron [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.196 186548 INFO nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.222 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.324 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.325 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.325 186548 INFO nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Creating image(s)
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.326 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "/var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.326 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.327 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.340 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.412 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.413 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.413 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.425 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.482 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.483 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.520 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.521 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.521 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.565 186548 DEBUG nova.policy [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.581 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.582 186548 DEBUG nova.virt.disk.api [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Checking if we can resize image /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.583 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.641 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.642 186548 DEBUG nova.virt.disk.api [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Cannot resize image /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.642 186548 DEBUG nova.objects.instance [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'migration_context' on Instance uuid 50ec0a38-9293-42b4-9106-2c57fc99c3fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.653 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.654 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Ensure instance console log exists: /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.654 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.654 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.655 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:16 compute-0 nova_compute[186544]: 2025-11-22 08:51:16.847 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:17 compute-0 nova_compute[186544]: 2025-11-22 08:51:17.544 186548 DEBUG nova.network.neutron [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Successfully created port: a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.143 186548 DEBUG nova.network.neutron [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Successfully updated port: a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.164 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.165 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquired lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.165 186548 DEBUG nova.network.neutron [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.260 186548 DEBUG nova.compute.manager [req-ed13d92f-4789-4749-b87d-b406d0ea30a7 req-dd8a8e25-9056-42ea-91aa-46686ee5b77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-changed-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.261 186548 DEBUG nova.compute.manager [req-ed13d92f-4789-4749-b87d-b406d0ea30a7 req-dd8a8e25-9056-42ea-91aa-46686ee5b77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Refreshing instance network info cache due to event network-changed-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.261 186548 DEBUG oslo_concurrency.lockutils [req-ed13d92f-4789-4749-b87d-b406d0ea30a7 req-dd8a8e25-9056-42ea-91aa-46686ee5b77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.364 186548 DEBUG nova.network.neutron [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:51:19 compute-0 nova_compute[186544]: 2025-11-22 08:51:19.478 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:21 compute-0 nova_compute[186544]: 2025-11-22 08:51:21.849 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:22 compute-0 nova_compute[186544]: 2025-11-22 08:51:22.531 186548 DEBUG nova.network.neutron [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updating instance_info_cache with network_info: [{"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.014 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Releasing lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.015 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Instance network_info: |[{"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.015 186548 DEBUG oslo_concurrency.lockutils [req-ed13d92f-4789-4749-b87d-b406d0ea30a7 req-dd8a8e25-9056-42ea-91aa-46686ee5b77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.015 186548 DEBUG nova.network.neutron [req-ed13d92f-4789-4749-b87d-b406d0ea30a7 req-dd8a8e25-9056-42ea-91aa-46686ee5b77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Refreshing network info cache for port a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.018 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Start _get_guest_xml network_info=[{"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.023 186548 WARNING nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.030 186548 DEBUG nova.virt.libvirt.host [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.031 186548 DEBUG nova.virt.libvirt.host [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.038 186548 DEBUG nova.virt.libvirt.host [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.039 186548 DEBUG nova.virt.libvirt.host [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.040 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.040 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.040 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.040 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.041 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.041 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.041 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.042 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.042 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.042 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.042 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.043 186548 DEBUG nova.virt.hardware [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.047 186548 DEBUG nova.virt.libvirt.vif [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:51:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-2048016152',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-2048016152',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=181,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiBPbwkfvp292DdxiAKQErFTXea99iSJ0iNktwUcZngbhgOVkJ8LcCoiBgeLTItTjMH75el9p0D+2vq3rbfgroqRlNCO9aORnrX2+bekE/q3IKlWTvN5P4p1NDQcXdubA==',key_name='tempest-TestSecurityGroupsBasicOps-1102160890',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-ila3owa7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:51:16Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=50ec0a38-9293-42b4-9106-2c57fc99c3fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.047 186548 DEBUG nova.network.os_vif_util [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.048 186548 DEBUG nova.network.os_vif_util [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:7e:cc,bridge_name='br-int',has_traffic_filtering=True,id=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa65c1b37-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.049 186548 DEBUG nova.objects.instance [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'pci_devices' on Instance uuid 50ec0a38-9293-42b4-9106-2c57fc99c3fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.061 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <uuid>50ec0a38-9293-42b4-9106-2c57fc99c3fb</uuid>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <name>instance-000000b5</name>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-2048016152</nova:name>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:51:23</nova:creationTime>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:51:23 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:51:23 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:51:23 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:51:23 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:51:23 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:51:23 compute-0 nova_compute[186544]:         <nova:user uuid="7bb85b33f2b44468ab5d86bf5ba98421">tempest-TestSecurityGroupsBasicOps-588574044-project-member</nova:user>
Nov 22 08:51:23 compute-0 nova_compute[186544]:         <nova:project uuid="b5da13b07bb34fc3b4cd1452f7dd6971">tempest-TestSecurityGroupsBasicOps-588574044</nova:project>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:51:23 compute-0 nova_compute[186544]:         <nova:port uuid="a65c1b37-e8e5-4ea7-b0bf-8f2db5841192">
Nov 22 08:51:23 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <system>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <entry name="serial">50ec0a38-9293-42b4-9106-2c57fc99c3fb</entry>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <entry name="uuid">50ec0a38-9293-42b4-9106-2c57fc99c3fb</entry>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </system>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <os>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   </os>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <features>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   </features>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk.config"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:5b:7e:cc"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <target dev="tapa65c1b37-e8"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/console.log" append="off"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <video>
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </video>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:51:23 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:51:23 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:51:23 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:51:23 compute-0 nova_compute[186544]: </domain>
Nov 22 08:51:23 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.062 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Preparing to wait for external event network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.063 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.063 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.063 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.064 186548 DEBUG nova.virt.libvirt.vif [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:51:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-2048016152',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-2048016152',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=181,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiBPbwkfvp292DdxiAKQErFTXea99iSJ0iNktwUcZngbhgOVkJ8LcCoiBgeLTItTjMH75el9p0D+2vq3rbfgroqRlNCO9aORnrX2+bekE/q3IKlWTvN5P4p1NDQcXdubA==',key_name='tempest-TestSecurityGroupsBasicOps-1102160890',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-ila3owa7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:51:16Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=50ec0a38-9293-42b4-9106-2c57fc99c3fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.064 186548 DEBUG nova.network.os_vif_util [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.065 186548 DEBUG nova.network.os_vif_util [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:7e:cc,bridge_name='br-int',has_traffic_filtering=True,id=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa65c1b37-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.065 186548 DEBUG os_vif [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:7e:cc,bridge_name='br-int',has_traffic_filtering=True,id=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa65c1b37-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.066 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.066 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.067 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.069 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.070 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa65c1b37-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.070 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa65c1b37-e8, col_values=(('external_ids', {'iface-id': 'a65c1b37-e8e5-4ea7-b0bf-8f2db5841192', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:7e:cc', 'vm-uuid': '50ec0a38-9293-42b4-9106-2c57fc99c3fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:51:23 compute-0 NetworkManager[55036]: <info>  [1763801483.0726] manager: (tapa65c1b37-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/419)
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.071 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.074 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.078 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.079 186548 INFO os_vif [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:7e:cc,bridge_name='br-int',has_traffic_filtering=True,id=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa65c1b37-e8')
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.119 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.120 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.120 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No VIF found with MAC fa:16:3e:5b:7e:cc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:51:23 compute-0 nova_compute[186544]: 2025-11-22 08:51:23.120 186548 INFO nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Using config drive
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.478 186548 INFO nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Creating config drive at /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk.config
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.483 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuqucz3ui execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.622 186548 DEBUG oslo_concurrency.processutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuqucz3ui" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:51:24 compute-0 kernel: tapa65c1b37-e8: entered promiscuous mode
Nov 22 08:51:24 compute-0 NetworkManager[55036]: <info>  [1763801484.6820] manager: (tapa65c1b37-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/420)
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.680 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.686 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:24 compute-0 ovn_controller[94843]: 2025-11-22T08:51:24Z|00896|binding|INFO|Claiming lport a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 for this chassis.
Nov 22 08:51:24 compute-0 ovn_controller[94843]: 2025-11-22T08:51:24Z|00897|binding|INFO|a65c1b37-e8e5-4ea7-b0bf-8f2db5841192: Claiming fa:16:3e:5b:7e:cc 10.100.0.6
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.698 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:7e:cc 10.100.0.6'], port_security=['fa:16:3e:5b:7e:cc 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '50ec0a38-9293-42b4-9106-2c57fc99c3fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e6b1900-aaed-4594-90b0-32a5351bb717 fe343cc7-4a08-4834-8563-a8a49e3a0f49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b91ecf2-a2ff-45e5-963e-c9d8b70b8af0, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.699 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 in datapath 6462ae38-eefd-46f7-8dfd-98d64cb746b6 bound to our chassis
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.700 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6462ae38-eefd-46f7-8dfd-98d64cb746b6
Nov 22 08:51:24 compute-0 systemd-udevd[255051]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.711 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6d114b7c-cb78-457f-ad4f-773542ca9388]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.712 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6462ae38-e1 in ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.714 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6462ae38-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.714 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[42b35256-225f-43ff-9f7a-6cd6762a4daa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.715 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[54355596-bed0-445e-b8b5-de43101ab699]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 NetworkManager[55036]: <info>  [1763801484.7251] device (tapa65c1b37-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:51:24 compute-0 NetworkManager[55036]: <info>  [1763801484.7265] device (tapa65c1b37-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.726 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb4d423-cbc4-430e-875d-b2a3e9e1a4c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 systemd-machined[152872]: New machine qemu-98-instance-000000b5.
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.740 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:24 compute-0 ovn_controller[94843]: 2025-11-22T08:51:24Z|00898|binding|INFO|Setting lport a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 ovn-installed in OVS
Nov 22 08:51:24 compute-0 ovn_controller[94843]: 2025-11-22T08:51:24Z|00899|binding|INFO|Setting lport a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 up in Southbound
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.746 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:24 compute-0 systemd[1]: Started Virtual Machine qemu-98-instance-000000b5.
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.750 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[de6ac1bb-bde9-452a-9e5c-47bcc3a115c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.777 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0107de40-f348-4414-af5a-ca604e7a04a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 NetworkManager[55036]: <info>  [1763801484.7852] manager: (tap6462ae38-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/421)
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.784 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[84a40fbc-67fb-438f-98f7-eea9ee359f67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.817 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[9476db7b-9d13-4a3e-83ad-b4f25f855227]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.821 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2e149f-0f9b-4bd7-b70d-c7e4fe056d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 NetworkManager[55036]: <info>  [1763801484.8399] device (tap6462ae38-e0): carrier: link connected
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.843 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[65f1b445-fdd8-42d6-a6c9-709ecfb67688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.862 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4615fd17-9c77-4175-b0dd-c2d7417f7199]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6462ae38-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:92:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 264], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 822547, 'reachable_time': 21651, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255084, 'error': None, 'target': 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.876 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0714be3a-6b5d-420b-89f7-b0b85fe9d857]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:927b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 822547, 'tstamp': 822547}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255085, 'error': None, 'target': 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.891 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8927c72f-8a4c-42c7-8b26-5463708e3d23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6462ae38-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:92:7b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 264], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 822547, 'reachable_time': 21651, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255086, 'error': None, 'target': 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.919 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4f74b18e-5014-447e-912a-77f76ec001fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.974 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[de89f091-c09f-43a6-becd-8c5951d7e0f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.975 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6462ae38-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.976 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.976 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6462ae38-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.978 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:24 compute-0 kernel: tap6462ae38-e0: entered promiscuous mode
Nov 22 08:51:24 compute-0 NetworkManager[55036]: <info>  [1763801484.9797] manager: (tap6462ae38-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/422)
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.979 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6462ae38-e0, col_values=(('external_ids', {'iface-id': '36a49105-890a-4b11-8fcd-be2c813442f9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.980 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:24 compute-0 ovn_controller[94843]: 2025-11-22T08:51:24Z|00900|binding|INFO|Releasing lport 36a49105-890a-4b11-8fcd-be2c813442f9 from this chassis (sb_readonly=0)
Nov 22 08:51:24 compute-0 nova_compute[186544]: 2025-11-22 08:51:24.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.998 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6462ae38-eefd-46f7-8dfd-98d64cb746b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6462ae38-eefd-46f7-8dfd-98d64cb746b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:51:24 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:24.999 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd4b462-a8f1-49dd-9304-f2b3274b78e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:25.000 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-6462ae38-eefd-46f7-8dfd-98d64cb746b6
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/6462ae38-eefd-46f7-8dfd-98d64cb746b6.pid.haproxy
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 6462ae38-eefd-46f7-8dfd-98d64cb746b6
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:51:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:25.001 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'env', 'PROCESS_TAG=haproxy-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6462ae38-eefd-46f7-8dfd-98d64cb746b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:51:25 compute-0 podman[255118]: 2025-11-22 08:51:25.367458359 +0000 UTC m=+0.052066692 container create 0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:51:25 compute-0 systemd[1]: Started libpod-conmon-0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc.scope.
Nov 22 08:51:25 compute-0 podman[255118]: 2025-11-22 08:51:25.339598394 +0000 UTC m=+0.024206737 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:51:25 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:51:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a7336862ebc29a1fb23fe8abb82d08be5315992efca04fafbdc5e2dbc646626/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:51:25 compute-0 podman[255118]: 2025-11-22 08:51:25.455905584 +0000 UTC m=+0.140513917 container init 0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:51:25 compute-0 podman[255118]: 2025-11-22 08:51:25.461945154 +0000 UTC m=+0.146553477 container start 0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 08:51:25 compute-0 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[255134]: [NOTICE]   (255153) : New worker (255157) forked
Nov 22 08:51:25 compute-0 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[255134]: [NOTICE]   (255153) : Loading success.
Nov 22 08:51:25 compute-0 podman[255138]: 2025-11-22 08:51:25.505879854 +0000 UTC m=+0.050803100 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:51:25 compute-0 podman[255137]: 2025-11-22 08:51:25.54352886 +0000 UTC m=+0.090885196 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 22 08:51:25 compute-0 podman[255183]: 2025-11-22 08:51:25.585870672 +0000 UTC m=+0.049811246 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:51:25 compute-0 podman[255184]: 2025-11-22 08:51:25.639492531 +0000 UTC m=+0.105646000 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.679 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801485.6791847, 50ec0a38-9293-42b4-9106-2c57fc99c3fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.680 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] VM Started (Lifecycle Event)
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.704 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.710 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801485.6819472, 50ec0a38-9293-42b4-9106-2c57fc99c3fb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.710 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] VM Paused (Lifecycle Event)
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.727 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.730 186548 DEBUG nova.compute.manager [req-4b63c1af-2f92-4d4b-a61c-5895cd3c4767 req-27ae69a8-7019-4717-8bf3-c6934d1a3abe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.731 186548 DEBUG oslo_concurrency.lockutils [req-4b63c1af-2f92-4d4b-a61c-5895cd3c4767 req-27ae69a8-7019-4717-8bf3-c6934d1a3abe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.731 186548 DEBUG oslo_concurrency.lockutils [req-4b63c1af-2f92-4d4b-a61c-5895cd3c4767 req-27ae69a8-7019-4717-8bf3-c6934d1a3abe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.731 186548 DEBUG oslo_concurrency.lockutils [req-4b63c1af-2f92-4d4b-a61c-5895cd3c4767 req-27ae69a8-7019-4717-8bf3-c6934d1a3abe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.731 186548 DEBUG nova.compute.manager [req-4b63c1af-2f92-4d4b-a61c-5895cd3c4767 req-27ae69a8-7019-4717-8bf3-c6934d1a3abe 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Processing event network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.732 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.735 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.737 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.739 186548 INFO nova.virt.libvirt.driver [-] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Instance spawned successfully.
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.739 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.759 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.759 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801485.7351751, 50ec0a38-9293-42b4-9106-2c57fc99c3fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.760 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] VM Resumed (Lifecycle Event)
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.767 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.767 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.768 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.768 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.769 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.769 186548 DEBUG nova.virt.libvirt.driver [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.774 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.777 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.805 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.874 186548 INFO nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Took 9.55 seconds to spawn the instance on the hypervisor.
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.875 186548 DEBUG nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.968 186548 INFO nova.compute.manager [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Took 10.08 seconds to build instance.
Nov 22 08:51:25 compute-0 nova_compute[186544]: 2025-11-22 08:51:25.985 186548 DEBUG oslo_concurrency.lockutils [None req-93761eff-a594-4e48-8218-f7e51675fc0b 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:26 compute-0 nova_compute[186544]: 2025-11-22 08:51:26.452 186548 DEBUG nova.network.neutron [req-ed13d92f-4789-4749-b87d-b406d0ea30a7 req-dd8a8e25-9056-42ea-91aa-46686ee5b77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updated VIF entry in instance network info cache for port a65c1b37-e8e5-4ea7-b0bf-8f2db5841192. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:51:26 compute-0 nova_compute[186544]: 2025-11-22 08:51:26.452 186548 DEBUG nova.network.neutron [req-ed13d92f-4789-4749-b87d-b406d0ea30a7 req-dd8a8e25-9056-42ea-91aa-46686ee5b77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updating instance_info_cache with network_info: [{"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:51:26 compute-0 nova_compute[186544]: 2025-11-22 08:51:26.480 186548 DEBUG oslo_concurrency.lockutils [req-ed13d92f-4789-4749-b87d-b406d0ea30a7 req-dd8a8e25-9056-42ea-91aa-46686ee5b77b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:51:26 compute-0 nova_compute[186544]: 2025-11-22 08:51:26.851 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:27 compute-0 nova_compute[186544]: 2025-11-22 08:51:27.854 186548 DEBUG nova.compute.manager [req-91ff26b7-78ff-4a66-8bb2-ecc839ad948f req-7b36d195-7043-470e-a34f-b0e107abe068 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:51:27 compute-0 nova_compute[186544]: 2025-11-22 08:51:27.854 186548 DEBUG oslo_concurrency.lockutils [req-91ff26b7-78ff-4a66-8bb2-ecc839ad948f req-7b36d195-7043-470e-a34f-b0e107abe068 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:27 compute-0 nova_compute[186544]: 2025-11-22 08:51:27.854 186548 DEBUG oslo_concurrency.lockutils [req-91ff26b7-78ff-4a66-8bb2-ecc839ad948f req-7b36d195-7043-470e-a34f-b0e107abe068 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:27 compute-0 nova_compute[186544]: 2025-11-22 08:51:27.855 186548 DEBUG oslo_concurrency.lockutils [req-91ff26b7-78ff-4a66-8bb2-ecc839ad948f req-7b36d195-7043-470e-a34f-b0e107abe068 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:27 compute-0 nova_compute[186544]: 2025-11-22 08:51:27.855 186548 DEBUG nova.compute.manager [req-91ff26b7-78ff-4a66-8bb2-ecc839ad948f req-7b36d195-7043-470e-a34f-b0e107abe068 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] No waiting events found dispatching network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:51:27 compute-0 nova_compute[186544]: 2025-11-22 08:51:27.855 186548 WARNING nova.compute.manager [req-91ff26b7-78ff-4a66-8bb2-ecc839ad948f req-7b36d195-7043-470e-a34f-b0e107abe068 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received unexpected event network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 for instance with vm_state active and task_state None.
Nov 22 08:51:28 compute-0 nova_compute[186544]: 2025-11-22 08:51:28.073 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:30 compute-0 nova_compute[186544]: 2025-11-22 08:51:30.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:30 compute-0 nova_compute[186544]: 2025-11-22 08:51:30.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:51:30 compute-0 nova_compute[186544]: 2025-11-22 08:51:30.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:51:30 compute-0 nova_compute[186544]: 2025-11-22 08:51:30.492 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:51:30 compute-0 nova_compute[186544]: 2025-11-22 08:51:30.493 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:51:30 compute-0 nova_compute[186544]: 2025-11-22 08:51:30.493 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:51:30 compute-0 nova_compute[186544]: 2025-11-22 08:51:30.493 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 50ec0a38-9293-42b4-9106-2c57fc99c3fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:51:31 compute-0 nova_compute[186544]: 2025-11-22 08:51:31.852 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:32 compute-0 ovn_controller[94843]: 2025-11-22T08:51:32Z|00901|binding|INFO|Releasing lport 36a49105-890a-4b11-8fcd-be2c813442f9 from this chassis (sb_readonly=0)
Nov 22 08:51:32 compute-0 NetworkManager[55036]: <info>  [1763801492.9481] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/423)
Nov 22 08:51:32 compute-0 NetworkManager[55036]: <info>  [1763801492.9493] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/424)
Nov 22 08:51:32 compute-0 nova_compute[186544]: 2025-11-22 08:51:32.963 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:32 compute-0 ovn_controller[94843]: 2025-11-22T08:51:32Z|00902|binding|INFO|Releasing lport 36a49105-890a-4b11-8fcd-be2c813442f9 from this chassis (sb_readonly=0)
Nov 22 08:51:32 compute-0 nova_compute[186544]: 2025-11-22 08:51:32.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.075 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.307 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updating instance_info_cache with network_info: [{"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.340 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.341 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.341 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.341 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.800 186548 DEBUG nova.compute.manager [req-d2e46760-fc5f-442f-ae2a-6282b77f0148 req-3f3b609b-918f-4f1a-92ed-df167aa53dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-changed-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.801 186548 DEBUG nova.compute.manager [req-d2e46760-fc5f-442f-ae2a-6282b77f0148 req-3f3b609b-918f-4f1a-92ed-df167aa53dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Refreshing instance network info cache due to event network-changed-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.801 186548 DEBUG oslo_concurrency.lockutils [req-d2e46760-fc5f-442f-ae2a-6282b77f0148 req-3f3b609b-918f-4f1a-92ed-df167aa53dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.801 186548 DEBUG oslo_concurrency.lockutils [req-d2e46760-fc5f-442f-ae2a-6282b77f0148 req-3f3b609b-918f-4f1a-92ed-df167aa53dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:51:33 compute-0 nova_compute[186544]: 2025-11-22 08:51:33.801 186548 DEBUG nova.network.neutron [req-d2e46760-fc5f-442f-ae2a-6282b77f0148 req-3f3b609b-918f-4f1a-92ed-df167aa53dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Refreshing network info cache for port a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.187 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.244 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.305 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.307 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.370 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.515 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.517 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5568MB free_disk=73.13093948364258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.517 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.518 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.592 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance 50ec0a38-9293-42b4-9106-2c57fc99c3fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.592 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.593 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.617 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.637 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.638 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.659 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.687 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.723 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.735 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.762 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:51:34 compute-0 nova_compute[186544]: 2025-11-22 08:51:34.763 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:35 compute-0 nova_compute[186544]: 2025-11-22 08:51:35.517 186548 DEBUG nova.network.neutron [req-d2e46760-fc5f-442f-ae2a-6282b77f0148 req-3f3b609b-918f-4f1a-92ed-df167aa53dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updated VIF entry in instance network info cache for port a65c1b37-e8e5-4ea7-b0bf-8f2db5841192. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:51:35 compute-0 nova_compute[186544]: 2025-11-22 08:51:35.518 186548 DEBUG nova.network.neutron [req-d2e46760-fc5f-442f-ae2a-6282b77f0148 req-3f3b609b-918f-4f1a-92ed-df167aa53dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updating instance_info_cache with network_info: [{"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:51:35 compute-0 nova_compute[186544]: 2025-11-22 08:51:35.540 186548 DEBUG oslo_concurrency.lockutils [req-d2e46760-fc5f-442f-ae2a-6282b77f0148 req-3f3b609b-918f-4f1a-92ed-df167aa53dc3 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:51:36 compute-0 podman[255250]: 2025-11-22 08:51:36.408377278 +0000 UTC m=+0.051338404 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:51:36 compute-0 podman[255251]: 2025-11-22 08:51:36.419151253 +0000 UTC m=+0.062717704 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 22 08:51:36 compute-0 podman[255252]: 2025-11-22 08:51:36.431009315 +0000 UTC m=+0.068408924 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 08:51:36 compute-0 nova_compute[186544]: 2025-11-22 08:51:36.854 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:37.383 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:51:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:37.384 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:51:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:51:37.385 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:51:38 compute-0 nova_compute[186544]: 2025-11-22 08:51:38.078 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:38 compute-0 nova_compute[186544]: 2025-11-22 08:51:38.763 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:40 compute-0 ovn_controller[94843]: 2025-11-22T08:51:40Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:7e:cc 10.100.0.6
Nov 22 08:51:40 compute-0 ovn_controller[94843]: 2025-11-22T08:51:40Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:7e:cc 10.100.0.6
Nov 22 08:51:41 compute-0 nova_compute[186544]: 2025-11-22 08:51:41.855 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:42 compute-0 nova_compute[186544]: 2025-11-22 08:51:42.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:43 compute-0 nova_compute[186544]: 2025-11-22 08:51:43.081 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:44 compute-0 nova_compute[186544]: 2025-11-22 08:51:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:44 compute-0 nova_compute[186544]: 2025-11-22 08:51:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:46 compute-0 nova_compute[186544]: 2025-11-22 08:51:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:46 compute-0 nova_compute[186544]: 2025-11-22 08:51:46.857 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:47 compute-0 nova_compute[186544]: 2025-11-22 08:51:47.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:51:48 compute-0 nova_compute[186544]: 2025-11-22 08:51:48.083 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:51 compute-0 nova_compute[186544]: 2025-11-22 08:51:51.860 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:53 compute-0 nova_compute[186544]: 2025-11-22 08:51:53.087 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:56 compute-0 podman[255330]: 2025-11-22 08:51:56.415407146 +0000 UTC m=+0.061744140 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:51:56 compute-0 podman[255332]: 2025-11-22 08:51:56.420126563 +0000 UTC m=+0.058042730 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:51:56 compute-0 podman[255331]: 2025-11-22 08:51:56.420206205 +0000 UTC m=+0.063566426 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:51:56 compute-0 podman[255336]: 2025-11-22 08:51:56.453228726 +0000 UTC m=+0.087835792 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 08:51:56 compute-0 nova_compute[186544]: 2025-11-22 08:51:56.861 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:51:58 compute-0 nova_compute[186544]: 2025-11-22 08:51:58.090 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:00.810 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:52:00 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:00.812 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:52:00 compute-0 nova_compute[186544]: 2025-11-22 08:52:00.811 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:01 compute-0 nova_compute[186544]: 2025-11-22 08:52:01.862 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:03 compute-0 nova_compute[186544]: 2025-11-22 08:52:03.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:06 compute-0 nova_compute[186544]: 2025-11-22 08:52:06.865 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:07 compute-0 podman[255417]: 2025-11-22 08:52:07.397589709 +0000 UTC m=+0.050649876 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:52:07 compute-0 podman[255419]: 2025-11-22 08:52:07.410561098 +0000 UTC m=+0.053159559 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118)
Nov 22 08:52:07 compute-0 podman[255418]: 2025-11-22 08:52:07.41064056 +0000 UTC m=+0.058823428 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, release=1755695350, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:52:08 compute-0 nova_compute[186544]: 2025-11-22 08:52:08.095 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:09.813 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:52:11 compute-0 nova_compute[186544]: 2025-11-22 08:52:11.867 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:13 compute-0 nova_compute[186544]: 2025-11-22 08:52:13.098 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:16 compute-0 nova_compute[186544]: 2025-11-22 08:52:16.869 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:18 compute-0 nova_compute[186544]: 2025-11-22 08:52:18.101 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:21 compute-0 nova_compute[186544]: 2025-11-22 08:52:21.870 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:23 compute-0 nova_compute[186544]: 2025-11-22 08:52:23.104 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:25 compute-0 ovn_controller[94843]: 2025-11-22T08:52:25Z|00903|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 22 08:52:26 compute-0 nova_compute[186544]: 2025-11-22 08:52:26.872 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:26 compute-0 podman[255480]: 2025-11-22 08:52:26.966674631 +0000 UTC m=+0.058014008 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 22 08:52:26 compute-0 podman[255481]: 2025-11-22 08:52:26.974096914 +0000 UTC m=+0.059184458 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 08:52:26 compute-0 podman[255482]: 2025-11-22 08:52:26.978163804 +0000 UTC m=+0.059966296 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:52:27 compute-0 podman[255483]: 2025-11-22 08:52:27.008238553 +0000 UTC m=+0.088255881 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 22 08:52:28 compute-0 nova_compute[186544]: 2025-11-22 08:52:28.107 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.550 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.551 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.551 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.551 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 50ec0a38-9293-42b4-9106-2c57fc99c3fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.794 186548 DEBUG nova.compute.manager [req-c10b3aba-56e0-40e2-9234-e5a6a0097517 req-9ccbe184-e725-4e55-93f7-7563b9500100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-changed-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.794 186548 DEBUG nova.compute.manager [req-c10b3aba-56e0-40e2-9234-e5a6a0097517 req-9ccbe184-e725-4e55-93f7-7563b9500100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Refreshing instance network info cache due to event network-changed-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.794 186548 DEBUG oslo_concurrency.lockutils [req-c10b3aba-56e0-40e2-9234-e5a6a0097517 req-9ccbe184-e725-4e55-93f7-7563b9500100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.857 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.858 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.858 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.858 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.858 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.866 186548 INFO nova.compute.manager [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Terminating instance
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.871 186548 DEBUG nova.compute.manager [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.875 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:31 compute-0 kernel: tapa65c1b37-e8 (unregistering): left promiscuous mode
Nov 22 08:52:31 compute-0 NetworkManager[55036]: <info>  [1763801551.8983] device (tapa65c1b37-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.909 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:31 compute-0 ovn_controller[94843]: 2025-11-22T08:52:31Z|00904|binding|INFO|Releasing lport a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 from this chassis (sb_readonly=0)
Nov 22 08:52:31 compute-0 ovn_controller[94843]: 2025-11-22T08:52:31Z|00905|binding|INFO|Setting lport a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 down in Southbound
Nov 22 08:52:31 compute-0 ovn_controller[94843]: 2025-11-22T08:52:31Z|00906|binding|INFO|Removing iface tapa65c1b37-e8 ovn-installed in OVS
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.912 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:31.916 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:7e:cc 10.100.0.6'], port_security=['fa:16:3e:5b:7e:cc 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '50ec0a38-9293-42b4-9106-2c57fc99c3fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e6b1900-aaed-4594-90b0-32a5351bb717 fe343cc7-4a08-4834-8563-a8a49e3a0f49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b91ecf2-a2ff-45e5-963e-c9d8b70b8af0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:52:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:31.918 103805 INFO neutron.agent.ovn.metadata.agent [-] Port a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 in datapath 6462ae38-eefd-46f7-8dfd-98d64cb746b6 unbound from our chassis
Nov 22 08:52:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:31.919 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6462ae38-eefd-46f7-8dfd-98d64cb746b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:52:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:31.923 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[34abc790-aecc-4707-9018-711eb19c820f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:52:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:31.924 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 namespace which is not needed anymore
Nov 22 08:52:31 compute-0 nova_compute[186544]: 2025-11-22 08:52:31.927 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:31 compute-0 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Nov 22 08:52:31 compute-0 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000b5.scope: Consumed 17.104s CPU time.
Nov 22 08:52:31 compute-0 systemd-machined[152872]: Machine qemu-98-instance-000000b5 terminated.
Nov 22 08:52:32 compute-0 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[255134]: [NOTICE]   (255153) : haproxy version is 2.8.14-c23fe91
Nov 22 08:52:32 compute-0 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[255134]: [NOTICE]   (255153) : path to executable is /usr/sbin/haproxy
Nov 22 08:52:32 compute-0 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[255134]: [WARNING]  (255153) : Exiting Master process...
Nov 22 08:52:32 compute-0 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[255134]: [ALERT]    (255153) : Current worker (255157) exited with code 143 (Terminated)
Nov 22 08:52:32 compute-0 neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6[255134]: [WARNING]  (255153) : All workers exited. Exiting... (0)
Nov 22 08:52:32 compute-0 systemd[1]: libpod-0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc.scope: Deactivated successfully.
Nov 22 08:52:32 compute-0 podman[255585]: 2025-11-22 08:52:32.066201001 +0000 UTC m=+0.065373908 container died 0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.094 186548 DEBUG nova.compute.manager [req-93717eb5-16d4-4025-b672-300c3b18c1dd req-cb3abfe8-a836-4d47-8535-927c765e63a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-vif-unplugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.095 186548 DEBUG oslo_concurrency.lockutils [req-93717eb5-16d4-4025-b672-300c3b18c1dd req-cb3abfe8-a836-4d47-8535-927c765e63a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.095 186548 DEBUG oslo_concurrency.lockutils [req-93717eb5-16d4-4025-b672-300c3b18c1dd req-cb3abfe8-a836-4d47-8535-927c765e63a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.095 186548 DEBUG oslo_concurrency.lockutils [req-93717eb5-16d4-4025-b672-300c3b18c1dd req-cb3abfe8-a836-4d47-8535-927c765e63a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.096 186548 DEBUG nova.compute.manager [req-93717eb5-16d4-4025-b672-300c3b18c1dd req-cb3abfe8-a836-4d47-8535-927c765e63a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] No waiting events found dispatching network-vif-unplugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.096 186548 DEBUG nova.compute.manager [req-93717eb5-16d4-4025-b672-300c3b18c1dd req-cb3abfe8-a836-4d47-8535-927c765e63a6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-vif-unplugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.102 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc-userdata-shm.mount: Deactivated successfully.
Nov 22 08:52:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-7a7336862ebc29a1fb23fe8abb82d08be5315992efca04fafbdc5e2dbc646626-merged.mount: Deactivated successfully.
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.135 186548 INFO nova.virt.libvirt.driver [-] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Instance destroyed successfully.
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.136 186548 DEBUG nova.objects.instance [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'resources' on Instance uuid 50ec0a38-9293-42b4-9106-2c57fc99c3fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:52:32 compute-0 podman[255585]: 2025-11-22 08:52:32.144490407 +0000 UTC m=+0.143663314 container cleanup 0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.150 186548 DEBUG nova.virt.libvirt.vif [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:51:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-2048016152',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-2048016152',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=181,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiBPbwkfvp292DdxiAKQErFTXea99iSJ0iNktwUcZngbhgOVkJ8LcCoiBgeLTItTjMH75el9p0D+2vq3rbfgroqRlNCO9aORnrX2+bekE/q3IKlWTvN5P4p1NDQcXdubA==',key_name='tempest-TestSecurityGroupsBasicOps-1102160890',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:51:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-ila3owa7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:51:25Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=50ec0a38-9293-42b4-9106-2c57fc99c3fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.150 186548 DEBUG nova.network.os_vif_util [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.151 186548 DEBUG nova.network.os_vif_util [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:7e:cc,bridge_name='br-int',has_traffic_filtering=True,id=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa65c1b37-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.151 186548 DEBUG os_vif [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:7e:cc,bridge_name='br-int',has_traffic_filtering=True,id=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa65c1b37-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:52:32 compute-0 systemd[1]: libpod-conmon-0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc.scope: Deactivated successfully.
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.153 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa65c1b37-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.156 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.159 186548 INFO os_vif [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:7e:cc,bridge_name='br-int',has_traffic_filtering=True,id=a65c1b37-e8e5-4ea7-b0bf-8f2db5841192,network=Network(6462ae38-eefd-46f7-8dfd-98d64cb746b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa65c1b37-e8')
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.160 186548 INFO nova.virt.libvirt.driver [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Deleting instance files /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb_del
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.161 186548 INFO nova.virt.libvirt.driver [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Deletion of /var/lib/nova/instances/50ec0a38-9293-42b4-9106-2c57fc99c3fb_del complete
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.226 186548 INFO nova.compute.manager [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.226 186548 DEBUG oslo.service.loopingcall [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.226 186548 DEBUG nova.compute.manager [-] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.227 186548 DEBUG nova.network.neutron [-] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:52:32 compute-0 podman[255629]: 2025-11-22 08:52:32.238359366 +0000 UTC m=+0.071599712 container remove 0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.243 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[57b9cc41-53f2-470b-ba41-f2e8e61f01ab]: (4, ('Sat Nov 22 08:52:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 (0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc)\n0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc\nSat Nov 22 08:52:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 (0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc)\n0637e9668083934d37761da372308f87245fe38beeb78955610e322016f249fc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.245 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1c76b571-8639-436d-956b-5bc2ddb3ce99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.246 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6462ae38-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:52:32 compute-0 kernel: tap6462ae38-e0: left promiscuous mode
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.248 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:32 compute-0 nova_compute[186544]: 2025-11-22 08:52:32.258 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.262 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[fa1d0e38-81ee-4710-be6d-4835bd3d7a77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.276 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e64e53d8-590e-4b23-9592-9f1d73e45100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.277 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[0629058c-741d-4fca-91f3-f7c1280bb90e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.293 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6509c4-53c5-4334-bd09-dff03d2f4703]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 822541, 'reachable_time': 35731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255645, 'error': None, 'target': 'ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.295 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6462ae38-eefd-46f7-8dfd-98d64cb746b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:52:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:32.295 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed84468-e411-473e-97b5-c8b12c936837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:52:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d6462ae38\x2deefd\x2d46f7\x2d8dfd\x2d98d64cb746b6.mount: Deactivated successfully.
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.549 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updating instance_info_cache with network_info: [{"id": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "address": "fa:16:3e:5b:7e:cc", "network": {"id": "6462ae38-eefd-46f7-8dfd-98d64cb746b6", "bridge": "br-int", "label": "tempest-network-smoke--1071805895", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa65c1b37-e8", "ovs_interfaceid": "a65c1b37-e8e5-4ea7-b0bf-8f2db5841192", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.563 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.564 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.564 186548 DEBUG oslo_concurrency.lockutils [req-c10b3aba-56e0-40e2-9234-e5a6a0097517 req-9ccbe184-e725-4e55-93f7-7563b9500100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.565 186548 DEBUG nova.network.neutron [req-c10b3aba-56e0-40e2-9234-e5a6a0097517 req-9ccbe184-e725-4e55-93f7-7563b9500100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Refreshing network info cache for port a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.567 186548 DEBUG nova.network.neutron [-] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.595 186548 INFO nova.compute.manager [-] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Took 1.37 seconds to deallocate network for instance.
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.663 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.663 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.713 186548 INFO nova.network.neutron [req-c10b3aba-56e0-40e2-9234-e5a6a0097517 req-9ccbe184-e725-4e55-93f7-7563b9500100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Port a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.713 186548 DEBUG nova.network.neutron [req-c10b3aba-56e0-40e2-9234-e5a6a0097517 req-9ccbe184-e725-4e55-93f7-7563b9500100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.727 186548 DEBUG oslo_concurrency.lockutils [req-c10b3aba-56e0-40e2-9234-e5a6a0097517 req-9ccbe184-e725-4e55-93f7-7563b9500100 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-50ec0a38-9293-42b4-9106-2c57fc99c3fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.795 186548 DEBUG nova.compute.provider_tree [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.807 186548 DEBUG nova.scheduler.client.report [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.826 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.873 186548 DEBUG nova.compute.manager [req-7312402f-d406-4d8c-93c7-1aa18798adf3 req-c717bd0a-c45f-40c2-b55b-92ff0c5a42be 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-vif-deleted-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.897 186548 INFO nova.scheduler.client.report [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Deleted allocations for instance 50ec0a38-9293-42b4-9106-2c57fc99c3fb
Nov 22 08:52:33 compute-0 nova_compute[186544]: 2025-11-22 08:52:33.954 186548 DEBUG oslo_concurrency.lockutils [None req-0c1e96f6-e1e4-4bc0-9002-00dd6c9970dd 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:34 compute-0 nova_compute[186544]: 2025-11-22 08:52:34.161 186548 DEBUG nova.compute.manager [req-be94839c-f5e3-40c2-a2f0-595bd899a6c7 req-c91dc282-d014-4c90-a638-e14facd458a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received event network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:52:34 compute-0 nova_compute[186544]: 2025-11-22 08:52:34.162 186548 DEBUG oslo_concurrency.lockutils [req-be94839c-f5e3-40c2-a2f0-595bd899a6c7 req-c91dc282-d014-4c90-a638-e14facd458a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:34 compute-0 nova_compute[186544]: 2025-11-22 08:52:34.162 186548 DEBUG oslo_concurrency.lockutils [req-be94839c-f5e3-40c2-a2f0-595bd899a6c7 req-c91dc282-d014-4c90-a638-e14facd458a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:34 compute-0 nova_compute[186544]: 2025-11-22 08:52:34.162 186548 DEBUG oslo_concurrency.lockutils [req-be94839c-f5e3-40c2-a2f0-595bd899a6c7 req-c91dc282-d014-4c90-a638-e14facd458a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "50ec0a38-9293-42b4-9106-2c57fc99c3fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:34 compute-0 nova_compute[186544]: 2025-11-22 08:52:34.162 186548 DEBUG nova.compute.manager [req-be94839c-f5e3-40c2-a2f0-595bd899a6c7 req-c91dc282-d014-4c90-a638-e14facd458a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] No waiting events found dispatching network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:52:34 compute-0 nova_compute[186544]: 2025-11-22 08:52:34.163 186548 WARNING nova.compute.manager [req-be94839c-f5e3-40c2-a2f0-595bd899a6c7 req-c91dc282-d014-4c90-a638-e14facd458a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Received unexpected event network-vif-plugged-a65c1b37-e8e5-4ea7-b0bf-8f2db5841192 for instance with vm_state deleted and task_state None.
Nov 22 08:52:34 compute-0 nova_compute[186544]: 2025-11-22 08:52:34.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:34 compute-0 nova_compute[186544]: 2025-11-22 08:52:34.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.180 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.181 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.181 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.181 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.338 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.339 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.13180541992188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.339 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.340 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.548 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.549 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:52:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.691 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.703 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.721 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.722 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:36 compute-0 nova_compute[186544]: 2025-11-22 08:52:36.877 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:37 compute-0 nova_compute[186544]: 2025-11-22 08:52:37.156 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:37.385 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:37.385 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:52:37.385 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:38 compute-0 podman[255647]: 2025-11-22 08:52:38.407056685 +0000 UTC m=+0.057527986 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:52:38 compute-0 podman[255649]: 2025-11-22 08:52:38.416084448 +0000 UTC m=+0.063190365 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:52:38 compute-0 podman[255648]: 2025-11-22 08:52:38.444125317 +0000 UTC m=+0.090348763 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:52:39 compute-0 nova_compute[186544]: 2025-11-22 08:52:39.914 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:39 compute-0 nova_compute[186544]: 2025-11-22 08:52:39.987 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:40 compute-0 nova_compute[186544]: 2025-11-22 08:52:40.722 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:41 compute-0 nova_compute[186544]: 2025-11-22 08:52:41.878 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:42 compute-0 nova_compute[186544]: 2025-11-22 08:52:42.157 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:44 compute-0 nova_compute[186544]: 2025-11-22 08:52:44.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:44 compute-0 nova_compute[186544]: 2025-11-22 08:52:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:44 compute-0 nova_compute[186544]: 2025-11-22 08:52:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:46 compute-0 nova_compute[186544]: 2025-11-22 08:52:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:46 compute-0 nova_compute[186544]: 2025-11-22 08:52:46.880 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:47 compute-0 nova_compute[186544]: 2025-11-22 08:52:47.132 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801552.1305914, 50ec0a38-9293-42b4-9106-2c57fc99c3fb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:52:47 compute-0 nova_compute[186544]: 2025-11-22 08:52:47.133 186548 INFO nova.compute.manager [-] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] VM Stopped (Lifecycle Event)
Nov 22 08:52:47 compute-0 nova_compute[186544]: 2025-11-22 08:52:47.159 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:47 compute-0 nova_compute[186544]: 2025-11-22 08:52:47.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:52:47 compute-0 nova_compute[186544]: 2025-11-22 08:52:47.176 186548 DEBUG nova.compute.manager [None req-9a66e7c4-f10c-4203-b3df-829b99f46727 - - - - - -] [instance: 50ec0a38-9293-42b4-9106-2c57fc99c3fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:52:51 compute-0 nova_compute[186544]: 2025-11-22 08:52:51.884 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:52 compute-0 nova_compute[186544]: 2025-11-22 08:52:52.161 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:56 compute-0 nova_compute[186544]: 2025-11-22 08:52:56.831 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:56 compute-0 nova_compute[186544]: 2025-11-22 08:52:56.831 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:56 compute-0 nova_compute[186544]: 2025-11-22 08:52:56.852 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:52:56 compute-0 nova_compute[186544]: 2025-11-22 08:52:56.885 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:56 compute-0 nova_compute[186544]: 2025-11-22 08:52:56.934 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:56 compute-0 nova_compute[186544]: 2025-11-22 08:52:56.935 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:56 compute-0 nova_compute[186544]: 2025-11-22 08:52:56.942 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:52:56 compute-0 nova_compute[186544]: 2025-11-22 08:52:56.942 186548 INFO nova.compute.claims [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.033 186548 DEBUG nova.compute.provider_tree [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.044 186548 DEBUG nova.scheduler.client.report [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.060 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.061 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.111 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.112 186548 DEBUG nova.network.neutron [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.124 186548 INFO nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.141 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.163 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.227 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.229 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.229 186548 INFO nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Creating image(s)
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.230 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "/var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.230 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.231 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.247 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.308 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.309 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.309 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.321 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.384 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.385 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:52:57 compute-0 podman[255714]: 2025-11-22 08:52:57.41542278 +0000 UTC m=+0.058658645 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 08:52:57 compute-0 podman[255713]: 2025-11-22 08:52:57.442208188 +0000 UTC m=+0.089958243 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:52:57 compute-0 podman[255715]: 2025-11-22 08:52:57.442439945 +0000 UTC m=+0.083409753 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:52:57 compute-0 podman[255716]: 2025-11-22 08:52:57.477213199 +0000 UTC m=+0.114463306 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.536 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk 1073741824" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.537 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.537 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.583 186548 DEBUG nova.policy [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.594 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.595 186548 DEBUG nova.virt.disk.api [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Checking if we can resize image /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.595 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.655 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.657 186548 DEBUG nova.virt.disk.api [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Cannot resize image /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.658 186548 DEBUG nova.objects.instance [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'migration_context' on Instance uuid 3ba3c659-9910-4a7d-9f33-2caba4ec591e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.669 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.670 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Ensure instance console log exists: /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.670 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.671 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:52:57 compute-0 nova_compute[186544]: 2025-11-22 08:52:57.671 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:52:58 compute-0 nova_compute[186544]: 2025-11-22 08:52:58.763 186548 DEBUG nova.network.neutron [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Successfully created port: 99a0470d-e680-42ab-8405-7803d6c30f96 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:53:00 compute-0 nova_compute[186544]: 2025-11-22 08:53:00.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:01 compute-0 nova_compute[186544]: 2025-11-22 08:53:01.289 186548 DEBUG nova.network.neutron [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Successfully updated port: 99a0470d-e680-42ab-8405-7803d6c30f96 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:53:01 compute-0 nova_compute[186544]: 2025-11-22 08:53:01.311 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:53:01 compute-0 nova_compute[186544]: 2025-11-22 08:53:01.311 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquired lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:53:01 compute-0 nova_compute[186544]: 2025-11-22 08:53:01.311 186548 DEBUG nova.network.neutron [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:53:01 compute-0 nova_compute[186544]: 2025-11-22 08:53:01.465 186548 DEBUG nova.compute.manager [req-1652775b-3075-4a85-b0da-ff5eb8f9c70b req-e375d3fa-2e92-4961-8fe2-4a933f0f6fd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received event network-changed-99a0470d-e680-42ab-8405-7803d6c30f96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:53:01 compute-0 nova_compute[186544]: 2025-11-22 08:53:01.466 186548 DEBUG nova.compute.manager [req-1652775b-3075-4a85-b0da-ff5eb8f9c70b req-e375d3fa-2e92-4961-8fe2-4a933f0f6fd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Refreshing instance network info cache due to event network-changed-99a0470d-e680-42ab-8405-7803d6c30f96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:53:01 compute-0 nova_compute[186544]: 2025-11-22 08:53:01.466 186548 DEBUG oslo_concurrency.lockutils [req-1652775b-3075-4a85-b0da-ff5eb8f9c70b req-e375d3fa-2e92-4961-8fe2-4a933f0f6fd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:53:01 compute-0 nova_compute[186544]: 2025-11-22 08:53:01.887 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:02 compute-0 nova_compute[186544]: 2025-11-22 08:53:02.165 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:02 compute-0 nova_compute[186544]: 2025-11-22 08:53:02.590 186548 DEBUG nova.network.neutron [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.639 186548 DEBUG nova.network.neutron [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updating instance_info_cache with network_info: [{"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.692 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Releasing lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.693 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Instance network_info: |[{"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.693 186548 DEBUG oslo_concurrency.lockutils [req-1652775b-3075-4a85-b0da-ff5eb8f9c70b req-e375d3fa-2e92-4961-8fe2-4a933f0f6fd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.693 186548 DEBUG nova.network.neutron [req-1652775b-3075-4a85-b0da-ff5eb8f9c70b req-e375d3fa-2e92-4961-8fe2-4a933f0f6fd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Refreshing network info cache for port 99a0470d-e680-42ab-8405-7803d6c30f96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.696 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Start _get_guest_xml network_info=[{"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.701 186548 WARNING nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.708 186548 DEBUG nova.virt.libvirt.host [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.708 186548 DEBUG nova.virt.libvirt.host [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.712 186548 DEBUG nova.virt.libvirt.host [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.713 186548 DEBUG nova.virt.libvirt.host [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.714 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.715 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.715 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.715 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.716 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.716 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.716 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.716 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.717 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.717 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.717 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.717 186548 DEBUG nova.virt.hardware [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.721 186548 DEBUG nova.virt.libvirt.vif [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:52:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-534108987',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-534108987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=183,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLiGQmcqGXcsiG3cxIcUvS1tAu25Vah0oTek5lovbLMmZ85EcufMlHOsC9OIWKjqPQy3WbONgErBbtOzH1DLa6F6Qpynfr2ZY+A1c+Z39f1muaHxw/crPZGVkXZfE8lq0Q==',key_name='tempest-TestSecurityGroupsBasicOps-1950223607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-3zc97jdy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:52:57Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=3ba3c659-9910-4a7d-9f33-2caba4ec591e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.722 186548 DEBUG nova.network.os_vif_util [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.722 186548 DEBUG nova.network.os_vif_util [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:c5:07,bridge_name='br-int',has_traffic_filtering=True,id=99a0470d-e680-42ab-8405-7803d6c30f96,network=Network(7869fdf1-f7f0-49a6-9871-2150df66ed42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a0470d-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.723 186548 DEBUG nova.objects.instance [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3ba3c659-9910-4a7d-9f33-2caba4ec591e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.741 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <uuid>3ba3c659-9910-4a7d-9f33-2caba4ec591e</uuid>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <name>instance-000000b7</name>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-534108987</nova:name>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:53:05</nova:creationTime>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:53:05 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:53:05 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:53:05 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:53:05 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:53:05 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:53:05 compute-0 nova_compute[186544]:         <nova:user uuid="7bb85b33f2b44468ab5d86bf5ba98421">tempest-TestSecurityGroupsBasicOps-588574044-project-member</nova:user>
Nov 22 08:53:05 compute-0 nova_compute[186544]:         <nova:project uuid="b5da13b07bb34fc3b4cd1452f7dd6971">tempest-TestSecurityGroupsBasicOps-588574044</nova:project>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:53:05 compute-0 nova_compute[186544]:         <nova:port uuid="99a0470d-e680-42ab-8405-7803d6c30f96">
Nov 22 08:53:05 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <system>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <entry name="serial">3ba3c659-9910-4a7d-9f33-2caba4ec591e</entry>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <entry name="uuid">3ba3c659-9910-4a7d-9f33-2caba4ec591e</entry>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </system>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <os>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   </os>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <features>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   </features>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk.config"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:1f:c5:07"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <target dev="tap99a0470d-e6"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/console.log" append="off"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <video>
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </video>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:53:05 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:53:05 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:53:05 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:53:05 compute-0 nova_compute[186544]: </domain>
Nov 22 08:53:05 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.742 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Preparing to wait for external event network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.743 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.743 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.743 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.744 186548 DEBUG nova.virt.libvirt.vif [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:52:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-534108987',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-534108987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=183,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLiGQmcqGXcsiG3cxIcUvS1tAu25Vah0oTek5lovbLMmZ85EcufMlHOsC9OIWKjqPQy3WbONgErBbtOzH1DLa6F6Qpynfr2ZY+A1c+Z39f1muaHxw/crPZGVkXZfE8lq0Q==',key_name='tempest-TestSecurityGroupsBasicOps-1950223607',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-3zc97jdy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:52:57Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=3ba3c659-9910-4a7d-9f33-2caba4ec591e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.744 186548 DEBUG nova.network.os_vif_util [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.744 186548 DEBUG nova.network.os_vif_util [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:c5:07,bridge_name='br-int',has_traffic_filtering=True,id=99a0470d-e680-42ab-8405-7803d6c30f96,network=Network(7869fdf1-f7f0-49a6-9871-2150df66ed42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a0470d-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.745 186548 DEBUG os_vif [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:c5:07,bridge_name='br-int',has_traffic_filtering=True,id=99a0470d-e680-42ab-8405-7803d6c30f96,network=Network(7869fdf1-f7f0-49a6-9871-2150df66ed42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a0470d-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.745 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.745 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.746 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.748 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.749 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99a0470d-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.749 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap99a0470d-e6, col_values=(('external_ids', {'iface-id': '99a0470d-e680-42ab-8405-7803d6c30f96', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:c5:07', 'vm-uuid': '3ba3c659-9910-4a7d-9f33-2caba4ec591e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.751 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:05 compute-0 NetworkManager[55036]: <info>  [1763801585.7519] manager: (tap99a0470d-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/425)
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.754 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.759 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.760 186548 INFO os_vif [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:c5:07,bridge_name='br-int',has_traffic_filtering=True,id=99a0470d-e680-42ab-8405-7803d6c30f96,network=Network(7869fdf1-f7f0-49a6-9871-2150df66ed42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a0470d-e6')
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.976 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.976 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.976 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No VIF found with MAC fa:16:3e:1f:c5:07, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:53:05 compute-0 nova_compute[186544]: 2025-11-22 08:53:05.977 186548 INFO nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Using config drive
Nov 22 08:53:06 compute-0 nova_compute[186544]: 2025-11-22 08:53:06.888 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:06 compute-0 nova_compute[186544]: 2025-11-22 08:53:06.924 186548 INFO nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Creating config drive at /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk.config
Nov 22 08:53:06 compute-0 nova_compute[186544]: 2025-11-22 08:53:06.929 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp381vu0_f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.053 186548 DEBUG oslo_concurrency.processutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp381vu0_f" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:53:07 compute-0 kernel: tap99a0470d-e6: entered promiscuous mode
Nov 22 08:53:07 compute-0 NetworkManager[55036]: <info>  [1763801587.1170] manager: (tap99a0470d-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/426)
Nov 22 08:53:07 compute-0 ovn_controller[94843]: 2025-11-22T08:53:07Z|00907|binding|INFO|Claiming lport 99a0470d-e680-42ab-8405-7803d6c30f96 for this chassis.
Nov 22 08:53:07 compute-0 ovn_controller[94843]: 2025-11-22T08:53:07Z|00908|binding|INFO|99a0470d-e680-42ab-8405-7803d6c30f96: Claiming fa:16:3e:1f:c5:07 10.100.0.3
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.116 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.120 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.137 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:c5:07 10.100.0.3'], port_security=['fa:16:3e:1f:c5:07 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3ba3c659-9910-4a7d-9f33-2caba4ec591e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7869fdf1-f7f0-49a6-9871-2150df66ed42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd48db9a4-06b7-4292-b472-284fe9411414 e32d9632-0862-4299-b57c-2a352cee3249', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b14ca5ed-8591-43f2-b230-b8a5c78453dc, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=99a0470d-e680-42ab-8405-7803d6c30f96) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.139 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 99a0470d-e680-42ab-8405-7803d6c30f96 in datapath 7869fdf1-f7f0-49a6-9871-2150df66ed42 bound to our chassis
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.139 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7869fdf1-f7f0-49a6-9871-2150df66ed42
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.151 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ebbe7e80-94d1-4a78-93d0-e6b8e78554c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.152 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7869fdf1-f1 in ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:53:07 compute-0 systemd-udevd[255830]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.154 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7869fdf1-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.154 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[64c0ca76-4c1e-49cc-853c-3f51b7136109]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.155 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[18213faa-9519-4a40-9a84-88f3edae47dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 systemd-machined[152872]: New machine qemu-99-instance-000000b7.
Nov 22 08:53:07 compute-0 NetworkManager[55036]: <info>  [1763801587.1666] device (tap99a0470d-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.166 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3f775d-120c-4a6a-8b0e-2721733e971f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 NetworkManager[55036]: <info>  [1763801587.1686] device (tap99a0470d-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.172 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 systemd[1]: Started Virtual Machine qemu-99-instance-000000b7.
Nov 22 08:53:07 compute-0 ovn_controller[94843]: 2025-11-22T08:53:07Z|00909|binding|INFO|Setting lport 99a0470d-e680-42ab-8405-7803d6c30f96 ovn-installed in OVS
Nov 22 08:53:07 compute-0 ovn_controller[94843]: 2025-11-22T08:53:07Z|00910|binding|INFO|Setting lport 99a0470d-e680-42ab-8405-7803d6c30f96 up in Southbound
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.179 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[97811daa-ce32-46ec-ac04-4fd77a0a42a1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.180 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.204 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[a62da3c0-53ef-40f3-94a4-098d1eb61f22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.212 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9eeda6d2-f083-4fc7-b8b1-279fdf59a0b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 NetworkManager[55036]: <info>  [1763801587.2127] manager: (tap7869fdf1-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/427)
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.240 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[769e2ae1-9161-4822-bda2-6b7020815d1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.243 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[066ab9d1-439b-4e8b-a28a-02ab6fe80857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 NetworkManager[55036]: <info>  [1763801587.2664] device (tap7869fdf1-f0): carrier: link connected
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.271 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0de35c-fbc0-4515-aa96-39c673952f3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.288 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2f4be5-8b0e-4ec7-8654-2f96acf49d9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7869fdf1-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:d6:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 267], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832790, 'reachable_time': 35974, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255863, 'error': None, 'target': 'ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.303 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[79cf0956-248e-4056-b2ea-253e2b56b6c7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee5:d66b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 832790, 'tstamp': 832790}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255864, 'error': None, 'target': 'ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.319 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[8b07fff1-c6c1-4470-b5cb-d1a084b65a09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7869fdf1-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:d6:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 267], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832790, 'reachable_time': 35974, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255865, 'error': None, 'target': 'ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.347 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f4101826-e7d0-4fd3-b08e-10df808af78a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.404 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[aaaeda7f-e6de-4ced-9e05-552d02083108]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.405 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7869fdf1-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.406 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.406 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7869fdf1-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:07 compute-0 NetworkManager[55036]: <info>  [1763801587.4088] manager: (tap7869fdf1-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/428)
Nov 22 08:53:07 compute-0 kernel: tap7869fdf1-f0: entered promiscuous mode
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.408 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.411 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.411 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7869fdf1-f0, col_values=(('external_ids', {'iface-id': 'd3399a48-53c9-4e67-ba6e-df4c20b617cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.413 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 ovn_controller[94843]: 2025-11-22T08:53:07Z|00911|binding|INFO|Releasing lport d3399a48-53c9-4e67-ba6e-df4c20b617cc from this chassis (sb_readonly=0)
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.413 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.414 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7869fdf1-f7f0-49a6-9871-2150df66ed42.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7869fdf1-f7f0-49a6-9871-2150df66ed42.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.423 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[81ac5a7d-090a-4b81-8074-40bfb94836ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.424 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-7869fdf1-f7f0-49a6-9871-2150df66ed42
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/7869fdf1-f7f0-49a6-9871-2150df66ed42.pid.haproxy
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 7869fdf1-f7f0-49a6-9871-2150df66ed42
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:53:07 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:07.424 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42', 'env', 'PROCESS_TAG=haproxy-7869fdf1-f7f0-49a6-9871-2150df66ed42', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7869fdf1-f7f0-49a6-9871-2150df66ed42.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.424 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.656 186548 DEBUG nova.network.neutron [req-1652775b-3075-4a85-b0da-ff5eb8f9c70b req-e375d3fa-2e92-4961-8fe2-4a933f0f6fd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updated VIF entry in instance network info cache for port 99a0470d-e680-42ab-8405-7803d6c30f96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.656 186548 DEBUG nova.network.neutron [req-1652775b-3075-4a85-b0da-ff5eb8f9c70b req-e375d3fa-2e92-4961-8fe2-4a933f0f6fd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updating instance_info_cache with network_info: [{"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.681 186548 DEBUG oslo_concurrency.lockutils [req-1652775b-3075-4a85-b0da-ff5eb8f9c70b req-e375d3fa-2e92-4961-8fe2-4a933f0f6fd9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:53:07 compute-0 podman[255897]: 2025-11-22 08:53:07.840673423 +0000 UTC m=+0.110959040 container create 9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 08:53:07 compute-0 podman[255897]: 2025-11-22 08:53:07.757234212 +0000 UTC m=+0.027519829 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.879 186548 DEBUG nova.compute.manager [req-9b21375e-cee6-4ad5-b50b-95a33260123b req-ab6d786c-7b51-42ef-9eab-6fe51a33cc45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received event network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.880 186548 DEBUG oslo_concurrency.lockutils [req-9b21375e-cee6-4ad5-b50b-95a33260123b req-ab6d786c-7b51-42ef-9eab-6fe51a33cc45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.880 186548 DEBUG oslo_concurrency.lockutils [req-9b21375e-cee6-4ad5-b50b-95a33260123b req-ab6d786c-7b51-42ef-9eab-6fe51a33cc45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.880 186548 DEBUG oslo_concurrency.lockutils [req-9b21375e-cee6-4ad5-b50b-95a33260123b req-ab6d786c-7b51-42ef-9eab-6fe51a33cc45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:07 compute-0 nova_compute[186544]: 2025-11-22 08:53:07.880 186548 DEBUG nova.compute.manager [req-9b21375e-cee6-4ad5-b50b-95a33260123b req-ab6d786c-7b51-42ef-9eab-6fe51a33cc45 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Processing event network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:53:07 compute-0 systemd[1]: Started libpod-conmon-9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac.scope.
Nov 22 08:53:07 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:53:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69555d8d0ebdd3bb05a00e5c949047b658283bbf5c8490a300869068fa058d43/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:53:07 compute-0 podman[255897]: 2025-11-22 08:53:07.932230305 +0000 UTC m=+0.202515922 container init 9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 08:53:07 compute-0 podman[255897]: 2025-11-22 08:53:07.939851693 +0000 UTC m=+0.210137310 container start 9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:53:07 compute-0 neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42[255911]: [NOTICE]   (255916) : New worker (255918) forked
Nov 22 08:53:07 compute-0 neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42[255911]: [NOTICE]   (255916) : Loading success.
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.592 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.593 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801588.592055, 3ba3c659-9910-4a7d-9f33-2caba4ec591e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.593 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] VM Started (Lifecycle Event)
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.597 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.600 186548 INFO nova.virt.libvirt.driver [-] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Instance spawned successfully.
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.600 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.617 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.622 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.625 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.625 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.626 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.626 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.626 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.627 186548 DEBUG nova.virt.libvirt.driver [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.658 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.660 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801588.5946937, 3ba3c659-9910-4a7d-9f33-2caba4ec591e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.660 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] VM Paused (Lifecycle Event)
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.683 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.687 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801588.5964868, 3ba3c659-9910-4a7d-9f33-2caba4ec591e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.687 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] VM Resumed (Lifecycle Event)
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.717 186548 INFO nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Took 11.49 seconds to spawn the instance on the hypervisor.
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.717 186548 DEBUG nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.718 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.727 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.766 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.804 186548 INFO nova.compute.manager [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Took 11.90 seconds to build instance.
Nov 22 08:53:08 compute-0 nova_compute[186544]: 2025-11-22 08:53:08.825 186548 DEBUG oslo_concurrency.lockutils [None req-289c3919-c913-4f4f-aacd-c6432aa58b09 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.993s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:09 compute-0 podman[255934]: 2025-11-22 08:53:09.429526767 +0000 UTC m=+0.064155329 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:53:09 compute-0 podman[255935]: 2025-11-22 08:53:09.438364434 +0000 UTC m=+0.070014503 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 08:53:09 compute-0 podman[255936]: 2025-11-22 08:53:09.445000328 +0000 UTC m=+0.069536632 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 22 08:53:09 compute-0 nova_compute[186544]: 2025-11-22 08:53:09.995 186548 DEBUG nova.compute.manager [req-c26210f3-8f39-46a6-8c67-47b48a22df4c req-32c4c6b4-56b0-4586-af0d-a7c2b6a28e6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received event network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:53:09 compute-0 nova_compute[186544]: 2025-11-22 08:53:09.995 186548 DEBUG oslo_concurrency.lockutils [req-c26210f3-8f39-46a6-8c67-47b48a22df4c req-32c4c6b4-56b0-4586-af0d-a7c2b6a28e6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:09 compute-0 nova_compute[186544]: 2025-11-22 08:53:09.995 186548 DEBUG oslo_concurrency.lockutils [req-c26210f3-8f39-46a6-8c67-47b48a22df4c req-32c4c6b4-56b0-4586-af0d-a7c2b6a28e6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:09 compute-0 nova_compute[186544]: 2025-11-22 08:53:09.995 186548 DEBUG oslo_concurrency.lockutils [req-c26210f3-8f39-46a6-8c67-47b48a22df4c req-32c4c6b4-56b0-4586-af0d-a7c2b6a28e6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:09 compute-0 nova_compute[186544]: 2025-11-22 08:53:09.996 186548 DEBUG nova.compute.manager [req-c26210f3-8f39-46a6-8c67-47b48a22df4c req-32c4c6b4-56b0-4586-af0d-a7c2b6a28e6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] No waiting events found dispatching network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:53:09 compute-0 nova_compute[186544]: 2025-11-22 08:53:09.996 186548 WARNING nova.compute.manager [req-c26210f3-8f39-46a6-8c67-47b48a22df4c req-32c4c6b4-56b0-4586-af0d-a7c2b6a28e6c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received unexpected event network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 for instance with vm_state active and task_state None.
Nov 22 08:53:10 compute-0 nova_compute[186544]: 2025-11-22 08:53:10.753 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:11 compute-0 nova_compute[186544]: 2025-11-22 08:53:11.890 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:12 compute-0 nova_compute[186544]: 2025-11-22 08:53:12.867 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:12.867 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:53:12 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:12.868 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:53:13 compute-0 ovn_controller[94843]: 2025-11-22T08:53:13Z|00912|binding|INFO|Releasing lport d3399a48-53c9-4e67-ba6e-df4c20b617cc from this chassis (sb_readonly=0)
Nov 22 08:53:13 compute-0 NetworkManager[55036]: <info>  [1763801593.7625] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/429)
Nov 22 08:53:13 compute-0 NetworkManager[55036]: <info>  [1763801593.7633] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/430)
Nov 22 08:53:13 compute-0 nova_compute[186544]: 2025-11-22 08:53:13.779 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:13 compute-0 ovn_controller[94843]: 2025-11-22T08:53:13Z|00913|binding|INFO|Releasing lport d3399a48-53c9-4e67-ba6e-df4c20b617cc from this chassis (sb_readonly=0)
Nov 22 08:53:13 compute-0 nova_compute[186544]: 2025-11-22 08:53:13.793 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:13 compute-0 nova_compute[186544]: 2025-11-22 08:53:13.798 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:14 compute-0 nova_compute[186544]: 2025-11-22 08:53:14.416 186548 DEBUG nova.compute.manager [req-3d548a12-6465-4f42-ab8a-a822e53c4238 req-1ed8ae1e-7b36-4fba-9ed4-7de047d7e951 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received event network-changed-99a0470d-e680-42ab-8405-7803d6c30f96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:53:14 compute-0 nova_compute[186544]: 2025-11-22 08:53:14.416 186548 DEBUG nova.compute.manager [req-3d548a12-6465-4f42-ab8a-a822e53c4238 req-1ed8ae1e-7b36-4fba-9ed4-7de047d7e951 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Refreshing instance network info cache due to event network-changed-99a0470d-e680-42ab-8405-7803d6c30f96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:53:14 compute-0 nova_compute[186544]: 2025-11-22 08:53:14.416 186548 DEBUG oslo_concurrency.lockutils [req-3d548a12-6465-4f42-ab8a-a822e53c4238 req-1ed8ae1e-7b36-4fba-9ed4-7de047d7e951 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:53:14 compute-0 nova_compute[186544]: 2025-11-22 08:53:14.416 186548 DEBUG oslo_concurrency.lockutils [req-3d548a12-6465-4f42-ab8a-a822e53c4238 req-1ed8ae1e-7b36-4fba-9ed4-7de047d7e951 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:53:14 compute-0 nova_compute[186544]: 2025-11-22 08:53:14.417 186548 DEBUG nova.network.neutron [req-3d548a12-6465-4f42-ab8a-a822e53c4238 req-1ed8ae1e-7b36-4fba-9ed4-7de047d7e951 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Refreshing network info cache for port 99a0470d-e680-42ab-8405-7803d6c30f96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:53:15 compute-0 nova_compute[186544]: 2025-11-22 08:53:15.758 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:16 compute-0 nova_compute[186544]: 2025-11-22 08:53:16.012 186548 DEBUG nova.network.neutron [req-3d548a12-6465-4f42-ab8a-a822e53c4238 req-1ed8ae1e-7b36-4fba-9ed4-7de047d7e951 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updated VIF entry in instance network info cache for port 99a0470d-e680-42ab-8405-7803d6c30f96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:53:16 compute-0 nova_compute[186544]: 2025-11-22 08:53:16.013 186548 DEBUG nova.network.neutron [req-3d548a12-6465-4f42-ab8a-a822e53c4238 req-1ed8ae1e-7b36-4fba-9ed4-7de047d7e951 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updating instance_info_cache with network_info: [{"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:53:16 compute-0 nova_compute[186544]: 2025-11-22 08:53:16.042 186548 DEBUG oslo_concurrency.lockutils [req-3d548a12-6465-4f42-ab8a-a822e53c4238 req-1ed8ae1e-7b36-4fba-9ed4-7de047d7e951 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:53:16 compute-0 nova_compute[186544]: 2025-11-22 08:53:16.892 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:19.870 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:20 compute-0 nova_compute[186544]: 2025-11-22 08:53:20.760 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:21 compute-0 nova_compute[186544]: 2025-11-22 08:53:21.894 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:25 compute-0 ovn_controller[94843]: 2025-11-22T08:53:25Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:c5:07 10.100.0.3
Nov 22 08:53:25 compute-0 ovn_controller[94843]: 2025-11-22T08:53:25Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:c5:07 10.100.0.3
Nov 22 08:53:25 compute-0 nova_compute[186544]: 2025-11-22 08:53:25.763 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:26 compute-0 nova_compute[186544]: 2025-11-22 08:53:26.897 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:28 compute-0 podman[256021]: 2025-11-22 08:53:28.420319439 +0000 UTC m=+0.057861564 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 08:53:28 compute-0 podman[256020]: 2025-11-22 08:53:28.420508754 +0000 UTC m=+0.068429494 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 08:53:28 compute-0 podman[256022]: 2025-11-22 08:53:28.447697723 +0000 UTC m=+0.083668130 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:53:28 compute-0 podman[256027]: 2025-11-22 08:53:28.454863819 +0000 UTC m=+0.087877373 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 22 08:53:30 compute-0 nova_compute[186544]: 2025-11-22 08:53:30.767 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:31 compute-0 nova_compute[186544]: 2025-11-22 08:53:31.899 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:33 compute-0 nova_compute[186544]: 2025-11-22 08:53:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:33 compute-0 nova_compute[186544]: 2025-11-22 08:53:33.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:53:33 compute-0 nova_compute[186544]: 2025-11-22 08:53:33.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:53:33 compute-0 nova_compute[186544]: 2025-11-22 08:53:33.642 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:53:33 compute-0 nova_compute[186544]: 2025-11-22 08:53:33.643 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:53:33 compute-0 nova_compute[186544]: 2025-11-22 08:53:33.643 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:53:33 compute-0 nova_compute[186544]: 2025-11-22 08:53:33.643 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3ba3c659-9910-4a7d-9f33-2caba4ec591e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.250 186548 DEBUG nova.compute.manager [req-639cc235-5e9f-4afe-8e9d-f5d60336048f req-ae7acb97-96ff-47c7-928d-1a7abffda380 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received event network-changed-99a0470d-e680-42ab-8405-7803d6c30f96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.250 186548 DEBUG nova.compute.manager [req-639cc235-5e9f-4afe-8e9d-f5d60336048f req-ae7acb97-96ff-47c7-928d-1a7abffda380 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Refreshing instance network info cache due to event network-changed-99a0470d-e680-42ab-8405-7803d6c30f96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.250 186548 DEBUG oslo_concurrency.lockutils [req-639cc235-5e9f-4afe-8e9d-f5d60336048f req-ae7acb97-96ff-47c7-928d-1a7abffda380 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.329 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.330 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.330 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.330 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.330 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.336 186548 INFO nova.compute.manager [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Terminating instance
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.341 186548 DEBUG nova.compute.manager [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:53:34 compute-0 kernel: tap99a0470d-e6 (unregistering): left promiscuous mode
Nov 22 08:53:34 compute-0 NetworkManager[55036]: <info>  [1763801614.3785] device (tap99a0470d-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.385 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:34 compute-0 ovn_controller[94843]: 2025-11-22T08:53:34Z|00914|binding|INFO|Releasing lport 99a0470d-e680-42ab-8405-7803d6c30f96 from this chassis (sb_readonly=0)
Nov 22 08:53:34 compute-0 ovn_controller[94843]: 2025-11-22T08:53:34Z|00915|binding|INFO|Setting lport 99a0470d-e680-42ab-8405-7803d6c30f96 down in Southbound
Nov 22 08:53:34 compute-0 ovn_controller[94843]: 2025-11-22T08:53:34Z|00916|binding|INFO|Removing iface tap99a0470d-e6 ovn-installed in OVS
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.387 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:34.395 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:c5:07 10.100.0.3'], port_security=['fa:16:3e:1f:c5:07 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3ba3c659-9910-4a7d-9f33-2caba4ec591e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7869fdf1-f7f0-49a6-9871-2150df66ed42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd48db9a4-06b7-4292-b472-284fe9411414 e32d9632-0862-4299-b57c-2a352cee3249', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b14ca5ed-8591-43f2-b230-b8a5c78453dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=99a0470d-e680-42ab-8405-7803d6c30f96) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:53:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:34.396 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 99a0470d-e680-42ab-8405-7803d6c30f96 in datapath 7869fdf1-f7f0-49a6-9871-2150df66ed42 unbound from our chassis
Nov 22 08:53:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:34.398 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7869fdf1-f7f0-49a6-9871-2150df66ed42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:53:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:34.400 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f97f1670-ca11-49c6-afbd-fab661ba8ac4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:34.400 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42 namespace which is not needed anymore
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:34 compute-0 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Nov 22 08:53:34 compute-0 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000b7.scope: Consumed 16.235s CPU time.
Nov 22 08:53:34 compute-0 systemd-machined[152872]: Machine qemu-99-instance-000000b7 terminated.
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.639 186548 INFO nova.virt.libvirt.driver [-] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Instance destroyed successfully.
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.640 186548 DEBUG nova.objects.instance [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'resources' on Instance uuid 3ba3c659-9910-4a7d-9f33-2caba4ec591e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.670 186548 DEBUG nova.virt.libvirt.vif [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:52:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-534108987',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-534108987',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=183,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLiGQmcqGXcsiG3cxIcUvS1tAu25Vah0oTek5lovbLMmZ85EcufMlHOsC9OIWKjqPQy3WbONgErBbtOzH1DLa6F6Qpynfr2ZY+A1c+Z39f1muaHxw/crPZGVkXZfE8lq0Q==',key_name='tempest-TestSecurityGroupsBasicOps-1950223607',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:53:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-3zc97jdy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:53:08Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=3ba3c659-9910-4a7d-9f33-2caba4ec591e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.671 186548 DEBUG nova.network.os_vif_util [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.671 186548 DEBUG nova.network.os_vif_util [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:c5:07,bridge_name='br-int',has_traffic_filtering=True,id=99a0470d-e680-42ab-8405-7803d6c30f96,network=Network(7869fdf1-f7f0-49a6-9871-2150df66ed42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a0470d-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.672 186548 DEBUG os_vif [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:c5:07,bridge_name='br-int',has_traffic_filtering=True,id=99a0470d-e680-42ab-8405-7803d6c30f96,network=Network(7869fdf1-f7f0-49a6-9871-2150df66ed42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a0470d-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.674 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.675 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99a0470d-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.676 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.681 186548 INFO os_vif [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:c5:07,bridge_name='br-int',has_traffic_filtering=True,id=99a0470d-e680-42ab-8405-7803d6c30f96,network=Network(7869fdf1-f7f0-49a6-9871-2150df66ed42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a0470d-e6')
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.682 186548 INFO nova.virt.libvirt.driver [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Deleting instance files /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e_del
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.683 186548 INFO nova.virt.libvirt.driver [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Deletion of /var/lib/nova/instances/3ba3c659-9910-4a7d-9f33-2caba4ec591e_del complete
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.779 186548 INFO nova.compute.manager [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Took 0.44 seconds to destroy the instance on the hypervisor.
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.780 186548 DEBUG oslo.service.loopingcall [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.780 186548 DEBUG nova.compute.manager [-] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:53:34 compute-0 nova_compute[186544]: 2025-11-22 08:53:34.780 186548 DEBUG nova.network.neutron [-] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:53:34 compute-0 neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42[255911]: [NOTICE]   (255916) : haproxy version is 2.8.14-c23fe91
Nov 22 08:53:34 compute-0 neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42[255911]: [NOTICE]   (255916) : path to executable is /usr/sbin/haproxy
Nov 22 08:53:34 compute-0 neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42[255911]: [WARNING]  (255916) : Exiting Master process...
Nov 22 08:53:34 compute-0 neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42[255911]: [ALERT]    (255916) : Current worker (255918) exited with code 143 (Terminated)
Nov 22 08:53:34 compute-0 neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42[255911]: [WARNING]  (255916) : All workers exited. Exiting... (0)
Nov 22 08:53:34 compute-0 systemd[1]: libpod-9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac.scope: Deactivated successfully.
Nov 22 08:53:34 compute-0 podman[256129]: 2025-11-22 08:53:34.802759518 +0000 UTC m=+0.311578656 container died 9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:53:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac-userdata-shm.mount: Deactivated successfully.
Nov 22 08:53:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-69555d8d0ebdd3bb05a00e5c949047b658283bbf5c8490a300869068fa058d43-merged.mount: Deactivated successfully.
Nov 22 08:53:35 compute-0 podman[256129]: 2025-11-22 08:53:35.579824622 +0000 UTC m=+1.088643730 container cleanup 9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:53:35 compute-0 systemd[1]: libpod-conmon-9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac.scope: Deactivated successfully.
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.884 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updating instance_info_cache with network_info: [{"id": "99a0470d-e680-42ab-8405-7803d6c30f96", "address": "fa:16:3e:1f:c5:07", "network": {"id": "7869fdf1-f7f0-49a6-9871-2150df66ed42", "bridge": "br-int", "label": "tempest-network-smoke--1803885914", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a0470d-e6", "ovs_interfaceid": "99a0470d-e680-42ab-8405-7803d6c30f96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:53:35 compute-0 podman[256177]: 2025-11-22 08:53:35.886107046 +0000 UTC m=+0.285435132 container remove 9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.893 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e3370aa5-6af6-4453-8af3-3d98c83cfe82]: (4, ('Sat Nov 22 08:53:34 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42 (9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac)\n9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac\nSat Nov 22 08:53:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42 (9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac)\n9daadd8795678b6293129413fa5cf64c4e297b0f6cbf39ca49a06a29f65018ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.895 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e738ac26-0f4f-44a3-9023-d1511815ecf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.895 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7869fdf1-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.897 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:35 compute-0 kernel: tap7869fdf1-f0: left promiscuous mode
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.910 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.912 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.912 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.913 186548 DEBUG oslo_concurrency.lockutils [req-639cc235-5e9f-4afe-8e9d-f5d60336048f req-ae7acb97-96ff-47c7-928d-1a7abffda380 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.914 186548 DEBUG nova.network.neutron [req-639cc235-5e9f-4afe-8e9d-f5d60336048f req-ae7acb97-96ff-47c7-928d-1a7abffda380 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Refreshing network info cache for port 99a0470d-e680-42ab-8405-7803d6c30f96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.914 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3123543f-cfc6-4ea8-94ad-9464c409f624]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.930 186548 DEBUG nova.network.neutron [-] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.933 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ad8fe342-b877-462a-8599-03d98016f377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.935 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb229b5-dbbb-447b-8fa4-551fdf6ff9d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:35 compute-0 nova_compute[186544]: 2025-11-22 08:53:35.948 186548 INFO nova.compute.manager [-] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Took 1.17 seconds to deallocate network for instance.
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.951 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ea6a3ace-1810-4430-8cc3-1909489ec615]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832783, 'reachable_time': 22515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256193, 'error': None, 'target': 'ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.954 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7869fdf1-f7f0-49a6-9871-2150df66ed42 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:53:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:35.954 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[21adccce-8fc0-4ea8-b1ee-b29a7d7f15bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:53:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d7869fdf1\x2df7f0\x2d49a6\x2d9871\x2d2150df66ed42.mount: Deactivated successfully.
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.008 186548 DEBUG nova.compute.manager [req-494c3ea5-6098-4e16-afa0-6200d959e5d1 req-0016fb17-90b5-4988-9b0b-9d667fd926c5 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received event network-vif-deleted-99a0470d-e680-42ab-8405-7803d6c30f96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.023 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.024 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.099 186548 INFO nova.network.neutron [req-639cc235-5e9f-4afe-8e9d-f5d60336048f req-ae7acb97-96ff-47c7-928d-1a7abffda380 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Port 99a0470d-e680-42ab-8405-7803d6c30f96 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.099 186548 DEBUG nova.network.neutron [req-639cc235-5e9f-4afe-8e9d-f5d60336048f req-ae7acb97-96ff-47c7-928d-1a7abffda380 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.109 186548 DEBUG nova.compute.provider_tree [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.131 186548 DEBUG oslo_concurrency.lockutils [req-639cc235-5e9f-4afe-8e9d-f5d60336048f req-ae7acb97-96ff-47c7-928d-1a7abffda380 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3ba3c659-9910-4a7d-9f33-2caba4ec591e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.134 186548 DEBUG nova.scheduler.client.report [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.164 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.209 186548 INFO nova.scheduler.client.report [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Deleted allocations for instance 3ba3c659-9910-4a7d-9f33-2caba4ec591e
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.285 186548 DEBUG oslo_concurrency.lockutils [None req-8fe90523-f547-48d1-a71c-10150ad29bb5 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.363 186548 DEBUG nova.compute.manager [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received event network-vif-unplugged-99a0470d-e680-42ab-8405-7803d6c30f96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.363 186548 DEBUG oslo_concurrency.lockutils [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.363 186548 DEBUG oslo_concurrency.lockutils [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.364 186548 DEBUG oslo_concurrency.lockutils [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.364 186548 DEBUG nova.compute.manager [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] No waiting events found dispatching network-vif-unplugged-99a0470d-e680-42ab-8405-7803d6c30f96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.364 186548 WARNING nova.compute.manager [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received unexpected event network-vif-unplugged-99a0470d-e680-42ab-8405-7803d6c30f96 for instance with vm_state deleted and task_state None.
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.365 186548 DEBUG nova.compute.manager [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received event network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.365 186548 DEBUG oslo_concurrency.lockutils [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.365 186548 DEBUG oslo_concurrency.lockutils [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.365 186548 DEBUG oslo_concurrency.lockutils [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3ba3c659-9910-4a7d-9f33-2caba4ec591e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.365 186548 DEBUG nova.compute.manager [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] No waiting events found dispatching network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.366 186548 WARNING nova.compute.manager [req-a3698549-a7c7-45b6-9cfa-9a7caa989017 req-61011a2b-2d9d-4101-9fc8-a133cccce728 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Received unexpected event network-vif-plugged-99a0470d-e680-42ab-8405-7803d6c30f96 for instance with vm_state deleted and task_state None.
Nov 22 08:53:36 compute-0 nova_compute[186544]: 2025-11-22 08:53:36.901 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:37.386 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:37.386 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:53:37.387 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.187 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.365 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.369 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.13179779052734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.370 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.370 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.424 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.424 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.447 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.457 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.977 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:53:38 compute-0 nova_compute[186544]: 2025-11-22 08:53:38.978 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:53:39 compute-0 nova_compute[186544]: 2025-11-22 08:53:39.678 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:40 compute-0 podman[256195]: 2025-11-22 08:53:40.404206273 +0000 UTC m=+0.054687326 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:53:40 compute-0 podman[256197]: 2025-11-22 08:53:40.410947089 +0000 UTC m=+0.054995114 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:53:40 compute-0 podman[256196]: 2025-11-22 08:53:40.411029831 +0000 UTC m=+0.058980892 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 08:53:41 compute-0 nova_compute[186544]: 2025-11-22 08:53:41.902 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:41 compute-0 nova_compute[186544]: 2025-11-22 08:53:41.978 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:44 compute-0 nova_compute[186544]: 2025-11-22 08:53:44.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:44 compute-0 nova_compute[186544]: 2025-11-22 08:53:44.346 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:44 compute-0 nova_compute[186544]: 2025-11-22 08:53:44.431 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:44 compute-0 nova_compute[186544]: 2025-11-22 08:53:44.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:45 compute-0 nova_compute[186544]: 2025-11-22 08:53:45.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:46 compute-0 nova_compute[186544]: 2025-11-22 08:53:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:46 compute-0 nova_compute[186544]: 2025-11-22 08:53:46.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:46 compute-0 nova_compute[186544]: 2025-11-22 08:53:46.903 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:49 compute-0 nova_compute[186544]: 2025-11-22 08:53:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:53:49 compute-0 nova_compute[186544]: 2025-11-22 08:53:49.623 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801614.622137, 3ba3c659-9910-4a7d-9f33-2caba4ec591e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:53:49 compute-0 nova_compute[186544]: 2025-11-22 08:53:49.623 186548 INFO nova.compute.manager [-] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] VM Stopped (Lifecycle Event)
Nov 22 08:53:49 compute-0 nova_compute[186544]: 2025-11-22 08:53:49.647 186548 DEBUG nova.compute.manager [None req-19de5d9d-69db-44c2-b099-4d644a75372d - - - - - -] [instance: 3ba3c659-9910-4a7d-9f33-2caba4ec591e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:53:49 compute-0 nova_compute[186544]: 2025-11-22 08:53:49.681 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:51 compute-0 nova_compute[186544]: 2025-11-22 08:53:51.904 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:54 compute-0 nova_compute[186544]: 2025-11-22 08:53:54.683 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:56 compute-0 nova_compute[186544]: 2025-11-22 08:53:56.904 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:53:59 compute-0 podman[256258]: 2025-11-22 08:53:59.410384626 +0000 UTC m=+0.050381271 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:53:59 compute-0 podman[256259]: 2025-11-22 08:53:59.438671621 +0000 UTC m=+0.074484483 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:53:59 compute-0 podman[256257]: 2025-11-22 08:53:59.445443298 +0000 UTC m=+0.078326698 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:53:59 compute-0 podman[256260]: 2025-11-22 08:53:59.473935599 +0000 UTC m=+0.108248454 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 08:53:59 compute-0 nova_compute[186544]: 2025-11-22 08:53:59.685 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:01 compute-0 nova_compute[186544]: 2025-11-22 08:54:01.905 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:04 compute-0 nova_compute[186544]: 2025-11-22 08:54:04.687 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:06 compute-0 nova_compute[186544]: 2025-11-22 08:54:06.907 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:09 compute-0 nova_compute[186544]: 2025-11-22 08:54:09.690 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:11 compute-0 podman[256344]: 2025-11-22 08:54:11.401333204 +0000 UTC m=+0.046172636 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:54:11 compute-0 podman[256345]: 2025-11-22 08:54:11.409351912 +0000 UTC m=+0.052530224 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 22 08:54:11 compute-0 podman[256346]: 2025-11-22 08:54:11.417075321 +0000 UTC m=+0.056528641 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 08:54:11 compute-0 nova_compute[186544]: 2025-11-22 08:54:11.909 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:14 compute-0 nova_compute[186544]: 2025-11-22 08:54:14.691 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:16 compute-0 nova_compute[186544]: 2025-11-22 08:54:16.910 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:19 compute-0 nova_compute[186544]: 2025-11-22 08:54:19.693 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:21 compute-0 nova_compute[186544]: 2025-11-22 08:54:21.913 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:24 compute-0 nova_compute[186544]: 2025-11-22 08:54:24.696 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:54:25.424 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:54:25 compute-0 nova_compute[186544]: 2025-11-22 08:54:25.425 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:54:25.426 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:54:26 compute-0 nova_compute[186544]: 2025-11-22 08:54:26.914 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:27 compute-0 ovn_controller[94843]: 2025-11-22T08:54:27Z|00917|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 22 08:54:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:54:28.428 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:54:29 compute-0 nova_compute[186544]: 2025-11-22 08:54:29.697 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:30 compute-0 podman[256403]: 2025-11-22 08:54:30.416883145 +0000 UTC m=+0.063198295 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 08:54:30 compute-0 podman[256404]: 2025-11-22 08:54:30.437078103 +0000 UTC m=+0.081476666 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 08:54:30 compute-0 podman[256405]: 2025-11-22 08:54:30.448132374 +0000 UTC m=+0.087963744 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:54:30 compute-0 podman[256406]: 2025-11-22 08:54:30.484449467 +0000 UTC m=+0.117094821 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 08:54:31 compute-0 nova_compute[186544]: 2025-11-22 08:54:31.915 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:33 compute-0 nova_compute[186544]: 2025-11-22 08:54:33.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:33 compute-0 nova_compute[186544]: 2025-11-22 08:54:33.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:54:33 compute-0 nova_compute[186544]: 2025-11-22 08:54:33.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:54:33 compute-0 nova_compute[186544]: 2025-11-22 08:54:33.182 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:54:34 compute-0 nova_compute[186544]: 2025-11-22 08:54:34.699 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.607 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:54:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 08:54:36 compute-0 nova_compute[186544]: 2025-11-22 08:54:36.917 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:54:37.387 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:54:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:54:37.388 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:54:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:54:37.388 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:54:38 compute-0 nova_compute[186544]: 2025-11-22 08:54:38.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:38 compute-0 nova_compute[186544]: 2025-11-22 08:54:38.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.209 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.210 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.210 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.210 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.362 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.362 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5721MB free_disk=73.13179779052734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.363 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.363 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.434 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.434 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.455 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.469 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.471 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.471 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:54:39 compute-0 nova_compute[186544]: 2025-11-22 08:54:39.702 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:41 compute-0 nova_compute[186544]: 2025-11-22 08:54:41.920 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.163 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.163 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.164 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.164 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.164 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.164 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.185 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.185 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.185 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.185 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.185 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.185 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.185 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.186 186548 WARNING nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Unknown base file: /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.186 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Removable base files: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42 /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1 /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3 /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5 /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.186 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.186 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2882af3479446958b785a3f508ce087a26493f42
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.186 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/9794f07bebbd0adda446514edb1f5d23da5d1e0c
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.186 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8536cfb4a3f8d7a3046f2bf7b9787f58422494f1
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.186 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a8f717718e805875f8335c6cd87b43bdb39ec5d3
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.187 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/e4c7d8ef8985edf6bd00969b5ee2ac6a894891c5
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.187 186548 INFO nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/b311e5b8c8aaec45fed9ab32a27ca19947834438
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.187 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.187 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 22 08:54:42 compute-0 nova_compute[186544]: 2025-11-22 08:54:42.187 186548 DEBUG nova.virt.libvirt.imagecache [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 22 08:54:42 compute-0 podman[256487]: 2025-11-22 08:54:42.41416056 +0000 UTC m=+0.062437057 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:54:42 compute-0 podman[256488]: 2025-11-22 08:54:42.439655517 +0000 UTC m=+0.076432472 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Nov 22 08:54:42 compute-0 podman[256489]: 2025-11-22 08:54:42.457148457 +0000 UTC m=+0.096486164 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:54:44 compute-0 nova_compute[186544]: 2025-11-22 08:54:44.188 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:44 compute-0 nova_compute[186544]: 2025-11-22 08:54:44.705 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:45 compute-0 nova_compute[186544]: 2025-11-22 08:54:45.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:46 compute-0 nova_compute[186544]: 2025-11-22 08:54:46.922 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:47 compute-0 nova_compute[186544]: 2025-11-22 08:54:47.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:48 compute-0 nova_compute[186544]: 2025-11-22 08:54:48.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:49 compute-0 nova_compute[186544]: 2025-11-22 08:54:49.709 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:50 compute-0 nova_compute[186544]: 2025-11-22 08:54:50.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:54:51 compute-0 nova_compute[186544]: 2025-11-22 08:54:51.923 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:54 compute-0 nova_compute[186544]: 2025-11-22 08:54:54.712 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:56 compute-0 nova_compute[186544]: 2025-11-22 08:54:56.924 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:54:59 compute-0 nova_compute[186544]: 2025-11-22 08:54:59.714 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:01 compute-0 nova_compute[186544]: 2025-11-22 08:55:01.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:01 compute-0 podman[256549]: 2025-11-22 08:55:01.400593137 +0000 UTC m=+0.051827856 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:55:01 compute-0 podman[256551]: 2025-11-22 08:55:01.41211282 +0000 UTC m=+0.055370293 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:55:01 compute-0 podman[256550]: 2025-11-22 08:55:01.428126314 +0000 UTC m=+0.076054562 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 08:55:01 compute-0 podman[256552]: 2025-11-22 08:55:01.445717246 +0000 UTC m=+0.086549269 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller)
Nov 22 08:55:01 compute-0 nova_compute[186544]: 2025-11-22 08:55:01.926 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.275 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.275 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.294 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.377 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.378 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.385 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.385 186548 INFO nova.compute.claims [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.485 186548 DEBUG nova.compute.provider_tree [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.499 186548 DEBUG nova.scheduler.client.report [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.523 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.524 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.598 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.599 186548 DEBUG nova.network.neutron [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.619 186548 INFO nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.634 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.716 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.738 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.739 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.740 186548 INFO nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Creating image(s)
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.740 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "/var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.741 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.741 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.753 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.811 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.812 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.812 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.824 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.882 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:55:04 compute-0 nova_compute[186544]: 2025-11-22 08:55:04.883 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.056 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk 1073741824" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.057 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.058 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.108 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.109 186548 DEBUG nova.virt.disk.api [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Checking if we can resize image /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.109 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.161 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.162 186548 DEBUG nova.virt.disk.api [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Cannot resize image /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.162 186548 DEBUG nova.objects.instance [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'migration_context' on Instance uuid 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.178 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.179 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Ensure instance console log exists: /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.179 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.180 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:05 compute-0 nova_compute[186544]: 2025-11-22 08:55:05.180 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:06 compute-0 nova_compute[186544]: 2025-11-22 08:55:06.150 186548 DEBUG nova.policy [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:55:06 compute-0 nova_compute[186544]: 2025-11-22 08:55:06.929 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:08.802 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:55:08 compute-0 nova_compute[186544]: 2025-11-22 08:55:08.802 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:08 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:08.803 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:55:09 compute-0 nova_compute[186544]: 2025-11-22 08:55:09.717 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:09 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:09.806 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:10 compute-0 nova_compute[186544]: 2025-11-22 08:55:10.815 186548 DEBUG nova.network.neutron [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Successfully created port: 61d2893b-3f64-4239-b7aa-532d126977ba _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.478 186548 DEBUG nova.network.neutron [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Successfully updated port: 61d2893b-3f64-4239-b7aa-532d126977ba _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.495 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.495 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquired lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.495 186548 DEBUG nova.network.neutron [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.611 186548 DEBUG nova.compute.manager [req-fa703e5d-7173-42c2-a9df-00ecdfbab7cf req-b6941e9d-f7c6-4c52-aea9-f8d28c3d33a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-changed-61d2893b-3f64-4239-b7aa-532d126977ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.612 186548 DEBUG nova.compute.manager [req-fa703e5d-7173-42c2-a9df-00ecdfbab7cf req-b6941e9d-f7c6-4c52-aea9-f8d28c3d33a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Refreshing instance network info cache due to event network-changed-61d2893b-3f64-4239-b7aa-532d126977ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.612 186548 DEBUG oslo_concurrency.lockutils [req-fa703e5d-7173-42c2-a9df-00ecdfbab7cf req-b6941e9d-f7c6-4c52-aea9-f8d28c3d33a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.660 186548 DEBUG nova.network.neutron [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:55:11 compute-0 nova_compute[186544]: 2025-11-22 08:55:11.929 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.094 186548 DEBUG nova.network.neutron [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updating instance_info_cache with network_info: [{"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.119 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Releasing lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.119 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Instance network_info: |[{"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.119 186548 DEBUG oslo_concurrency.lockutils [req-fa703e5d-7173-42c2-a9df-00ecdfbab7cf req-b6941e9d-f7c6-4c52-aea9-f8d28c3d33a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.120 186548 DEBUG nova.network.neutron [req-fa703e5d-7173-42c2-a9df-00ecdfbab7cf req-b6941e9d-f7c6-4c52-aea9-f8d28c3d33a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Refreshing network info cache for port 61d2893b-3f64-4239-b7aa-532d126977ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.123 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Start _get_guest_xml network_info=[{"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.127 186548 WARNING nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.134 186548 DEBUG nova.virt.libvirt.host [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.135 186548 DEBUG nova.virt.libvirt.host [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.140 186548 DEBUG nova.virt.libvirt.host [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.141 186548 DEBUG nova.virt.libvirt.host [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.142 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.142 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.143 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.143 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.143 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.143 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.143 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.144 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.144 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.144 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.145 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.145 186548 DEBUG nova.virt.hardware [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.148 186548 DEBUG nova.virt.libvirt.vif [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:55:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-1689879905',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-1689879905',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=185,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAw4mkfFiRvcPYUE0PIsjDRQ2l9YGHYiiINwqgLaZjT4Jz+8V9nq9XzUIN6IBe3EfaIEfpC2/icXPG/z2BuoLG3JQ3o5sNQAUv0uO8d73RniLqWnU/BDSHGzBNmpYdI7sw==',key_name='tempest-TestSecurityGroupsBasicOps-2127958203',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-sg0rjxgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:55:04Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.149 186548 DEBUG nova.network.os_vif_util [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.150 186548 DEBUG nova.network.os_vif_util [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:ec:c3,bridge_name='br-int',has_traffic_filtering=True,id=61d2893b-3f64-4239-b7aa-532d126977ba,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d2893b-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.150 186548 DEBUG nova.objects.instance [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.168 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <uuid>3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6</uuid>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <name>instance-000000b9</name>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-1689879905</nova:name>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:55:13</nova:creationTime>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:55:13 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:55:13 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:55:13 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:55:13 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:55:13 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:55:13 compute-0 nova_compute[186544]:         <nova:user uuid="7bb85b33f2b44468ab5d86bf5ba98421">tempest-TestSecurityGroupsBasicOps-588574044-project-member</nova:user>
Nov 22 08:55:13 compute-0 nova_compute[186544]:         <nova:project uuid="b5da13b07bb34fc3b4cd1452f7dd6971">tempest-TestSecurityGroupsBasicOps-588574044</nova:project>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:55:13 compute-0 nova_compute[186544]:         <nova:port uuid="61d2893b-3f64-4239-b7aa-532d126977ba">
Nov 22 08:55:13 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <system>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <entry name="serial">3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6</entry>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <entry name="uuid">3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6</entry>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </system>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <os>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   </os>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <features>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   </features>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk.config"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:95:ec:c3"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <target dev="tap61d2893b-3f"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/console.log" append="off"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <video>
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </video>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:55:13 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:55:13 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:55:13 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:55:13 compute-0 nova_compute[186544]: </domain>
Nov 22 08:55:13 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.169 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Preparing to wait for external event network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.169 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.169 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.169 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.171 186548 DEBUG nova.virt.libvirt.vif [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:55:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-1689879905',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-1689879905',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=185,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAw4mkfFiRvcPYUE0PIsjDRQ2l9YGHYiiINwqgLaZjT4Jz+8V9nq9XzUIN6IBe3EfaIEfpC2/icXPG/z2BuoLG3JQ3o5sNQAUv0uO8d73RniLqWnU/BDSHGzBNmpYdI7sw==',key_name='tempest-TestSecurityGroupsBasicOps-2127958203',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-sg0rjxgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:55:04Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.171 186548 DEBUG nova.network.os_vif_util [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.171 186548 DEBUG nova.network.os_vif_util [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:ec:c3,bridge_name='br-int',has_traffic_filtering=True,id=61d2893b-3f64-4239-b7aa-532d126977ba,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d2893b-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.172 186548 DEBUG os_vif [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:ec:c3,bridge_name='br-int',has_traffic_filtering=True,id=61d2893b-3f64-4239-b7aa-532d126977ba,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d2893b-3f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.172 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.173 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.173 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.176 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.176 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61d2893b-3f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.177 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap61d2893b-3f, col_values=(('external_ids', {'iface-id': '61d2893b-3f64-4239-b7aa-532d126977ba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:ec:c3', 'vm-uuid': '3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.178 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:13 compute-0 NetworkManager[55036]: <info>  [1763801713.1802] manager: (tap61d2893b-3f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/431)
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.181 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.185 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.185 186548 INFO os_vif [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:ec:c3,bridge_name='br-int',has_traffic_filtering=True,id=61d2893b-3f64-4239-b7aa-532d126977ba,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d2893b-3f')
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.236 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.236 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.236 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No VIF found with MAC fa:16:3e:95:ec:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.237 186548 INFO nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Using config drive
Nov 22 08:55:13 compute-0 podman[256652]: 2025-11-22 08:55:13.410210094 +0000 UTC m=+0.058088640 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:55:13 compute-0 podman[256654]: 2025-11-22 08:55:13.414398077 +0000 UTC m=+0.056321937 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 22 08:55:13 compute-0 podman[256653]: 2025-11-22 08:55:13.418286272 +0000 UTC m=+0.063061102 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.876 186548 INFO nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Creating config drive at /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk.config
Nov 22 08:55:13 compute-0 nova_compute[186544]: 2025-11-22 08:55:13.881 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_ox04tu2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.011 186548 DEBUG oslo_concurrency.processutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_ox04tu2" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:55:14 compute-0 kernel: tap61d2893b-3f: entered promiscuous mode
Nov 22 08:55:14 compute-0 ovn_controller[94843]: 2025-11-22T08:55:14Z|00918|binding|INFO|Claiming lport 61d2893b-3f64-4239-b7aa-532d126977ba for this chassis.
Nov 22 08:55:14 compute-0 ovn_controller[94843]: 2025-11-22T08:55:14Z|00919|binding|INFO|61d2893b-3f64-4239-b7aa-532d126977ba: Claiming fa:16:3e:95:ec:c3 10.100.0.5
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.097 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:14 compute-0 NetworkManager[55036]: <info>  [1763801714.0994] manager: (tap61d2893b-3f): new Tun device (/org/freedesktop/NetworkManager/Devices/432)
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.101 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.111 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.114 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:14 compute-0 NetworkManager[55036]: <info>  [1763801714.1155] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/433)
Nov 22 08:55:14 compute-0 NetworkManager[55036]: <info>  [1763801714.1162] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/434)
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.124 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:ec:c3 10.100.0.5'], port_security=['fa:16:3e:95:ec:c3 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '2', 'neutron:security_group_ids': '984f816a-7307-432f-9913-aaee1df5bf08', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=809b4a06-a3bb-45c6-b4f6-e663731ee64f, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=61d2893b-3f64-4239-b7aa-532d126977ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.126 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 61d2893b-3f64-4239-b7aa-532d126977ba in datapath 4d568a4f-3fc1-4760-b924-569e98e1b4a7 bound to our chassis
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.127 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4d568a4f-3fc1-4760-b924-569e98e1b4a7
Nov 22 08:55:14 compute-0 systemd-udevd[256731]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.141 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ae94ce-dd5e-4865-b225-fcf40a3fdebe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.142 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4d568a4f-31 in ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.145 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4d568a4f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.146 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[390dbb12-a77a-41d6-b91d-e3764ec36a78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 NetworkManager[55036]: <info>  [1763801714.1474] device (tap61d2893b-3f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:55:14 compute-0 NetworkManager[55036]: <info>  [1763801714.1484] device (tap61d2893b-3f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.148 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d4af08c3-b39e-4bf2-b941-4f3d4f4d281d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 systemd-machined[152872]: New machine qemu-100-instance-000000b9.
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.160 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[e7efd172-a993-4f55-ae64-df4983591011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.177 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.177 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.181 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[91d5fc9f-9c82-49d7-940c-78cb6a01cbb6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 systemd[1]: Started Virtual Machine qemu-100-instance-000000b9.
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.193 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.197 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 08:55:14 compute-0 ovn_controller[94843]: 2025-11-22T08:55:14Z|00920|binding|INFO|Setting lport 61d2893b-3f64-4239-b7aa-532d126977ba ovn-installed in OVS
Nov 22 08:55:14 compute-0 ovn_controller[94843]: 2025-11-22T08:55:14Z|00921|binding|INFO|Setting lport 61d2893b-3f64-4239-b7aa-532d126977ba up in Southbound
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.203 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.227 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ca1d7e-196f-441e-98b5-34de111d9451]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 NetworkManager[55036]: <info>  [1763801714.2363] manager: (tap4d568a4f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/435)
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.235 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[12de3a2d-e433-40d0-a32f-9229f497f4eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.274 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[16b95ae7-7015-4b1a-a1cc-7464c3bf28c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.278 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[112faa09-70f4-4554-8062-eadcfea0a6c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 NetworkManager[55036]: <info>  [1763801714.3013] device (tap4d568a4f-30): carrier: link connected
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.306 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1ea505-1403-41e6-b823-0ebb211d817c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.321 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[96246f3e-b483-4529-b583-442bc9e94f58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d568a4f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:a1:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 270], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845493, 'reachable_time': 36190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256766, 'error': None, 'target': 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.337 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[458b4261-5a25-4ee5-9e42-014aaf37df9e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe74:a182'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 845493, 'tstamp': 845493}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256767, 'error': None, 'target': 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.355 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[4a680091-4f44-4e87-bb9b-c5e2c267bfb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d568a4f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:a1:82'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 270], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845493, 'reachable_time': 36190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256768, 'error': None, 'target': 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.386 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[e61b2944-16d5-4fbd-a0db-9763e1686fd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.447 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b33f2c0b-b9c7-469e-bdbe-e761ebca1317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.448 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d568a4f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.448 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.449 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d568a4f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:14 compute-0 kernel: tap4d568a4f-30: entered promiscuous mode
Nov 22 08:55:14 compute-0 NetworkManager[55036]: <info>  [1763801714.4511] manager: (tap4d568a4f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/436)
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.451 186548 DEBUG nova.compute.manager [req-7c8e85e1-f4d5-46dc-8407-f07f9ddaec61 req-1aeb6ad9-f03a-4489-ae8b-52067261e0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.452 186548 DEBUG oslo_concurrency.lockutils [req-7c8e85e1-f4d5-46dc-8407-f07f9ddaec61 req-1aeb6ad9-f03a-4489-ae8b-52067261e0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.452 186548 DEBUG oslo_concurrency.lockutils [req-7c8e85e1-f4d5-46dc-8407-f07f9ddaec61 req-1aeb6ad9-f03a-4489-ae8b-52067261e0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.453 186548 DEBUG oslo_concurrency.lockutils [req-7c8e85e1-f4d5-46dc-8407-f07f9ddaec61 req-1aeb6ad9-f03a-4489-ae8b-52067261e0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.453 186548 DEBUG nova.compute.manager [req-7c8e85e1-f4d5-46dc-8407-f07f9ddaec61 req-1aeb6ad9-f03a-4489-ae8b-52067261e0c7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Processing event network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.453 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4d568a4f-30, col_values=(('external_ids', {'iface-id': '70dec18e-900f-4578-a135-56e2da0bb3bd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.454 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:14 compute-0 ovn_controller[94843]: 2025-11-22T08:55:14Z|00922|binding|INFO|Releasing lport 70dec18e-900f-4578-a135-56e2da0bb3bd from this chassis (sb_readonly=0)
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.456 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4d568a4f-3fc1-4760-b924-569e98e1b4a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4d568a4f-3fc1-4760-b924-569e98e1b4a7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.456 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[532e509d-5962-4266-85b2-05a187675e1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.457 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-4d568a4f-3fc1-4760-b924-569e98e1b4a7
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/4d568a4f-3fc1-4760-b924-569e98e1b4a7.pid.haproxy
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 4d568a4f-3fc1-4760-b924-569e98e1b4a7
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:55:14 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:14.458 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'env', 'PROCESS_TAG=haproxy-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4d568a4f-3fc1-4760-b924-569e98e1b4a7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.466 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.517 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801714.517467, 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.519 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] VM Started (Lifecycle Event)
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.521 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.525 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.529 186548 INFO nova.virt.libvirt.driver [-] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Instance spawned successfully.
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.529 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.544 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.551 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.555 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.555 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.556 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.556 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.557 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.557 186548 DEBUG nova.virt.libvirt.driver [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.617 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.617 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801714.5183158, 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.618 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] VM Paused (Lifecycle Event)
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.638 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.653 186548 INFO nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Took 9.91 seconds to spawn the instance on the hypervisor.
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.654 186548 DEBUG nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.657 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801714.5241294, 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.657 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] VM Resumed (Lifecycle Event)
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.683 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.686 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.705 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.727 186548 INFO nova.compute.manager [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Took 10.38 seconds to build instance.
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.747 186548 DEBUG oslo_concurrency.lockutils [None req-11f93d6d-8c6e-4bc4-94f5-ed02cfbe0734 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:14 compute-0 podman[256803]: 2025-11-22 08:55:14.801105188 +0000 UTC m=+0.022884894 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:55:14 compute-0 podman[256803]: 2025-11-22 08:55:14.916071006 +0000 UTC m=+0.137850672 container create f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:55:14 compute-0 systemd[1]: Started libpod-conmon-f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9.scope.
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.987 186548 DEBUG nova.network.neutron [req-fa703e5d-7173-42c2-a9df-00ecdfbab7cf req-b6941e9d-f7c6-4c52-aea9-f8d28c3d33a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updated VIF entry in instance network info cache for port 61d2893b-3f64-4239-b7aa-532d126977ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:55:14 compute-0 nova_compute[186544]: 2025-11-22 08:55:14.988 186548 DEBUG nova.network.neutron [req-fa703e5d-7173-42c2-a9df-00ecdfbab7cf req-b6941e9d-f7c6-4c52-aea9-f8d28c3d33a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updating instance_info_cache with network_info: [{"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:55:15 compute-0 nova_compute[186544]: 2025-11-22 08:55:15.001 186548 DEBUG oslo_concurrency.lockutils [req-fa703e5d-7173-42c2-a9df-00ecdfbab7cf req-b6941e9d-f7c6-4c52-aea9-f8d28c3d33a7 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:55:15 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:55:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c939729e6843eaa27f1a90af55bb5e7977a43114bef0e89566dacd7f7fccbd7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:55:15 compute-0 podman[256803]: 2025-11-22 08:55:15.164031095 +0000 UTC m=+0.385810771 container init f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:55:15 compute-0 podman[256803]: 2025-11-22 08:55:15.169542871 +0000 UTC m=+0.391322547 container start f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:55:15 compute-0 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[256818]: [NOTICE]   (256822) : New worker (256824) forked
Nov 22 08:55:15 compute-0 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[256818]: [NOTICE]   (256822) : Loading success.
Nov 22 08:55:16 compute-0 nova_compute[186544]: 2025-11-22 08:55:16.536 186548 DEBUG nova.compute.manager [req-e58eff12-121b-4fec-b596-4a67669904bc req-eba4fc44-d426-4422-86ba-cfab42fac118 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:55:16 compute-0 nova_compute[186544]: 2025-11-22 08:55:16.536 186548 DEBUG oslo_concurrency.lockutils [req-e58eff12-121b-4fec-b596-4a67669904bc req-eba4fc44-d426-4422-86ba-cfab42fac118 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:16 compute-0 nova_compute[186544]: 2025-11-22 08:55:16.537 186548 DEBUG oslo_concurrency.lockutils [req-e58eff12-121b-4fec-b596-4a67669904bc req-eba4fc44-d426-4422-86ba-cfab42fac118 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:16 compute-0 nova_compute[186544]: 2025-11-22 08:55:16.537 186548 DEBUG oslo_concurrency.lockutils [req-e58eff12-121b-4fec-b596-4a67669904bc req-eba4fc44-d426-4422-86ba-cfab42fac118 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:16 compute-0 nova_compute[186544]: 2025-11-22 08:55:16.537 186548 DEBUG nova.compute.manager [req-e58eff12-121b-4fec-b596-4a67669904bc req-eba4fc44-d426-4422-86ba-cfab42fac118 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] No waiting events found dispatching network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:55:16 compute-0 nova_compute[186544]: 2025-11-22 08:55:16.538 186548 WARNING nova.compute.manager [req-e58eff12-121b-4fec-b596-4a67669904bc req-eba4fc44-d426-4422-86ba-cfab42fac118 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received unexpected event network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba for instance with vm_state active and task_state None.
Nov 22 08:55:16 compute-0 nova_compute[186544]: 2025-11-22 08:55:16.932 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:18 compute-0 nova_compute[186544]: 2025-11-22 08:55:18.179 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:21 compute-0 nova_compute[186544]: 2025-11-22 08:55:21.071 186548 DEBUG nova.compute.manager [req-3ea4d01c-72c1-484b-a950-5764a75b7be2 req-2f1d89e1-b393-4e40-ae22-56cbfbba8890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-changed-61d2893b-3f64-4239-b7aa-532d126977ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:55:21 compute-0 nova_compute[186544]: 2025-11-22 08:55:21.072 186548 DEBUG nova.compute.manager [req-3ea4d01c-72c1-484b-a950-5764a75b7be2 req-2f1d89e1-b393-4e40-ae22-56cbfbba8890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Refreshing instance network info cache due to event network-changed-61d2893b-3f64-4239-b7aa-532d126977ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:55:21 compute-0 nova_compute[186544]: 2025-11-22 08:55:21.072 186548 DEBUG oslo_concurrency.lockutils [req-3ea4d01c-72c1-484b-a950-5764a75b7be2 req-2f1d89e1-b393-4e40-ae22-56cbfbba8890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:55:21 compute-0 nova_compute[186544]: 2025-11-22 08:55:21.072 186548 DEBUG oslo_concurrency.lockutils [req-3ea4d01c-72c1-484b-a950-5764a75b7be2 req-2f1d89e1-b393-4e40-ae22-56cbfbba8890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:55:21 compute-0 nova_compute[186544]: 2025-11-22 08:55:21.073 186548 DEBUG nova.network.neutron [req-3ea4d01c-72c1-484b-a950-5764a75b7be2 req-2f1d89e1-b393-4e40-ae22-56cbfbba8890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Refreshing network info cache for port 61d2893b-3f64-4239-b7aa-532d126977ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:55:21 compute-0 nova_compute[186544]: 2025-11-22 08:55:21.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:21 compute-0 nova_compute[186544]: 2025-11-22 08:55:21.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 08:55:21 compute-0 nova_compute[186544]: 2025-11-22 08:55:21.936 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:23 compute-0 nova_compute[186544]: 2025-11-22 08:55:23.205 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:23 compute-0 nova_compute[186544]: 2025-11-22 08:55:23.898 186548 DEBUG nova.network.neutron [req-3ea4d01c-72c1-484b-a950-5764a75b7be2 req-2f1d89e1-b393-4e40-ae22-56cbfbba8890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updated VIF entry in instance network info cache for port 61d2893b-3f64-4239-b7aa-532d126977ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:55:23 compute-0 nova_compute[186544]: 2025-11-22 08:55:23.899 186548 DEBUG nova.network.neutron [req-3ea4d01c-72c1-484b-a950-5764a75b7be2 req-2f1d89e1-b393-4e40-ae22-56cbfbba8890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updating instance_info_cache with network_info: [{"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:55:23 compute-0 nova_compute[186544]: 2025-11-22 08:55:23.964 186548 DEBUG oslo_concurrency.lockutils [req-3ea4d01c-72c1-484b-a950-5764a75b7be2 req-2f1d89e1-b393-4e40-ae22-56cbfbba8890 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:55:24 compute-0 nova_compute[186544]: 2025-11-22 08:55:24.095 186548 DEBUG nova.compute.manager [req-90c3eecd-e348-460e-a66a-ae79ae7ae45e req-07c180a0-2a36-47a0-ac84-52fa71e17b0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-changed-61d2893b-3f64-4239-b7aa-532d126977ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:55:24 compute-0 nova_compute[186544]: 2025-11-22 08:55:24.095 186548 DEBUG nova.compute.manager [req-90c3eecd-e348-460e-a66a-ae79ae7ae45e req-07c180a0-2a36-47a0-ac84-52fa71e17b0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Refreshing instance network info cache due to event network-changed-61d2893b-3f64-4239-b7aa-532d126977ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:55:24 compute-0 nova_compute[186544]: 2025-11-22 08:55:24.095 186548 DEBUG oslo_concurrency.lockutils [req-90c3eecd-e348-460e-a66a-ae79ae7ae45e req-07c180a0-2a36-47a0-ac84-52fa71e17b0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:55:24 compute-0 nova_compute[186544]: 2025-11-22 08:55:24.096 186548 DEBUG oslo_concurrency.lockutils [req-90c3eecd-e348-460e-a66a-ae79ae7ae45e req-07c180a0-2a36-47a0-ac84-52fa71e17b0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:55:24 compute-0 nova_compute[186544]: 2025-11-22 08:55:24.096 186548 DEBUG nova.network.neutron [req-90c3eecd-e348-460e-a66a-ae79ae7ae45e req-07c180a0-2a36-47a0-ac84-52fa71e17b0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Refreshing network info cache for port 61d2893b-3f64-4239-b7aa-532d126977ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:55:25 compute-0 nova_compute[186544]: 2025-11-22 08:55:25.123 186548 DEBUG nova.network.neutron [req-90c3eecd-e348-460e-a66a-ae79ae7ae45e req-07c180a0-2a36-47a0-ac84-52fa71e17b0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updated VIF entry in instance network info cache for port 61d2893b-3f64-4239-b7aa-532d126977ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:55:25 compute-0 nova_compute[186544]: 2025-11-22 08:55:25.124 186548 DEBUG nova.network.neutron [req-90c3eecd-e348-460e-a66a-ae79ae7ae45e req-07c180a0-2a36-47a0-ac84-52fa71e17b0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updating instance_info_cache with network_info: [{"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:55:25 compute-0 nova_compute[186544]: 2025-11-22 08:55:25.142 186548 DEBUG oslo_concurrency.lockutils [req-90c3eecd-e348-460e-a66a-ae79ae7ae45e req-07c180a0-2a36-47a0-ac84-52fa71e17b0d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:55:26 compute-0 nova_compute[186544]: 2025-11-22 08:55:26.943 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:28 compute-0 nova_compute[186544]: 2025-11-22 08:55:28.207 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:29 compute-0 ovn_controller[94843]: 2025-11-22T08:55:29Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:ec:c3 10.100.0.5
Nov 22 08:55:29 compute-0 ovn_controller[94843]: 2025-11-22T08:55:29Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:ec:c3 10.100.0.5
Nov 22 08:55:31 compute-0 nova_compute[186544]: 2025-11-22 08:55:31.946 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:32 compute-0 podman[256853]: 2025-11-22 08:55:32.402450172 +0000 UTC m=+0.051158028 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 08:55:32 compute-0 podman[256852]: 2025-11-22 08:55:32.406092522 +0000 UTC m=+0.057497675 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0)
Nov 22 08:55:32 compute-0 podman[256854]: 2025-11-22 08:55:32.431063407 +0000 UTC m=+0.077013856 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:55:32 compute-0 podman[256855]: 2025-11-22 08:55:32.444075767 +0000 UTC m=+0.085220778 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:55:33 compute-0 nova_compute[186544]: 2025-11-22 08:55:33.196 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:33 compute-0 nova_compute[186544]: 2025-11-22 08:55:33.197 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:55:33 compute-0 nova_compute[186544]: 2025-11-22 08:55:33.197 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:55:33 compute-0 nova_compute[186544]: 2025-11-22 08:55:33.215 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:33 compute-0 nova_compute[186544]: 2025-11-22 08:55:33.796 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:55:33 compute-0 nova_compute[186544]: 2025-11-22 08:55:33.797 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:55:33 compute-0 nova_compute[186544]: 2025-11-22 08:55:33.797 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:55:33 compute-0 nova_compute[186544]: 2025-11-22 08:55:33.797 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.763 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.764 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.764 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.764 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.765 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.785 186548 INFO nova.compute.manager [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Terminating instance
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.793 186548 DEBUG nova.compute.manager [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:55:34 compute-0 kernel: tap61d2893b-3f (unregistering): left promiscuous mode
Nov 22 08:55:34 compute-0 NetworkManager[55036]: <info>  [1763801734.8198] device (tap61d2893b-3f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.829 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:34 compute-0 ovn_controller[94843]: 2025-11-22T08:55:34Z|00923|binding|INFO|Releasing lport 61d2893b-3f64-4239-b7aa-532d126977ba from this chassis (sb_readonly=0)
Nov 22 08:55:34 compute-0 ovn_controller[94843]: 2025-11-22T08:55:34Z|00924|binding|INFO|Setting lport 61d2893b-3f64-4239-b7aa-532d126977ba down in Southbound
Nov 22 08:55:34 compute-0 ovn_controller[94843]: 2025-11-22T08:55:34Z|00925|binding|INFO|Removing iface tap61d2893b-3f ovn-installed in OVS
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.831 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:34.844 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:ec:c3 10.100.0.5', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=809b4a06-a3bb-45c6-b4f6-e663731ee64f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=61d2893b-3f64-4239-b7aa-532d126977ba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:55:34 compute-0 nova_compute[186544]: 2025-11-22 08:55:34.845 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:34.846 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 61d2893b-3f64-4239-b7aa-532d126977ba in datapath 4d568a4f-3fc1-4760-b924-569e98e1b4a7 unbound from our chassis
Nov 22 08:55:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:34.847 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d568a4f-3fc1-4760-b924-569e98e1b4a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:55:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:34.848 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[6a42eb1e-cfc3-47d5-8b2a-c392486070c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:34 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:34.849 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 namespace which is not needed anymore
Nov 22 08:55:34 compute-0 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000b9.scope: Deactivated successfully.
Nov 22 08:55:34 compute-0 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000b9.scope: Consumed 15.443s CPU time.
Nov 22 08:55:34 compute-0 systemd-machined[152872]: Machine qemu-100-instance-000000b9 terminated.
Nov 22 08:55:34 compute-0 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[256818]: [NOTICE]   (256822) : haproxy version is 2.8.14-c23fe91
Nov 22 08:55:34 compute-0 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[256818]: [NOTICE]   (256822) : path to executable is /usr/sbin/haproxy
Nov 22 08:55:34 compute-0 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[256818]: [WARNING]  (256822) : Exiting Master process...
Nov 22 08:55:34 compute-0 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[256818]: [ALERT]    (256822) : Current worker (256824) exited with code 143 (Terminated)
Nov 22 08:55:34 compute-0 neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7[256818]: [WARNING]  (256822) : All workers exited. Exiting... (0)
Nov 22 08:55:34 compute-0 systemd[1]: libpod-f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9.scope: Deactivated successfully.
Nov 22 08:55:34 compute-0 podman[256961]: 2025-11-22 08:55:34.977395292 +0000 UTC m=+0.054389659 container died f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 22 08:55:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-7c939729e6843eaa27f1a90af55bb5e7977a43114bef0e89566dacd7f7fccbd7-merged.mount: Deactivated successfully.
Nov 22 08:55:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9-userdata-shm.mount: Deactivated successfully.
Nov 22 08:55:35 compute-0 podman[256961]: 2025-11-22 08:55:35.025310071 +0000 UTC m=+0.102304408 container cleanup f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:55:35 compute-0 systemd[1]: libpod-conmon-f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9.scope: Deactivated successfully.
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.054 186548 DEBUG nova.compute.manager [req-bd53d1ab-bd01-4b57-b2de-1ff762ace57e req-be1a0006-91e5-4b10-ac90-e4e19af6d330 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-vif-unplugged-61d2893b-3f64-4239-b7aa-532d126977ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.055 186548 DEBUG oslo_concurrency.lockutils [req-bd53d1ab-bd01-4b57-b2de-1ff762ace57e req-be1a0006-91e5-4b10-ac90-e4e19af6d330 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.056 186548 DEBUG oslo_concurrency.lockutils [req-bd53d1ab-bd01-4b57-b2de-1ff762ace57e req-be1a0006-91e5-4b10-ac90-e4e19af6d330 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.057 186548 DEBUG oslo_concurrency.lockutils [req-bd53d1ab-bd01-4b57-b2de-1ff762ace57e req-be1a0006-91e5-4b10-ac90-e4e19af6d330 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.057 186548 DEBUG nova.compute.manager [req-bd53d1ab-bd01-4b57-b2de-1ff762ace57e req-be1a0006-91e5-4b10-ac90-e4e19af6d330 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] No waiting events found dispatching network-vif-unplugged-61d2893b-3f64-4239-b7aa-532d126977ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.057 186548 DEBUG nova.compute.manager [req-bd53d1ab-bd01-4b57-b2de-1ff762ace57e req-be1a0006-91e5-4b10-ac90-e4e19af6d330 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-vif-unplugged-61d2893b-3f64-4239-b7aa-532d126977ba for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.060 186548 INFO nova.virt.libvirt.driver [-] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Instance destroyed successfully.
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.061 186548 DEBUG nova.objects.instance [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'resources' on Instance uuid 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.072 186548 DEBUG nova.virt.libvirt.vif [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:55:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-1689879905',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-gen-1-1689879905',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-gen',id=185,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAw4mkfFiRvcPYUE0PIsjDRQ2l9YGHYiiINwqgLaZjT4Jz+8V9nq9XzUIN6IBe3EfaIEfpC2/icXPG/z2BuoLG3JQ3o5sNQAUv0uO8d73RniLqWnU/BDSHGzBNmpYdI7sw==',key_name='tempest-TestSecurityGroupsBasicOps-2127958203',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:55:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-sg0rjxgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:55:14Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.073 186548 DEBUG nova.network.os_vif_util [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.073 186548 DEBUG nova.network.os_vif_util [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:ec:c3,bridge_name='br-int',has_traffic_filtering=True,id=61d2893b-3f64-4239-b7aa-532d126977ba,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d2893b-3f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.074 186548 DEBUG os_vif [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:ec:c3,bridge_name='br-int',has_traffic_filtering=True,id=61d2893b-3f64-4239-b7aa-532d126977ba,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d2893b-3f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.075 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.075 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61d2893b-3f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.079 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.081 186548 INFO os_vif [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:ec:c3,bridge_name='br-int',has_traffic_filtering=True,id=61d2893b-3f64-4239-b7aa-532d126977ba,network=Network(4d568a4f-3fc1-4760-b924-569e98e1b4a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61d2893b-3f')
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.082 186548 INFO nova.virt.libvirt.driver [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Deleting instance files /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6_del
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.082 186548 INFO nova.virt.libvirt.driver [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Deletion of /var/lib/nova/instances/3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6_del complete
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.095 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updating instance_info_cache with network_info: [{"id": "61d2893b-3f64-4239-b7aa-532d126977ba", "address": "fa:16:3e:95:ec:c3", "network": {"id": "4d568a4f-3fc1-4760-b924-569e98e1b4a7", "bridge": "br-int", "label": "tempest-network-smoke--2129642516", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61d2893b-3f", "ovs_interfaceid": "61d2893b-3f64-4239-b7aa-532d126977ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:55:35 compute-0 podman[257007]: 2025-11-22 08:55:35.097041756 +0000 UTC m=+0.048963916 container remove f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.106 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[efc61478-af79-4435-b316-4239666dbf47]: (4, ('Sat Nov 22 08:55:34 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 (f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9)\nf76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9\nSat Nov 22 08:55:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 (f76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9)\nf76cbc4d1f2bf53ee2fccc136d1b81b229db76d2f65b6656a6f85cc65aeb77b9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.107 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[83a9637b-22ca-49d5-b9e2-9b114fa06b34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.108 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d568a4f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:35 compute-0 kernel: tap4d568a4f-30: left promiscuous mode
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.112 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.113 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.121 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.122 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.125 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f55d3c-e76c-4b2b-9194-9a99fbe9876d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.139 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f113a24d-ad85-4fcc-9a29-90c146ad1eb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.141 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[73d60fad-bb2d-46db-9762-5c8fd04881cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.150 186548 INFO nova.compute.manager [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.151 186548 DEBUG oslo.service.loopingcall [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.152 186548 DEBUG nova.compute.manager [-] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:55:35 compute-0 nova_compute[186544]: 2025-11-22 08:55:35.152 186548 DEBUG nova.network.neutron [-] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.157 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[23881039-a9bf-4d4d-bfae-4a5982732600]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 845485, 'reachable_time': 21535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257026, 'error': None, 'target': 'ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.160 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4d568a4f-3fc1-4760-b924-569e98e1b4a7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:55:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:35.160 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[45a29908-e9ab-4dcd-a1a0-3e7cc8a7094a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:55:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d4d568a4f\x2d3fc1\x2d4760\x2db924\x2d569e98e1b4a7.mount: Deactivated successfully.
Nov 22 08:55:36 compute-0 nova_compute[186544]: 2025-11-22 08:55:36.947 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.342 186548 DEBUG nova.compute.manager [req-06f1798a-356b-4668-b29c-bdb0b444c5d7 req-5480310b-676b-4e7a-b47b-4fa14a8684ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.344 186548 DEBUG oslo_concurrency.lockutils [req-06f1798a-356b-4668-b29c-bdb0b444c5d7 req-5480310b-676b-4e7a-b47b-4fa14a8684ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.345 186548 DEBUG oslo_concurrency.lockutils [req-06f1798a-356b-4668-b29c-bdb0b444c5d7 req-5480310b-676b-4e7a-b47b-4fa14a8684ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.345 186548 DEBUG oslo_concurrency.lockutils [req-06f1798a-356b-4668-b29c-bdb0b444c5d7 req-5480310b-676b-4e7a-b47b-4fa14a8684ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.345 186548 DEBUG nova.compute.manager [req-06f1798a-356b-4668-b29c-bdb0b444c5d7 req-5480310b-676b-4e7a-b47b-4fa14a8684ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] No waiting events found dispatching network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.345 186548 WARNING nova.compute.manager [req-06f1798a-356b-4668-b29c-bdb0b444c5d7 req-5480310b-676b-4e7a-b47b-4fa14a8684ec 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received unexpected event network-vif-plugged-61d2893b-3f64-4239-b7aa-532d126977ba for instance with vm_state active and task_state deleting.
Nov 22 08:55:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:37.388 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:37.388 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:55:37.389 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.564 186548 DEBUG nova.network.neutron [-] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.573 186548 DEBUG nova.compute.manager [req-fbf6e00f-9609-49b0-bfa6-da126d97580c req-673d2b73-7461-4b62-8c2b-0f3c90dd91e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Received event network-vif-deleted-61d2893b-3f64-4239-b7aa-532d126977ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.574 186548 INFO nova.compute.manager [req-fbf6e00f-9609-49b0-bfa6-da126d97580c req-673d2b73-7461-4b62-8c2b-0f3c90dd91e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Neutron deleted interface 61d2893b-3f64-4239-b7aa-532d126977ba; detaching it from the instance and deleting it from the info cache
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.574 186548 DEBUG nova.network.neutron [req-fbf6e00f-9609-49b0-bfa6-da126d97580c req-673d2b73-7461-4b62-8c2b-0f3c90dd91e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.613 186548 INFO nova.compute.manager [-] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Took 2.46 seconds to deallocate network for instance.
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.627 186548 DEBUG nova.compute.manager [req-fbf6e00f-9609-49b0-bfa6-da126d97580c req-673d2b73-7461-4b62-8c2b-0f3c90dd91e9 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Detach interface failed, port_id=61d2893b-3f64-4239-b7aa-532d126977ba, reason: Instance 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.720 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.720 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.789 186548 DEBUG nova.compute.provider_tree [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.803 186548 DEBUG nova.scheduler.client.report [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:55:37 compute-0 nova_compute[186544]: 2025-11-22 08:55:37.835 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:38 compute-0 nova_compute[186544]: 2025-11-22 08:55:38.055 186548 INFO nova.scheduler.client.report [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Deleted allocations for instance 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6
Nov 22 08:55:38 compute-0 nova_compute[186544]: 2025-11-22 08:55:38.133 186548 DEBUG oslo_concurrency.lockutils [None req-bc7ad39e-4bff-468c-9bd8-e0b961526668 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.197 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.197 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.420 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.421 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5684MB free_disk=73.13179016113281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.421 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.422 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.471 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.472 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.497 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.509 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.555 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:55:39 compute-0 nova_compute[186544]: 2025-11-22 08:55:39.556 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:55:40 compute-0 nova_compute[186544]: 2025-11-22 08:55:40.077 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:41 compute-0 nova_compute[186544]: 2025-11-22 08:55:41.950 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:43 compute-0 nova_compute[186544]: 2025-11-22 08:55:43.556 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:44 compute-0 nova_compute[186544]: 2025-11-22 08:55:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:44 compute-0 podman[257028]: 2025-11-22 08:55:44.401006558 +0000 UTC m=+0.051705423 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 08:55:44 compute-0 podman[257029]: 2025-11-22 08:55:44.406379429 +0000 UTC m=+0.054361707 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc.)
Nov 22 08:55:44 compute-0 podman[257030]: 2025-11-22 08:55:44.409402414 +0000 UTC m=+0.053363873 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 22 08:55:45 compute-0 nova_compute[186544]: 2025-11-22 08:55:45.080 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:46 compute-0 nova_compute[186544]: 2025-11-22 08:55:46.951 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:47 compute-0 nova_compute[186544]: 2025-11-22 08:55:47.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:48 compute-0 nova_compute[186544]: 2025-11-22 08:55:48.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:49 compute-0 nova_compute[186544]: 2025-11-22 08:55:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:50 compute-0 nova_compute[186544]: 2025-11-22 08:55:50.059 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801735.05687, 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:55:50 compute-0 nova_compute[186544]: 2025-11-22 08:55:50.060 186548 INFO nova.compute.manager [-] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] VM Stopped (Lifecycle Event)
Nov 22 08:55:50 compute-0 nova_compute[186544]: 2025-11-22 08:55:50.084 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:50 compute-0 nova_compute[186544]: 2025-11-22 08:55:50.146 186548 DEBUG nova.compute.manager [None req-9588fbf0-7590-4a3b-a738-c00f0866a4d6 - - - - - -] [instance: 3f8c6c93-ddc4-4dc8-99da-10e948ac5ac6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:55:51 compute-0 nova_compute[186544]: 2025-11-22 08:55:51.953 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:52 compute-0 nova_compute[186544]: 2025-11-22 08:55:52.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:55:53 compute-0 nova_compute[186544]: 2025-11-22 08:55:53.059 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:53 compute-0 nova_compute[186544]: 2025-11-22 08:55:53.147 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:55 compute-0 nova_compute[186544]: 2025-11-22 08:55:55.086 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:55:56 compute-0 nova_compute[186544]: 2025-11-22 08:55:56.955 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:00 compute-0 nova_compute[186544]: 2025-11-22 08:56:00.089 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:01 compute-0 nova_compute[186544]: 2025-11-22 08:56:01.959 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:03 compute-0 nova_compute[186544]: 2025-11-22 08:56:03.145 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:03 compute-0 podman[257096]: 2025-11-22 08:56:03.4162232 +0000 UTC m=+0.057559967 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:56:03 compute-0 podman[257095]: 2025-11-22 08:56:03.416355684 +0000 UTC m=+0.061711789 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 22 08:56:03 compute-0 podman[257094]: 2025-11-22 08:56:03.416223 +0000 UTC m=+0.064137239 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 08:56:03 compute-0 podman[257097]: 2025-11-22 08:56:03.448147995 +0000 UTC m=+0.087571895 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:56:05 compute-0 nova_compute[186544]: 2025-11-22 08:56:05.091 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:06 compute-0 nova_compute[186544]: 2025-11-22 08:56:06.960 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:10 compute-0 nova_compute[186544]: 2025-11-22 08:56:10.094 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:11 compute-0 nova_compute[186544]: 2025-11-22 08:56:11.962 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.035 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "f4bd920b-a036-4c96-9c58-797037cb9df9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.035 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.046 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.139 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.140 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.148 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.148 186548 INFO nova.compute.claims [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.238 186548 DEBUG nova.compute.provider_tree [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.250 186548 DEBUG nova.scheduler.client.report [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.270 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.271 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.323 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.324 186548 DEBUG nova.network.neutron [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.348 186548 INFO nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.365 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.455 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.456 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.456 186548 INFO nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Creating image(s)
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.457 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "/var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.457 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.458 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "/var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.469 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.523 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.525 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.527 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.554 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.617 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.618 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.653 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.655 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.655 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.711 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.712 186548 DEBUG nova.virt.disk.api [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Checking if we can resize image /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.713 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.769 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.771 186548 DEBUG nova.virt.disk.api [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Cannot resize image /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.771 186548 DEBUG nova.objects.instance [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'migration_context' on Instance uuid f4bd920b-a036-4c96-9c58-797037cb9df9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.786 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.787 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Ensure instance console log exists: /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.788 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.788 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.788 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:12 compute-0 nova_compute[186544]: 2025-11-22 08:56:12.844 186548 DEBUG nova.policy [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:56:13 compute-0 nova_compute[186544]: 2025-11-22 08:56:13.799 186548 DEBUG nova.network.neutron [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Successfully created port: 0fc6a761-743b-450a-97e3-0b4cb62012d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:56:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:15.074 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.075 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:15.075 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:56:15 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:15.076 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.364 186548 DEBUG nova.network.neutron [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Successfully updated port: 0fc6a761-743b-450a-97e3-0b4cb62012d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.382 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.382 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquired lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.382 186548 DEBUG nova.network.neutron [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:56:15 compute-0 podman[257195]: 2025-11-22 08:56:15.421320416 +0000 UTC m=+0.063013631 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:56:15 compute-0 podman[257196]: 2025-11-22 08:56:15.42193948 +0000 UTC m=+0.063415220 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:56:15 compute-0 podman[257197]: 2025-11-22 08:56:15.430121502 +0000 UTC m=+0.071168082 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.498 186548 DEBUG nova.compute.manager [req-a90d1ee4-9002-4cc4-8482-48f25af54140 req-703cce9f-656d-4e33-9b2e-c4a0c7ceef9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-changed-0fc6a761-743b-450a-97e3-0b4cb62012d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.499 186548 DEBUG nova.compute.manager [req-a90d1ee4-9002-4cc4-8482-48f25af54140 req-703cce9f-656d-4e33-9b2e-c4a0c7ceef9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Refreshing instance network info cache due to event network-changed-0fc6a761-743b-450a-97e3-0b4cb62012d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.499 186548 DEBUG oslo_concurrency.lockutils [req-a90d1ee4-9002-4cc4-8482-48f25af54140 req-703cce9f-656d-4e33-9b2e-c4a0c7ceef9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:56:15 compute-0 nova_compute[186544]: 2025-11-22 08:56:15.553 186548 DEBUG nova.network.neutron [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.578 186548 DEBUG nova.network.neutron [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updating instance_info_cache with network_info: [{"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.612 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Releasing lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.612 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Instance network_info: |[{"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.613 186548 DEBUG oslo_concurrency.lockutils [req-a90d1ee4-9002-4cc4-8482-48f25af54140 req-703cce9f-656d-4e33-9b2e-c4a0c7ceef9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.613 186548 DEBUG nova.network.neutron [req-a90d1ee4-9002-4cc4-8482-48f25af54140 req-703cce9f-656d-4e33-9b2e-c4a0c7ceef9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Refreshing network info cache for port 0fc6a761-743b-450a-97e3-0b4cb62012d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.616 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Start _get_guest_xml network_info=[{"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.622 186548 WARNING nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.629 186548 DEBUG nova.virt.libvirt.host [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.630 186548 DEBUG nova.virt.libvirt.host [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.633 186548 DEBUG nova.virt.libvirt.host [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.634 186548 DEBUG nova.virt.libvirt.host [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.635 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.635 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.635 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.636 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.636 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.636 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.636 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.636 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.637 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.637 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.637 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.637 186548 DEBUG nova.virt.hardware [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.641 186548 DEBUG nova.virt.libvirt.vif [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:56:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=186,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPHag9aku7GBAkvmU1fN0BEWX+GKKnlz+wxkaBB81usyGYFIYXdms2nPbWtkH6Pt7jECf5QAIbXgbB8vKLjCswltA0JVkNEQJtQT24F3RbwiywHh6gfKMrOlaLUdm6xTrw==',key_name='tempest-TestSecurityGroupsBasicOps-1099778393',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-25tubkmi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:56:12Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=f4bd920b-a036-4c96-9c58-797037cb9df9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.641 186548 DEBUG nova.network.os_vif_util [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.642 186548 DEBUG nova.network.os_vif_util [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:a5:4e,bridge_name='br-int',has_traffic_filtering=True,id=0fc6a761-743b-450a-97e3-0b4cb62012d4,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fc6a761-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.643 186548 DEBUG nova.objects.instance [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'pci_devices' on Instance uuid f4bd920b-a036-4c96-9c58-797037cb9df9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.660 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <uuid>f4bd920b-a036-4c96-9c58-797037cb9df9</uuid>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <name>instance-000000ba</name>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660</nova:name>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:56:16</nova:creationTime>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:56:16 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:56:16 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:56:16 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:56:16 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:56:16 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:56:16 compute-0 nova_compute[186544]:         <nova:user uuid="7bb85b33f2b44468ab5d86bf5ba98421">tempest-TestSecurityGroupsBasicOps-588574044-project-member</nova:user>
Nov 22 08:56:16 compute-0 nova_compute[186544]:         <nova:project uuid="b5da13b07bb34fc3b4cd1452f7dd6971">tempest-TestSecurityGroupsBasicOps-588574044</nova:project>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:56:16 compute-0 nova_compute[186544]:         <nova:port uuid="0fc6a761-743b-450a-97e3-0b4cb62012d4">
Nov 22 08:56:16 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <system>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <entry name="serial">f4bd920b-a036-4c96-9c58-797037cb9df9</entry>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <entry name="uuid">f4bd920b-a036-4c96-9c58-797037cb9df9</entry>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </system>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <os>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   </os>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <features>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   </features>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk.config"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:84:a5:4e"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <target dev="tap0fc6a761-74"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/console.log" append="off"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <video>
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </video>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:56:16 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:56:16 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:56:16 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:56:16 compute-0 nova_compute[186544]: </domain>
Nov 22 08:56:16 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.661 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Preparing to wait for external event network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.661 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.662 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.662 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.662 186548 DEBUG nova.virt.libvirt.vif [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:56:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=186,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPHag9aku7GBAkvmU1fN0BEWX+GKKnlz+wxkaBB81usyGYFIYXdms2nPbWtkH6Pt7jECf5QAIbXgbB8vKLjCswltA0JVkNEQJtQT24F3RbwiywHh6gfKMrOlaLUdm6xTrw==',key_name='tempest-TestSecurityGroupsBasicOps-1099778393',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-25tubkmi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:56:12Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=f4bd920b-a036-4c96-9c58-797037cb9df9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.663 186548 DEBUG nova.network.os_vif_util [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.663 186548 DEBUG nova.network.os_vif_util [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:a5:4e,bridge_name='br-int',has_traffic_filtering=True,id=0fc6a761-743b-450a-97e3-0b4cb62012d4,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fc6a761-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.664 186548 DEBUG os_vif [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:a5:4e,bridge_name='br-int',has_traffic_filtering=True,id=0fc6a761-743b-450a-97e3-0b4cb62012d4,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fc6a761-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.664 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.665 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.665 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.668 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.668 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0fc6a761-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.669 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0fc6a761-74, col_values=(('external_ids', {'iface-id': '0fc6a761-743b-450a-97e3-0b4cb62012d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:a5:4e', 'vm-uuid': 'f4bd920b-a036-4c96-9c58-797037cb9df9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.671 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:16 compute-0 NetworkManager[55036]: <info>  [1763801776.6719] manager: (tap0fc6a761-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/437)
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.673 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.678 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.679 186548 INFO os_vif [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:a5:4e,bridge_name='br-int',has_traffic_filtering=True,id=0fc6a761-743b-450a-97e3-0b4cb62012d4,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fc6a761-74')
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.728 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.729 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.729 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] No VIF found with MAC fa:16:3e:84:a5:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.730 186548 INFO nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Using config drive
Nov 22 08:56:16 compute-0 nova_compute[186544]: 2025-11-22 08:56:16.963 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:18 compute-0 nova_compute[186544]: 2025-11-22 08:56:18.817 186548 INFO nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Creating config drive at /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk.config
Nov 22 08:56:18 compute-0 nova_compute[186544]: 2025-11-22 08:56:18.827 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn3wol1tj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:56:18 compute-0 nova_compute[186544]: 2025-11-22 08:56:18.957 186548 DEBUG oslo_concurrency.processutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn3wol1tj" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:56:19 compute-0 kernel: tap0fc6a761-74: entered promiscuous mode
Nov 22 08:56:19 compute-0 ovn_controller[94843]: 2025-11-22T08:56:19Z|00926|binding|INFO|Claiming lport 0fc6a761-743b-450a-97e3-0b4cb62012d4 for this chassis.
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:19 compute-0 NetworkManager[55036]: <info>  [1763801779.0358] manager: (tap0fc6a761-74): new Tun device (/org/freedesktop/NetworkManager/Devices/438)
Nov 22 08:56:19 compute-0 ovn_controller[94843]: 2025-11-22T08:56:19Z|00927|binding|INFO|0fc6a761-743b-450a-97e3-0b4cb62012d4: Claiming fa:16:3e:84:a5:4e 10.100.0.5
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.040 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.064 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:a5:4e 10.100.0.5'], port_security=['fa:16:3e:84:a5:4e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b9b6088d-9774-4171-9ae0-83f685fc1451 c37d3c74-274b-44a5-ab29-e1041fb93c1c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71640610-572c-4a8b-b43c-9737c485843b, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=0fc6a761-743b-450a-97e3-0b4cb62012d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.065 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 0fc6a761-743b-450a-97e3-0b4cb62012d4 in datapath 13f06a8d-f6ae-46ea-b973-f89bfd41893c bound to our chassis
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.067 103805 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 13f06a8d-f6ae-46ea-b973-f89bfd41893c
Nov 22 08:56:19 compute-0 systemd-udevd[257279]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:56:19 compute-0 NetworkManager[55036]: <info>  [1763801779.0812] device (tap0fc6a761-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:56:19 compute-0 systemd-machined[152872]: New machine qemu-101-instance-000000ba.
Nov 22 08:56:19 compute-0 NetworkManager[55036]: <info>  [1763801779.0825] device (tap0fc6a761-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.081 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[9323a772-0ce7-41fc-b495-ecab419f4079]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.083 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap13f06a8d-f1 in ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.085 213522 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap13f06a8d-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.085 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd63fa9-e112-4f9a-b56f-88a529efaece]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.086 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[f1115fd5-0d9e-45db-a02c-e33eb1b4818e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.093 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.096 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[f0171e66-9acb-4ec9-8c08-37c357f6529d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_controller[94843]: 2025-11-22T08:56:19Z|00928|binding|INFO|Setting lport 0fc6a761-743b-450a-97e3-0b4cb62012d4 ovn-installed in OVS
Nov 22 08:56:19 compute-0 ovn_controller[94843]: 2025-11-22T08:56:19Z|00929|binding|INFO|Setting lport 0fc6a761-743b-450a-97e3-0b4cb62012d4 up in Southbound
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.102 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:19 compute-0 systemd[1]: Started Virtual Machine qemu-101-instance-000000ba.
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.119 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[89f3397a-b657-42c7-b023-c00bec23ef7a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.147 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9dc761-775e-4666-835d-c2dd17f66a5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 NetworkManager[55036]: <info>  [1763801779.1544] manager: (tap13f06a8d-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/439)
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.154 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[5801660e-559d-46ec-87e7-14966c8b1778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.183 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[282294f1-b236-4bf7-8293-7990f560f96c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.186 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[c752bb45-415c-47e5-bc46-e30441cb19b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 NetworkManager[55036]: <info>  [1763801779.2090] device (tap13f06a8d-f0): carrier: link connected
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.213 213536 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4ee3f7-792f-4ace-936f-92db1e2a92c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.227 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[c45a1c66-9f63-4b13-97c9-88de8c09e0e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13f06a8d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:28:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 273], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 851984, 'reachable_time': 21287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257313, 'error': None, 'target': 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.243 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[30dadd00-2dbb-4a45-b9f0-7b20d748628a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:2818'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 851984, 'tstamp': 851984}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257314, 'error': None, 'target': 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.257 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[be8cf75c-c06a-4f9c-bbe5-50697b5f58bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13f06a8d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:28:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 273], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 851984, 'reachable_time': 21287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257315, 'error': None, 'target': 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.283 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[7a612527-5695-4cee-9d63-5ed26922ddc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.341 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ad0539-006c-40f2-a8f9-d3fde57f122e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.342 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13f06a8d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.342 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.343 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13f06a8d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.345 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:19 compute-0 NetworkManager[55036]: <info>  [1763801779.3454] manager: (tap13f06a8d-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/440)
Nov 22 08:56:19 compute-0 kernel: tap13f06a8d-f0: entered promiscuous mode
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.349 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap13f06a8d-f0, col_values=(('external_ids', {'iface-id': '1131c2e2-47c4-45b6-8f6d-5a584a957856'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.351 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:19 compute-0 ovn_controller[94843]: 2025-11-22T08:56:19Z|00930|binding|INFO|Releasing lport 1131c2e2-47c4-45b6-8f6d-5a584a957856 from this chassis (sb_readonly=0)
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.365 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.366 103805 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/13f06a8d-f6ae-46ea-b973-f89bfd41893c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/13f06a8d-f6ae-46ea-b973-f89bfd41893c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.367 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[038273e5-43ae-4bd5-bcc1-2a7a515e4f63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.368 103805 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: global
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     log         /dev/log local0 debug
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     log-tag     haproxy-metadata-proxy-13f06a8d-f6ae-46ea-b973-f89bfd41893c
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     user        root
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     group       root
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     maxconn     1024
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     pidfile     /var/lib/neutron/external/pids/13f06a8d-f6ae-46ea-b973-f89bfd41893c.pid.haproxy
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     daemon
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: defaults
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     log global
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     mode http
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     option httplog
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     option dontlognull
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     option http-server-close
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     option forwardfor
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     retries                 3
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     timeout http-request    30s
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     timeout connect         30s
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     timeout client          32s
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     timeout server          32s
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     timeout http-keep-alive 30s
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: listen listener
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     bind 169.254.169.254:80
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     server metadata /var/lib/neutron/metadata_proxy
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:     http-request add-header X-OVN-Network-ID 13f06a8d-f6ae-46ea-b973-f89bfd41893c
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 22 08:56:19 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:19.369 103805 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'env', 'PROCESS_TAG=haproxy-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/13f06a8d-f6ae-46ea-b973-f89bfd41893c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.383 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801779.3829753, f4bd920b-a036-4c96-9c58-797037cb9df9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.384 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] VM Started (Lifecycle Event)
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.400 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.405 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801779.3832338, f4bd920b-a036-4c96-9c58-797037cb9df9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.406 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] VM Paused (Lifecycle Event)
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.420 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.423 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:56:19 compute-0 nova_compute[186544]: 2025-11-22 08:56:19.439 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:56:19 compute-0 podman[257353]: 2025-11-22 08:56:19.775350287 +0000 UTC m=+0.076383099 container create 0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:56:19 compute-0 systemd[1]: Started libpod-conmon-0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9.scope.
Nov 22 08:56:19 compute-0 podman[257353]: 2025-11-22 08:56:19.725208245 +0000 UTC m=+0.026241077 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 22 08:56:19 compute-0 systemd[1]: Started libcrun container.
Nov 22 08:56:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eca7587e49fa0bc1d3bdefe821f2c00bbd053f711809662746060e33e1dc906/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 22 08:56:19 compute-0 podman[257353]: 2025-11-22 08:56:19.87665015 +0000 UTC m=+0.177682982 container init 0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 08:56:19 compute-0 podman[257353]: 2025-11-22 08:56:19.882617796 +0000 UTC m=+0.183650598 container start 0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 22 08:56:19 compute-0 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[257369]: [NOTICE]   (257373) : New worker (257375) forked
Nov 22 08:56:19 compute-0 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[257369]: [NOTICE]   (257373) : Loading success.
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.004 186548 DEBUG nova.compute.manager [req-1bf09ef6-5184-4257-b4f4-2c8ac4bf725a req-6a2fbbfb-0101-49db-89ad-c087b3508fce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.005 186548 DEBUG oslo_concurrency.lockutils [req-1bf09ef6-5184-4257-b4f4-2c8ac4bf725a req-6a2fbbfb-0101-49db-89ad-c087b3508fce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.005 186548 DEBUG oslo_concurrency.lockutils [req-1bf09ef6-5184-4257-b4f4-2c8ac4bf725a req-6a2fbbfb-0101-49db-89ad-c087b3508fce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.005 186548 DEBUG oslo_concurrency.lockutils [req-1bf09ef6-5184-4257-b4f4-2c8ac4bf725a req-6a2fbbfb-0101-49db-89ad-c087b3508fce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.005 186548 DEBUG nova.compute.manager [req-1bf09ef6-5184-4257-b4f4-2c8ac4bf725a req-6a2fbbfb-0101-49db-89ad-c087b3508fce 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Processing event network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.006 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.010 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801781.0099852, f4bd920b-a036-4c96-9c58-797037cb9df9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.010 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] VM Resumed (Lifecycle Event)
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.012 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.015 186548 INFO nova.virt.libvirt.driver [-] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Instance spawned successfully.
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.015 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.044 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.051 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.056 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.056 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.057 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.058 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.058 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.059 186548 DEBUG nova.virt.libvirt.driver [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.096 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.251 186548 INFO nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Took 8.80 seconds to spawn the instance on the hypervisor.
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.252 186548 DEBUG nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.383 186548 INFO nova.compute.manager [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Took 9.29 seconds to build instance.
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.385 186548 DEBUG nova.network.neutron [req-a90d1ee4-9002-4cc4-8482-48f25af54140 req-703cce9f-656d-4e33-9b2e-c4a0c7ceef9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updated VIF entry in instance network info cache for port 0fc6a761-743b-450a-97e3-0b4cb62012d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.386 186548 DEBUG nova.network.neutron [req-a90d1ee4-9002-4cc4-8482-48f25af54140 req-703cce9f-656d-4e33-9b2e-c4a0c7ceef9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updating instance_info_cache with network_info: [{"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.413 186548 DEBUG oslo_concurrency.lockutils [req-a90d1ee4-9002-4cc4-8482-48f25af54140 req-703cce9f-656d-4e33-9b2e-c4a0c7ceef9b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.516 186548 DEBUG oslo_concurrency.lockutils [None req-b9e9672f-363a-45a3-878c-95761bee52ec 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.674 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:21 compute-0 nova_compute[186544]: 2025-11-22 08:56:21.964 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:23 compute-0 nova_compute[186544]: 2025-11-22 08:56:23.129 186548 DEBUG nova.compute.manager [req-82972adf-fd50-4f29-a502-d72ccef9cd4e req-9a072534-681b-419e-a6f1-a8fdb16cb13f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:56:23 compute-0 nova_compute[186544]: 2025-11-22 08:56:23.129 186548 DEBUG oslo_concurrency.lockutils [req-82972adf-fd50-4f29-a502-d72ccef9cd4e req-9a072534-681b-419e-a6f1-a8fdb16cb13f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:23 compute-0 nova_compute[186544]: 2025-11-22 08:56:23.130 186548 DEBUG oslo_concurrency.lockutils [req-82972adf-fd50-4f29-a502-d72ccef9cd4e req-9a072534-681b-419e-a6f1-a8fdb16cb13f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:23 compute-0 nova_compute[186544]: 2025-11-22 08:56:23.130 186548 DEBUG oslo_concurrency.lockutils [req-82972adf-fd50-4f29-a502-d72ccef9cd4e req-9a072534-681b-419e-a6f1-a8fdb16cb13f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:23 compute-0 nova_compute[186544]: 2025-11-22 08:56:23.130 186548 DEBUG nova.compute.manager [req-82972adf-fd50-4f29-a502-d72ccef9cd4e req-9a072534-681b-419e-a6f1-a8fdb16cb13f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] No waiting events found dispatching network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:56:23 compute-0 nova_compute[186544]: 2025-11-22 08:56:23.131 186548 WARNING nova.compute.manager [req-82972adf-fd50-4f29-a502-d72ccef9cd4e req-9a072534-681b-419e-a6f1-a8fdb16cb13f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received unexpected event network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 for instance with vm_state active and task_state None.
Nov 22 08:56:26 compute-0 nova_compute[186544]: 2025-11-22 08:56:26.679 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:26 compute-0 nova_compute[186544]: 2025-11-22 08:56:26.967 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:30 compute-0 nova_compute[186544]: 2025-11-22 08:56:30.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:30 compute-0 NetworkManager[55036]: <info>  [1763801790.3911] manager: (patch-br-int-to-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/441)
Nov 22 08:56:30 compute-0 NetworkManager[55036]: <info>  [1763801790.3919] manager: (patch-provnet-72bb0060-bae3-4d38-a04c-622bbad4893d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/442)
Nov 22 08:56:30 compute-0 ovn_controller[94843]: 2025-11-22T08:56:30Z|00931|binding|INFO|Releasing lport 1131c2e2-47c4-45b6-8f6d-5a584a957856 from this chassis (sb_readonly=0)
Nov 22 08:56:30 compute-0 ovn_controller[94843]: 2025-11-22T08:56:30Z|00932|binding|INFO|Releasing lport 1131c2e2-47c4-45b6-8f6d-5a584a957856 from this chassis (sb_readonly=0)
Nov 22 08:56:30 compute-0 nova_compute[186544]: 2025-11-22 08:56:30.422 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:30 compute-0 nova_compute[186544]: 2025-11-22 08:56:30.434 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:30 compute-0 nova_compute[186544]: 2025-11-22 08:56:30.980 186548 DEBUG nova.compute.manager [req-7e074a21-47fe-43f2-ad40-d51264c71501 req-a3ea45c6-f68a-490e-af65-24814ede3845 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-changed-0fc6a761-743b-450a-97e3-0b4cb62012d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:56:30 compute-0 nova_compute[186544]: 2025-11-22 08:56:30.980 186548 DEBUG nova.compute.manager [req-7e074a21-47fe-43f2-ad40-d51264c71501 req-a3ea45c6-f68a-490e-af65-24814ede3845 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Refreshing instance network info cache due to event network-changed-0fc6a761-743b-450a-97e3-0b4cb62012d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:56:30 compute-0 nova_compute[186544]: 2025-11-22 08:56:30.981 186548 DEBUG oslo_concurrency.lockutils [req-7e074a21-47fe-43f2-ad40-d51264c71501 req-a3ea45c6-f68a-490e-af65-24814ede3845 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:56:30 compute-0 nova_compute[186544]: 2025-11-22 08:56:30.981 186548 DEBUG oslo_concurrency.lockutils [req-7e074a21-47fe-43f2-ad40-d51264c71501 req-a3ea45c6-f68a-490e-af65-24814ede3845 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:56:30 compute-0 nova_compute[186544]: 2025-11-22 08:56:30.981 186548 DEBUG nova.network.neutron [req-7e074a21-47fe-43f2-ad40-d51264c71501 req-a3ea45c6-f68a-490e-af65-24814ede3845 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Refreshing network info cache for port 0fc6a761-743b-450a-97e3-0b4cb62012d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:56:31 compute-0 nova_compute[186544]: 2025-11-22 08:56:31.682 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:31 compute-0 nova_compute[186544]: 2025-11-22 08:56:31.968 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:34 compute-0 nova_compute[186544]: 2025-11-22 08:56:34.179 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:34 compute-0 nova_compute[186544]: 2025-11-22 08:56:34.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:56:34 compute-0 nova_compute[186544]: 2025-11-22 08:56:34.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:56:34 compute-0 podman[257403]: 2025-11-22 08:56:34.418124288 +0000 UTC m=+0.063225147 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 22 08:56:34 compute-0 podman[257405]: 2025-11-22 08:56:34.418172039 +0000 UTC m=+0.055253321 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 08:56:34 compute-0 podman[257404]: 2025-11-22 08:56:34.440638692 +0000 UTC m=+0.083360383 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 08:56:34 compute-0 podman[257409]: 2025-11-22 08:56:34.457300031 +0000 UTC m=+0.088098508 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 08:56:34 compute-0 nova_compute[186544]: 2025-11-22 08:56:34.810 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:56:35 compute-0 nova_compute[186544]: 2025-11-22 08:56:35.899 186548 DEBUG nova.network.neutron [req-7e074a21-47fe-43f2-ad40-d51264c71501 req-a3ea45c6-f68a-490e-af65-24814ede3845 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updated VIF entry in instance network info cache for port 0fc6a761-743b-450a-97e3-0b4cb62012d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:56:35 compute-0 nova_compute[186544]: 2025-11-22 08:56:35.900 186548 DEBUG nova.network.neutron [req-7e074a21-47fe-43f2-ad40-d51264c71501 req-a3ea45c6-f68a-490e-af65-24814ede3845 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updating instance_info_cache with network_info: [{"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:56:35 compute-0 ovn_controller[94843]: 2025-11-22T08:56:35Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:84:a5:4e 10.100.0.5
Nov 22 08:56:35 compute-0 ovn_controller[94843]: 2025-11-22T08:56:35Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:84:a5:4e 10.100.0.5
Nov 22 08:56:36 compute-0 nova_compute[186544]: 2025-11-22 08:56:36.108 186548 DEBUG oslo_concurrency.lockutils [req-7e074a21-47fe-43f2-ad40-d51264c71501 req-a3ea45c6-f68a-490e-af65-24814ede3845 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:56:36 compute-0 nova_compute[186544]: 2025-11-22 08:56:36.108 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:56:36 compute-0 nova_compute[186544]: 2025-11-22 08:56:36.108 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:56:36 compute-0 nova_compute[186544]: 2025-11-22 08:56:36.109 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid f4bd920b-a036-4c96-9c58-797037cb9df9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.609 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ba', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'hostId': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.610 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.610 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.610 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660>]
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.611 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.638 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.639 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b1191e9-2c58-463c-8fea-df2f9c331cd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.611242', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26eefe74-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': '3602f102dde118856f36651b89ccbf2529ea4585984c1f65782a71411d1e37d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.611242', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26ef0b12-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': '5a2c6744dafaae4ba81c00966a4d8897caf4076c58a33fb17699fa162a7dec2a'}]}, 'timestamp': '2025-11-22 08:56:36.639567', '_unique_id': 'a6d0c1024cbc4819943f4ea3a9dc2eda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.640 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.642 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.642 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660>]
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.642 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.648 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f4bd920b-a036-4c96-9c58-797037cb9df9 / tap0fc6a761-74 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.648 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.outgoing.bytes volume: 1242 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d88758a-8b47-4388-8eeb-4939b704d3c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1242, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.642516', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26f0823a-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': 'cfb5e2d0f15ad9648010ae5523dabcdf0c9a98cebc0b438cd4b8b208118a8cd3'}]}, 'timestamp': '2025-11-22 08:56:36.649419', '_unique_id': '8b800f744e974fbda9a009a8a9bd4f32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.650 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.651 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.661 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.662 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b77f9f5b-b34f-43f6-bc0a-cdb1b8af6a1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.651155', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26f28b48-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.34295916, 'message_signature': 'cc83b3c22c0cf089e6d5f47955628531d1afcbdabd9039a1c89d5a7fb54cc5df'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.651155', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26f295f2-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.34295916, 'message_signature': '39d70fec683f0ba791085331bb6bda9f6cc59978eae20344d452e2056622d734'}]}, 'timestamp': '2025-11-22 08:56:36.662739', '_unique_id': '1fdb743b32494cc5aea0be0542ca0b66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.663 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.664 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.664 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c62d9bd6-fe40-4e2c-8582-14f9a7a77855', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.664542', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26f2e5fc-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': 'c83033e9aa5645c0bdbbf019a6dd29d70d32e2fb6f054e79d28aee18256d67eb'}]}, 'timestamp': '2025-11-22 08:56:36.664792', '_unique_id': 'd768a4636e1d4b42bc1796242ae72c20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.665 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ecce932-8d3f-4575-9ed0-3a22b1b2bc53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.665960', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26f31d60-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.34295916, 'message_signature': '30053f399b95714c3a4c6d6870cfde474f3b2776c72c317186aabdf75317e557'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.665960', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26f3262a-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.34295916, 'message_signature': '1ca307d1a0ee1f34dcdc98c63afd169ca81b35942533e963a16e09629771773f'}]}, 'timestamp': '2025-11-22 08:56:36.666439', '_unique_id': '38f2391c49ad43ab92db71864b15f416'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.666 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.667 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c867b0e5-5142-4c0c-9f49-dc633f9d8162', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.668061', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26f37120-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': '3f6c51da21839988dfbeb5b5a2a9bf5fa837bf2dbe497701f75ad1f11c11747b'}]}, 'timestamp': '2025-11-22 08:56:36.668378', '_unique_id': '47f2660570774ed5a1a42199e10d9254'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.668 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.670 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.670 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f656a5a-08b9-4b7f-95a4-d6cf8ba2149f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.670046', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26f3be96-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.34295916, 'message_signature': '79df3078d00b88060d71b3d109fbaa21698c5be41ffb317cf0822c08e811f8de'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.670046', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26f3c9f4-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.34295916, 'message_signature': '42a9299e54edf9b1d5e48f38474c32d42bb9c1bfe9ae49023f2c2bc8f30408b6'}]}, 'timestamp': '2025-11-22 08:56:36.670612', '_unique_id': '28401edab0da438cb492dc6eafb3bbae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.671 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.672 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80258c49-9f5b-496f-a772-d5ec133cac67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.672352', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26f41864-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': 'fc29664c8a90ce2b452e6437970af4b8ab3dec0d4b64e1d703d726d34c607b90'}]}, 'timestamp': '2025-11-22 08:56:36.672637', '_unique_id': '7f60cb5bc7b94a4b8e3a7ed675ba9b2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.673 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.674 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.674 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.read.bytes volume: 31017472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.674 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05f34591-c244-4b71-b3d1-1bfc8b69e125', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31017472, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.674121', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26f45de2-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': '0dea00703067240e28bee396600692d30da5b31fe29218c5c7f5890f830b36f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.674121', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26f4679c-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': '983ec664476e87746613e65f561b09cf87d076452169c8e0364d3229b891d8a3'}]}, 'timestamp': '2025-11-22 08:56:36.674648', '_unique_id': 'cab73ac524a54df49ce0f39a4a1cf06d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.675 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.676 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.outgoing.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '715d22a5-13c4-4403-830d-fac033b53050', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.676223', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26f4b15c-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': '583207d2763c3704d8ed3cdae93a2d238e3c794f1effb565baa6166e3ff695af'}]}, 'timestamp': '2025-11-22 08:56:36.676555', '_unique_id': '0e952102fb194084b17b6accf17ab543'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.677 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c66ba46-a1a0-4268-9f74-eea2a1e43492', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.678134', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26f4fac2-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': 'faf2d5b2236aef75f06049e14c37542160792eb605c400a96c303d340cb6d009'}]}, 'timestamp': '2025-11-22 08:56:36.678463', '_unique_id': 'f473c1bd2c424e99b932b4b52c766e9d'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.678 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.680 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660>]
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.680 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660>]
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.680 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f6e18ed-cb94-4982-a4f2-7537829b8d4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.680875', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26f56584-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': 'd61badbe771d4cb372a4e0cd767b378cef3209cae221a4136e540a5d39fff9cb'}]}, 'timestamp': '2025-11-22 08:56:36.681165', '_unique_id': '2344e683f4944e2e8025dd4adbd615ac'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.681 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.682 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'add9e07a-49a5-427e-a836-d15b1f21b57a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.683089', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26f5bc8c-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': 'f4d671adf493cd9526be5ca71eddae0b8e2a527ddcd46fe53cfffa1cb8406f29'}]}, 'timestamp': '2025-11-22 08:56:36.683394', '_unique_id': 'b87a0414d2174354b5bdc5fc084720dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.683 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.685 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.685 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.write.latency volume: 3278082898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.685 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '509fccfa-00f4-4a0c-ae1e-65d2e2216ebf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3278082898, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.685152', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26f60ff2-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': 'b81ae3bc04dc1e6620d5721ed06434c8ad890169b68e8250950996d7ec909f9e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.685152', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26f61d1c-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': '63b860fea3764eb01d39ee025a5c3e4dc69a5b4f1971aa14afcfcd8569239325'}]}, 'timestamp': '2025-11-22 08:56:36.685890', '_unique_id': '46171f65afae47e08388ed58837c0279'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 nova_compute[186544]: 2025-11-22 08:56:36.687 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.686 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.702 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28b6b091-30b7-4128-b245-13fdc3ad8fe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'timestamp': '2025-11-22T08:56:36.688168', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '26f8cfe4-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.394622701, 'message_signature': '0dc3132e97b1abfff656f524a02eee9cbedbe3fd66bf6f36bdbfbf08e85d05c1'}]}, 'timestamp': '2025-11-22 08:56:36.703608', '_unique_id': 'b86f72eb4a7a4520ba88494fb02304fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.704 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.705 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.705 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/cpu volume: 12720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1caaa4d8-6041-4ed4-97f7-209e8a2cca93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12720000000, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'timestamp': '2025-11-22T08:56:36.705783', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '26f9354c-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.394622701, 'message_signature': 'd25fe1a22f491c06f04d280282addfc3e4bb46b04a0a20a2df7d019cdea22265'}]}, 'timestamp': '2025-11-22 08:56:36.706182', '_unique_id': '122d0cc2240444b29a5b7c6d5e616c81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.706 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.707 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.707 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.read.requests volume: 1135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.708 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e5cf488-af0b-459d-9677-5b480e5b213f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1135, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.707883', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26f98470-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': '3baa6e7d68b58b5a6c662101d8d26fd5c25345164368b63c2267df1da10eec01'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.707883', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26f98f1a-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': 'cb8abf63bf593708db993b09dd8f115b73514b82dd4b4eb634ecfc8ee9836b4f'}]}, 'timestamp': '2025-11-22 08:56:36.708431', '_unique_id': '03f96e74c6d54f6cb6fdc7a8286f0af6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.709 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.710 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.710 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.write.requests volume: 306 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.710 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45eb1554-1f3b-464f-905b-56f8994373f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 306, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.710187', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26f9df1a-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': '5e383848510257d572164572559e016c525140e4de4b97e99521e870a7cd3c32'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.710187', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26f9e8fc-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': '217c201b8e6fd403d9ec326adda18d78a4536fac69194e58dc57b634175d6e94'}]}, 'timestamp': '2025-11-22 08:56:36.710732', '_unique_id': 'd3fca84660a4475b9479e04d69c1ddf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.711 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.712 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.712 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.read.latency volume: 1054586717 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.712 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/disk.device.read.latency volume: 238188206 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cc018d8-5f5e-4a9e-8709-99f6c412b5e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054586717, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-vda', 'timestamp': '2025-11-22T08:56:36.712435', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '26fa35e6-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': 'fcfb818f60bcc69245cfc12f92829c7e2f1540c1c988a6528a72907b75d85bd4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 238188206, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 
'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9-sda', 'timestamp': '2025-11-22T08:56:36.712435', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'instance-000000ba', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '26fa3f78-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.303034518, 'message_signature': 'e0d4b5f1253c542605b4f090f7e3e64e0aefa686694b8e2aaa0b7de54d41956a'}]}, 'timestamp': '2025-11-22 08:56:36.712942', '_unique_id': '8c9ac5780edd4958a6a6c643c7cecbfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.713 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.714 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.714 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.incoming.bytes volume: 1574 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '747dae56-b5ac-43dc-9f39-79f6364b38e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1574, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.714950', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26fa9874-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': '3e0b5fa524fb2c143e9d9ecdd0443fbe409e90d0f57f5858819d3f123f6804ff'}]}, 'timestamp': '2025-11-22 08:56:36.715245', '_unique_id': '5c3740b06367479eac7a8317de1f5041'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.715 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.716 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.716 12 DEBUG ceilometer.compute.pollsters [-] f4bd920b-a036-4c96-9c58-797037cb9df9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c0b411e-c911-40e2-bbaf-bbcf21930e56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7bb85b33f2b44468ab5d86bf5ba98421', 'user_name': None, 'project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'project_name': None, 'resource_id': 'instance-000000ba-f4bd920b-a036-4c96-9c58-797037cb9df9-tap0fc6a761-74', 'timestamp': '2025-11-22T08:56:36.716894', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660', 'name': 'tap0fc6a761-74', 'instance_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'instance_type': 'm1.nano', 'host': 'dd385e04e2879c1671229d4e2e747f71264f7a16b982bacfa5dbb804', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:a5:4e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0fc6a761-74'}, 'message_id': '26fae46e-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8537.334330268, 'message_signature': 'ede6cf9ba5e1fa88e754363a304903400265b6dca7e66d111b8f0d2897054558'}]}, 'timestamp': '2025-11-22 08:56:36.717188', '_unique_id': 'e21075a46aca4889a04cfa41a3b4cdd3'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:56:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:56:36.717 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:56:36 compute-0 nova_compute[186544]: 2025-11-22 08:56:36.969 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:37.388 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:37.389 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:56:37.390 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:38 compute-0 nova_compute[186544]: 2025-11-22 08:56:38.158 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updating instance_info_cache with network_info: [{"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:56:38 compute-0 nova_compute[186544]: 2025-11-22 08:56:38.179 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:56:38 compute-0 nova_compute[186544]: 2025-11-22 08:56:38.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:56:39 compute-0 nova_compute[186544]: 2025-11-22 08:56:39.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:39 compute-0 nova_compute[186544]: 2025-11-22 08:56:39.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.186 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.187 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.188 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.188 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.252 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.307 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.308 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.366 186548 DEBUG oslo_concurrency.processutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.506 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.507 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5520MB free_disk=73.10310745239258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.507 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.507 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.577 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Instance f4bd920b-a036-4c96-9c58-797037cb9df9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.577 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.578 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.600 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.615 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.616 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.630 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.654 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.700 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.712 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.733 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:56:40 compute-0 nova_compute[186544]: 2025-11-22 08:56:40.734 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:56:41 compute-0 nova_compute[186544]: 2025-11-22 08:56:41.690 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:41 compute-0 nova_compute[186544]: 2025-11-22 08:56:41.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:44 compute-0 nova_compute[186544]: 2025-11-22 08:56:44.734 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:46 compute-0 nova_compute[186544]: 2025-11-22 08:56:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:46 compute-0 podman[257494]: 2025-11-22 08:56:46.4075946 +0000 UTC m=+0.059399983 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:56:46 compute-0 podman[257496]: 2025-11-22 08:56:46.429436187 +0000 UTC m=+0.063477862 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:56:46 compute-0 podman[257495]: 2025-11-22 08:56:46.439919425 +0000 UTC m=+0.088035307 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal)
Nov 22 08:56:46 compute-0 nova_compute[186544]: 2025-11-22 08:56:46.694 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:46 compute-0 nova_compute[186544]: 2025-11-22 08:56:46.972 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:47 compute-0 nova_compute[186544]: 2025-11-22 08:56:47.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:49 compute-0 nova_compute[186544]: 2025-11-22 08:56:49.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:50 compute-0 nova_compute[186544]: 2025-11-22 08:56:50.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:51 compute-0 nova_compute[186544]: 2025-11-22 08:56:51.697 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:51 compute-0 nova_compute[186544]: 2025-11-22 08:56:51.975 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:52 compute-0 nova_compute[186544]: 2025-11-22 08:56:52.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:56:56 compute-0 nova_compute[186544]: 2025-11-22 08:56:56.701 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:56:56 compute-0 nova_compute[186544]: 2025-11-22 08:56:56.976 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:01 compute-0 nova_compute[186544]: 2025-11-22 08:57:01.706 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:01 compute-0 nova_compute[186544]: 2025-11-22 08:57:01.978 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:04 compute-0 nova_compute[186544]: 2025-11-22 08:57:04.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:05 compute-0 podman[257560]: 2025-11-22 08:57:05.410147425 +0000 UTC m=+0.054483551 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 22 08:57:05 compute-0 podman[257559]: 2025-11-22 08:57:05.41114156 +0000 UTC m=+0.060012028 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 08:57:05 compute-0 podman[257561]: 2025-11-22 08:57:05.418691316 +0000 UTC m=+0.057309121 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:57:05 compute-0 podman[257566]: 2025-11-22 08:57:05.450075027 +0000 UTC m=+0.084222032 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:57:06 compute-0 nova_compute[186544]: 2025-11-22 08:57:06.709 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:06 compute-0 nova_compute[186544]: 2025-11-22 08:57:06.979 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:11 compute-0 nova_compute[186544]: 2025-11-22 08:57:11.711 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:11 compute-0 nova_compute[186544]: 2025-11-22 08:57:11.982 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:16 compute-0 nova_compute[186544]: 2025-11-22 08:57:16.714 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:16 compute-0 nova_compute[186544]: 2025-11-22 08:57:16.984 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:17 compute-0 podman[257643]: 2025-11-22 08:57:17.414504862 +0000 UTC m=+0.061317659 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:57:17 compute-0 podman[257645]: 2025-11-22 08:57:17.428356233 +0000 UTC m=+0.067807408 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:57:17 compute-0 podman[257644]: 2025-11-22 08:57:17.46362115 +0000 UTC m=+0.104636325 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 08:57:21 compute-0 nova_compute[186544]: 2025-11-22 08:57:21.717 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:21 compute-0 nova_compute[186544]: 2025-11-22 08:57:21.986 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:25 compute-0 ovn_controller[94843]: 2025-11-22T08:57:25Z|00933|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 22 08:57:26 compute-0 nova_compute[186544]: 2025-11-22 08:57:26.719 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:26 compute-0 nova_compute[186544]: 2025-11-22 08:57:26.987 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:30 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:30.217 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:57:30 compute-0 nova_compute[186544]: 2025-11-22 08:57:30.217 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:30 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:30.218 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.183 186548 DEBUG nova.compute.manager [req-e8869585-cb6b-40ba-8f00-306c5d76176d req-f8c77018-5dab-4802-a87f-3f3a802cc635 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-changed-0fc6a761-743b-450a-97e3-0b4cb62012d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.184 186548 DEBUG nova.compute.manager [req-e8869585-cb6b-40ba-8f00-306c5d76176d req-f8c77018-5dab-4802-a87f-3f3a802cc635 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Refreshing instance network info cache due to event network-changed-0fc6a761-743b-450a-97e3-0b4cb62012d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.184 186548 DEBUG oslo_concurrency.lockutils [req-e8869585-cb6b-40ba-8f00-306c5d76176d req-f8c77018-5dab-4802-a87f-3f3a802cc635 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.184 186548 DEBUG oslo_concurrency.lockutils [req-e8869585-cb6b-40ba-8f00-306c5d76176d req-f8c77018-5dab-4802-a87f-3f3a802cc635 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.184 186548 DEBUG nova.network.neutron [req-e8869585-cb6b-40ba-8f00-306c5d76176d req-f8c77018-5dab-4802-a87f-3f3a802cc635 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Refreshing network info cache for port 0fc6a761-743b-450a-97e3-0b4cb62012d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.260 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "f4bd920b-a036-4c96-9c58-797037cb9df9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.261 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.262 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.262 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.262 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.270 186548 INFO nova.compute.manager [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Terminating instance
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.277 186548 DEBUG nova.compute.manager [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:57:31 compute-0 kernel: tap0fc6a761-74 (unregistering): left promiscuous mode
Nov 22 08:57:31 compute-0 NetworkManager[55036]: <info>  [1763801851.3173] device (tap0fc6a761-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:57:31 compute-0 ovn_controller[94843]: 2025-11-22T08:57:31Z|00934|binding|INFO|Releasing lport 0fc6a761-743b-450a-97e3-0b4cb62012d4 from this chassis (sb_readonly=0)
Nov 22 08:57:31 compute-0 ovn_controller[94843]: 2025-11-22T08:57:31Z|00935|binding|INFO|Setting lport 0fc6a761-743b-450a-97e3-0b4cb62012d4 down in Southbound
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.325 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:31 compute-0 ovn_controller[94843]: 2025-11-22T08:57:31Z|00936|binding|INFO|Removing iface tap0fc6a761-74 ovn-installed in OVS
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.327 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:31.340 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:a5:4e 10.100.0.5'], port_security=['fa:16:3e:84:a5:4e 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f4bd920b-a036-4c96-9c58-797037cb9df9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5da13b07bb34fc3b4cd1452f7dd6971', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b9b6088d-9774-4171-9ae0-83f685fc1451 c37d3c74-274b-44a5-ab29-e1041fb93c1c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71640610-572c-4a8b-b43c-9737c485843b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=0fc6a761-743b-450a-97e3-0b4cb62012d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:57:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:31.341 103805 INFO neutron.agent.ovn.metadata.agent [-] Port 0fc6a761-743b-450a-97e3-0b4cb62012d4 in datapath 13f06a8d-f6ae-46ea-b973-f89bfd41893c unbound from our chassis
Nov 22 08:57:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:31.342 103805 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 13f06a8d-f6ae-46ea-b973-f89bfd41893c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 22 08:57:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:31.343 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[83da5a07-502c-4958-84c2-35bf92d35405]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:57:31 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:31.344 103805 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c namespace which is not needed anymore
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.346 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:31 compute-0 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000ba.scope: Deactivated successfully.
Nov 22 08:57:31 compute-0 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000ba.scope: Consumed 16.592s CPU time.
Nov 22 08:57:31 compute-0 systemd-machined[152872]: Machine qemu-101-instance-000000ba terminated.
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.497 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.501 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.532 186548 INFO nova.virt.libvirt.driver [-] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Instance destroyed successfully.
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.532 186548 DEBUG nova.objects.instance [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lazy-loading 'resources' on Instance uuid f4bd920b-a036-4c96-9c58-797037cb9df9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.551 186548 DEBUG nova.virt.libvirt.vif [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:56:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-588574044-access_point-714377660',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-588574044-acc',id=186,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPHag9aku7GBAkvmU1fN0BEWX+GKKnlz+wxkaBB81usyGYFIYXdms2nPbWtkH6Pt7jECf5QAIbXgbB8vKLjCswltA0JVkNEQJtQT24F3RbwiywHh6gfKMrOlaLUdm6xTrw==',key_name='tempest-TestSecurityGroupsBasicOps-1099778393',keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:56:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5da13b07bb34fc3b4cd1452f7dd6971',ramdisk_id='',reservation_id='r-25tubkmi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-588574044',owner_user_name='tempest-TestSecurityGroupsBasicOps-588574044-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:56:21Z,user_data=None,user_id='7bb85b33f2b44468ab5d86bf5ba98421',uuid=f4bd920b-a036-4c96-9c58-797037cb9df9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.551 186548 DEBUG nova.network.os_vif_util [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converting VIF {"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.552 186548 DEBUG nova.network.os_vif_util [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:a5:4e,bridge_name='br-int',has_traffic_filtering=True,id=0fc6a761-743b-450a-97e3-0b4cb62012d4,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fc6a761-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.552 186548 DEBUG os_vif [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:a5:4e,bridge_name='br-int',has_traffic_filtering=True,id=0fc6a761-743b-450a-97e3-0b4cb62012d4,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fc6a761-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.554 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.554 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0fc6a761-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.555 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.556 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.558 186548 INFO os_vif [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:a5:4e,bridge_name='br-int',has_traffic_filtering=True,id=0fc6a761-743b-450a-97e3-0b4cb62012d4,network=Network(13f06a8d-f6ae-46ea-b973-f89bfd41893c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fc6a761-74')
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.559 186548 INFO nova.virt.libvirt.driver [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Deleting instance files /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9_del
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.560 186548 INFO nova.virt.libvirt.driver [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Deletion of /var/lib/nova/instances/f4bd920b-a036-4c96-9c58-797037cb9df9_del complete
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.675 186548 INFO nova.compute.manager [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.676 186548 DEBUG oslo.service.loopingcall [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.676 186548 DEBUG nova.compute.manager [-] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.676 186548 DEBUG nova.network.neutron [-] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:57:31 compute-0 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[257369]: [NOTICE]   (257373) : haproxy version is 2.8.14-c23fe91
Nov 22 08:57:31 compute-0 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[257369]: [NOTICE]   (257373) : path to executable is /usr/sbin/haproxy
Nov 22 08:57:31 compute-0 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[257369]: [WARNING]  (257373) : Exiting Master process...
Nov 22 08:57:31 compute-0 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[257369]: [ALERT]    (257373) : Current worker (257375) exited with code 143 (Terminated)
Nov 22 08:57:31 compute-0 neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c[257369]: [WARNING]  (257373) : All workers exited. Exiting... (0)
Nov 22 08:57:31 compute-0 systemd[1]: libpod-0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9.scope: Deactivated successfully.
Nov 22 08:57:31 compute-0 podman[257728]: 2025-11-22 08:57:31.702560845 +0000 UTC m=+0.273426896 container died 0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.713 186548 DEBUG nova.compute.manager [req-962ab2b8-786e-41a6-af8f-4b3af4714156 req-840f42b1-813d-461e-887b-03250d85ff1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-vif-unplugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.713 186548 DEBUG oslo_concurrency.lockutils [req-962ab2b8-786e-41a6-af8f-4b3af4714156 req-840f42b1-813d-461e-887b-03250d85ff1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.714 186548 DEBUG oslo_concurrency.lockutils [req-962ab2b8-786e-41a6-af8f-4b3af4714156 req-840f42b1-813d-461e-887b-03250d85ff1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.714 186548 DEBUG oslo_concurrency.lockutils [req-962ab2b8-786e-41a6-af8f-4b3af4714156 req-840f42b1-813d-461e-887b-03250d85ff1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.714 186548 DEBUG nova.compute.manager [req-962ab2b8-786e-41a6-af8f-4b3af4714156 req-840f42b1-813d-461e-887b-03250d85ff1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] No waiting events found dispatching network-vif-unplugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.715 186548 DEBUG nova.compute.manager [req-962ab2b8-786e-41a6-af8f-4b3af4714156 req-840f42b1-813d-461e-887b-03250d85ff1c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-vif-unplugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 22 08:57:31 compute-0 nova_compute[186544]: 2025-11-22 08:57:31.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.366 186548 DEBUG nova.network.neutron [-] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.379 186548 INFO nova.compute.manager [-] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Took 0.70 seconds to deallocate network for instance.
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.483 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.483 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:57:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9-userdata-shm.mount: Deactivated successfully.
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.555 186548 DEBUG nova.compute.provider_tree [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:57:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-7eca7587e49fa0bc1d3bdefe821f2c00bbd053f711809662746060e33e1dc906-merged.mount: Deactivated successfully.
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.569 186548 DEBUG nova.scheduler.client.report [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.594 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.661 186548 INFO nova.scheduler.client.report [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Deleted allocations for instance f4bd920b-a036-4c96-9c58-797037cb9df9
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.733 186548 DEBUG nova.network.neutron [req-e8869585-cb6b-40ba-8f00-306c5d76176d req-f8c77018-5dab-4802-a87f-3f3a802cc635 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updated VIF entry in instance network info cache for port 0fc6a761-743b-450a-97e3-0b4cb62012d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.734 186548 DEBUG nova.network.neutron [req-e8869585-cb6b-40ba-8f00-306c5d76176d req-f8c77018-5dab-4802-a87f-3f3a802cc635 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Updating instance_info_cache with network_info: [{"id": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "address": "fa:16:3e:84:a5:4e", "network": {"id": "13f06a8d-f6ae-46ea-b973-f89bfd41893c", "bridge": "br-int", "label": "tempest-network-smoke--1719566022", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5da13b07bb34fc3b4cd1452f7dd6971", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fc6a761-74", "ovs_interfaceid": "0fc6a761-743b-450a-97e3-0b4cb62012d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.753 186548 DEBUG oslo_concurrency.lockutils [None req-7b191d3b-9711-462f-b347-d526cbec1c15 7bb85b33f2b44468ab5d86bf5ba98421 b5da13b07bb34fc3b4cd1452f7dd6971 - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.492s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:57:32 compute-0 nova_compute[186544]: 2025-11-22 08:57:32.758 186548 DEBUG oslo_concurrency.lockutils [req-e8869585-cb6b-40ba-8f00-306c5d76176d req-f8c77018-5dab-4802-a87f-3f3a802cc635 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-f4bd920b-a036-4c96-9c58-797037cb9df9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:57:33 compute-0 podman[257728]: 2025-11-22 08:57:33.030993773 +0000 UTC m=+1.601859824 container cleanup 0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 22 08:57:33 compute-0 systemd[1]: libpod-conmon-0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9.scope: Deactivated successfully.
Nov 22 08:57:33 compute-0 podman[257775]: 2025-11-22 08:57:33.513222635 +0000 UTC m=+0.461910584 container remove 0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.518 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[103e4239-d68c-47b5-9095-abb932446284]: (4, ('Sat Nov 22 08:57:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c (0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9)\n0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9\nSat Nov 22 08:57:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c (0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9)\n0e52b0714c2242e375c6ec80746ad6b6eb9e3da84dae05012eb5b5d540a435f9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.521 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f3baad-d89e-472b-bcbe-7415200735a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.522 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13f06a8d-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.524 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:33 compute-0 kernel: tap13f06a8d-f0: left promiscuous mode
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.537 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.541 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[abb7eb56-086a-481c-b350-1110abc422c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.556 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[39527041-59cf-461a-a8e3-11ef6873c763]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.557 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[610c2fe3-94aa-4ac4-ba48-9ea43ae1f953]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.571 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[d5543232-02f1-45b0-8c3a-8514327ddfbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 851978, 'reachable_time': 34247, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257790, 'error': None, 'target': 'ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.574 103918 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-13f06a8d-f6ae-46ea-b973-f89bfd41893c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 22 08:57:33 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:33.574 103918 DEBUG oslo.privsep.daemon [-] privsep: reply[a95ca1ff-353e-49fa-8de0-80bba423469b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:57:33 compute-0 systemd[1]: run-netns-ovnmeta\x2d13f06a8d\x2df6ae\x2d46ea\x2db973\x2df89bfd41893c.mount: Deactivated successfully.
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.830 186548 DEBUG nova.compute.manager [req-83202ff2-883c-47f6-8d80-033cd59cade7 req-2c1f92b4-01d8-490b-9335-75525db0462c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.831 186548 DEBUG oslo_concurrency.lockutils [req-83202ff2-883c-47f6-8d80-033cd59cade7 req-2c1f92b4-01d8-490b-9335-75525db0462c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.831 186548 DEBUG oslo_concurrency.lockutils [req-83202ff2-883c-47f6-8d80-033cd59cade7 req-2c1f92b4-01d8-490b-9335-75525db0462c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.831 186548 DEBUG oslo_concurrency.lockutils [req-83202ff2-883c-47f6-8d80-033cd59cade7 req-2c1f92b4-01d8-490b-9335-75525db0462c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f4bd920b-a036-4c96-9c58-797037cb9df9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.831 186548 DEBUG nova.compute.manager [req-83202ff2-883c-47f6-8d80-033cd59cade7 req-2c1f92b4-01d8-490b-9335-75525db0462c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] No waiting events found dispatching network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.832 186548 WARNING nova.compute.manager [req-83202ff2-883c-47f6-8d80-033cd59cade7 req-2c1f92b4-01d8-490b-9335-75525db0462c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received unexpected event network-vif-plugged-0fc6a761-743b-450a-97e3-0b4cb62012d4 for instance with vm_state deleted and task_state None.
Nov 22 08:57:33 compute-0 nova_compute[186544]: 2025-11-22 08:57:33.832 186548 DEBUG nova.compute.manager [req-83202ff2-883c-47f6-8d80-033cd59cade7 req-2c1f92b4-01d8-490b-9335-75525db0462c 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Received event network-vif-deleted-0fc6a761-743b-450a-97e3-0b4cb62012d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:57:35 compute-0 nova_compute[186544]: 2025-11-22 08:57:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:35 compute-0 nova_compute[186544]: 2025-11-22 08:57:35.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:57:35 compute-0 nova_compute[186544]: 2025-11-22 08:57:35.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:57:35 compute-0 nova_compute[186544]: 2025-11-22 08:57:35.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:57:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:35.220 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:57:36 compute-0 podman[257792]: 2025-11-22 08:57:36.422370365 +0000 UTC m=+0.063344049 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 22 08:57:36 compute-0 podman[257793]: 2025-11-22 08:57:36.434534624 +0000 UTC m=+0.071649944 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:57:36 compute-0 podman[257791]: 2025-11-22 08:57:36.442917961 +0000 UTC m=+0.083880465 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:57:36 compute-0 podman[257794]: 2025-11-22 08:57:36.464408629 +0000 UTC m=+0.097437678 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:57:36 compute-0 nova_compute[186544]: 2025-11-22 08:57:36.556 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:36 compute-0 nova_compute[186544]: 2025-11-22 08:57:36.993 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:37.389 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:57:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:37.390 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:57:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:57:37.390 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:57:37 compute-0 nova_compute[186544]: 2025-11-22 08:57:37.722 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:37 compute-0 nova_compute[186544]: 2025-11-22 08:57:37.790 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:41 compute-0 nova_compute[186544]: 2025-11-22 08:57:41.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:41 compute-0 nova_compute[186544]: 2025-11-22 08:57:41.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:57:41 compute-0 nova_compute[186544]: 2025-11-22 08:57:41.559 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:41 compute-0 nova_compute[186544]: 2025-11-22 08:57:41.994 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.191 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.191 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.337 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.338 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.13178253173828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.338 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:57:42 compute-0 nova_compute[186544]: 2025-11-22 08:57:42.338 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:57:43 compute-0 nova_compute[186544]: 2025-11-22 08:57:43.353 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:57:43 compute-0 nova_compute[186544]: 2025-11-22 08:57:43.353 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:57:43 compute-0 nova_compute[186544]: 2025-11-22 08:57:43.589 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:57:43 compute-0 nova_compute[186544]: 2025-11-22 08:57:43.604 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:57:43 compute-0 nova_compute[186544]: 2025-11-22 08:57:43.629 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:57:43 compute-0 nova_compute[186544]: 2025-11-22 08:57:43.630 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:57:46 compute-0 nova_compute[186544]: 2025-11-22 08:57:46.533 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801851.5303261, f4bd920b-a036-4c96-9c58-797037cb9df9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:57:46 compute-0 nova_compute[186544]: 2025-11-22 08:57:46.534 186548 INFO nova.compute.manager [-] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] VM Stopped (Lifecycle Event)
Nov 22 08:57:46 compute-0 nova_compute[186544]: 2025-11-22 08:57:46.553 186548 DEBUG nova.compute.manager [None req-6b6754ff-0d7f-4f0f-be6b-5d4497f6fac9 - - - - - -] [instance: f4bd920b-a036-4c96-9c58-797037cb9df9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:57:46 compute-0 nova_compute[186544]: 2025-11-22 08:57:46.562 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:46 compute-0 nova_compute[186544]: 2025-11-22 08:57:46.995 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:47 compute-0 nova_compute[186544]: 2025-11-22 08:57:47.631 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:48 compute-0 nova_compute[186544]: 2025-11-22 08:57:48.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:48 compute-0 nova_compute[186544]: 2025-11-22 08:57:48.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:48 compute-0 podman[257879]: 2025-11-22 08:57:48.418828989 +0000 UTC m=+0.058071690 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 08:57:48 compute-0 podman[257880]: 2025-11-22 08:57:48.443525035 +0000 UTC m=+0.079031784 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 08:57:48 compute-0 podman[257881]: 2025-11-22 08:57:48.452291772 +0000 UTC m=+0.080409880 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:57:50 compute-0 nova_compute[186544]: 2025-11-22 08:57:50.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:51 compute-0 nova_compute[186544]: 2025-11-22 08:57:51.566 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:51 compute-0 nova_compute[186544]: 2025-11-22 08:57:51.997 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:52 compute-0 nova_compute[186544]: 2025-11-22 08:57:52.597 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:53 compute-0 nova_compute[186544]: 2025-11-22 08:57:53.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:57:56 compute-0 nova_compute[186544]: 2025-11-22 08:57:56.570 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:57:57 compute-0 nova_compute[186544]: 2025-11-22 08:57:56.999 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:01 compute-0 nova_compute[186544]: 2025-11-22 08:58:01.573 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:02 compute-0 nova_compute[186544]: 2025-11-22 08:58:02.001 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.217 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.218 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.240 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.391 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.391 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.398 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.398 186548 INFO nova.compute.claims [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Claim successful on node compute-0.ctlplane.example.com
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.577 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.703 186548 DEBUG nova.compute.provider_tree [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.721 186548 DEBUG nova.scheduler.client.report [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.745 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.353s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.745 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.936 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.936 186548 DEBUG nova.network.neutron [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.959 186548 INFO nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 22 08:58:06 compute-0 nova_compute[186544]: 2025-11-22 08:58:06.979 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.002 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.097 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.098 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.099 186548 INFO nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Creating image(s)
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.099 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "/var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.099 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "/var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.100 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "/var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.112 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.168 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.169 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "169b85625b85d2ad681b52460a0c196a18b2a726" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.169 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.179 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.236 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.237 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.389 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726,backing_fmt=raw /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk 1073741824" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.390 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "169b85625b85d2ad681b52460a0c196a18b2a726" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.390 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:58:07 compute-0 podman[257951]: 2025-11-22 08:58:07.421699529 +0000 UTC m=+0.054789419 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 08:58:07 compute-0 podman[257950]: 2025-11-22 08:58:07.436535234 +0000 UTC m=+0.076274458 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.448 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/169b85625b85d2ad681b52460a0c196a18b2a726 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.449 186548 DEBUG nova.virt.disk.api [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Checking if we can resize image /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.451 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:58:07 compute-0 podman[257949]: 2025-11-22 08:58:07.46161296 +0000 UTC m=+0.105569277 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 22 08:58:07 compute-0 podman[257962]: 2025-11-22 08:58:07.474242871 +0000 UTC m=+0.099116369 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.506 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.507 186548 DEBUG nova.virt.disk.api [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Cannot resize image /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.508 186548 DEBUG nova.objects.instance [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'migration_context' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.527 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.527 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Ensure instance console log exists: /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.528 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.528 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.528 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:07 compute-0 nova_compute[186544]: 2025-11-22 08:58:07.976 186548 DEBUG nova.policy [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 22 08:58:11 compute-0 nova_compute[186544]: 2025-11-22 08:58:11.381 186548 DEBUG nova.network.neutron [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Successfully created port: d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 22 08:58:11 compute-0 nova_compute[186544]: 2025-11-22 08:58:11.581 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:12 compute-0 nova_compute[186544]: 2025-11-22 08:58:12.005 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:13 compute-0 nova_compute[186544]: 2025-11-22 08:58:13.053 186548 DEBUG nova.network.neutron [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Successfully updated port: d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 22 08:58:13 compute-0 nova_compute[186544]: 2025-11-22 08:58:13.067 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:58:13 compute-0 nova_compute[186544]: 2025-11-22 08:58:13.068 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquired lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:58:13 compute-0 nova_compute[186544]: 2025-11-22 08:58:13.069 186548 DEBUG nova.network.neutron [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:58:13 compute-0 nova_compute[186544]: 2025-11-22 08:58:13.219 186548 DEBUG nova.compute.manager [req-9fe953fa-fc98-44eb-a2ab-5704e4dc2253 req-c6c7de23-2e93-4c3c-bf3a-84c667731e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-changed-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:13 compute-0 nova_compute[186544]: 2025-11-22 08:58:13.220 186548 DEBUG nova.compute.manager [req-9fe953fa-fc98-44eb-a2ab-5704e4dc2253 req-c6c7de23-2e93-4c3c-bf3a-84c667731e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Refreshing instance network info cache due to event network-changed-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 22 08:58:13 compute-0 nova_compute[186544]: 2025-11-22 08:58:13.220 186548 DEBUG oslo_concurrency.lockutils [req-9fe953fa-fc98-44eb-a2ab-5704e4dc2253 req-c6c7de23-2e93-4c3c-bf3a-84c667731e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:58:13 compute-0 nova_compute[186544]: 2025-11-22 08:58:13.249 186548 DEBUG nova.network.neutron [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.158 186548 DEBUG nova.network.neutron [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Updating instance_info_cache with network_info: [{"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.278 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Releasing lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.279 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance network_info: |[{"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.280 186548 DEBUG oslo_concurrency.lockutils [req-9fe953fa-fc98-44eb-a2ab-5704e4dc2253 req-c6c7de23-2e93-4c3c-bf3a-84c667731e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquired lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.281 186548 DEBUG nova.network.neutron [req-9fe953fa-fc98-44eb-a2ab-5704e4dc2253 req-c6c7de23-2e93-4c3c-bf3a-84c667731e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Refreshing network info cache for port d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.286 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Start _get_guest_xml network_info=[{"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'image_id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.291 186548 WARNING nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.301 186548 DEBUG nova.virt.libvirt.host [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.302 186548 DEBUG nova.virt.libvirt.host [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.304 186548 DEBUG nova.virt.libvirt.host [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.305 186548 DEBUG nova.virt.libvirt.host [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.306 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.307 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-22T07:38:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='31612188-3cd6-428b-9166-9568f0affd4a',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-22T07:38:59Z,direct_url=<?>,disk_format='qcow2',id=eb6eb4ac-7956-4021-b3a0-d612ae61d38c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='cb198b45e9034b108a19399d19c6cf14',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-22T07:39:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.307 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.307 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.308 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.308 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.308 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.308 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.309 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.309 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.309 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.309 186548 DEBUG nova.virt.hardware [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.313 186548 DEBUG nova.virt.libvirt.vif [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:58:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1645058772',display_name='tempest-TestServerAdvancedOps-server-1645058772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1645058772',id=188,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='664a512879714e69beefb15b448f9498',ramdisk_id='',reservation_id='r-djuyjwq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-1894421725',owner_user_name='tempest-TestServerAdvancedOps
-1894421725-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:58:07Z,user_data=None,user_id='b31993e3b9ba48dc95406061036b537a',uuid=f54acf85-11a3-44be-a962-ac430c2f1756,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.313 186548 DEBUG nova.network.os_vif_util [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converting VIF {"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.314 186548 DEBUG nova.network.os_vif_util [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.314 186548 DEBUG nova.objects.instance [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'pci_devices' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.329 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] End _get_guest_xml xml=<domain type="kvm">
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <uuid>f54acf85-11a3-44be-a962-ac430c2f1756</uuid>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <name>instance-000000bc</name>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <memory>131072</memory>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <vcpu>1</vcpu>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <metadata>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <nova:name>tempest-TestServerAdvancedOps-server-1645058772</nova:name>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <nova:creationTime>2025-11-22 08:58:15</nova:creationTime>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <nova:flavor name="m1.nano">
Nov 22 08:58:15 compute-0 nova_compute[186544]:         <nova:memory>128</nova:memory>
Nov 22 08:58:15 compute-0 nova_compute[186544]:         <nova:disk>1</nova:disk>
Nov 22 08:58:15 compute-0 nova_compute[186544]:         <nova:swap>0</nova:swap>
Nov 22 08:58:15 compute-0 nova_compute[186544]:         <nova:ephemeral>0</nova:ephemeral>
Nov 22 08:58:15 compute-0 nova_compute[186544]:         <nova:vcpus>1</nova:vcpus>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       </nova:flavor>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <nova:owner>
Nov 22 08:58:15 compute-0 nova_compute[186544]:         <nova:user uuid="b31993e3b9ba48dc95406061036b537a">tempest-TestServerAdvancedOps-1894421725-project-member</nova:user>
Nov 22 08:58:15 compute-0 nova_compute[186544]:         <nova:project uuid="664a512879714e69beefb15b448f9498">tempest-TestServerAdvancedOps-1894421725</nova:project>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       </nova:owner>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <nova:root type="image" uuid="eb6eb4ac-7956-4021-b3a0-d612ae61d38c"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <nova:ports>
Nov 22 08:58:15 compute-0 nova_compute[186544]:         <nova:port uuid="d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273">
Nov 22 08:58:15 compute-0 nova_compute[186544]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:         </nova:port>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       </nova:ports>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </nova:instance>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   </metadata>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <sysinfo type="smbios">
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <system>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <entry name="manufacturer">RDO</entry>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <entry name="product">OpenStack Compute</entry>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <entry name="serial">f54acf85-11a3-44be-a962-ac430c2f1756</entry>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <entry name="uuid">f54acf85-11a3-44be-a962-ac430c2f1756</entry>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <entry name="family">Virtual Machine</entry>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </system>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   </sysinfo>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <os>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <boot dev="hd"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <smbios mode="sysinfo"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   </os>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <features>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <acpi/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <apic/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <vmcoreinfo/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   </features>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <clock offset="utc">
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <timer name="pit" tickpolicy="delay"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <timer name="hpet" present="no"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   </clock>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <cpu mode="custom" match="exact">
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <model>Nehalem</model>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <topology sockets="1" cores="1" threads="1"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   </cpu>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   <devices>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <disk type="file" device="disk">
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <target dev="vda" bus="virtio"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <disk type="file" device="cdrom">
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <driver name="qemu" type="raw" cache="none"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <source file="/var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk.config"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <target dev="sda" bus="sata"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </disk>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <interface type="ethernet">
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <mac address="fa:16:3e:e2:f2:39"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <driver name="vhost" rx_queue_size="512"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <mtu size="1442"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <target dev="tapd4bab24c-c6"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </interface>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <serial type="pty">
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <log file="/var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/console.log" append="off"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </serial>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <video>
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <model type="virtio"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </video>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <input type="tablet" bus="usb"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <rng model="virtio">
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <backend model="random">/dev/urandom</backend>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </rng>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="pci" model="pcie-root-port"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <controller type="usb" index="0"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     <memballoon model="virtio">
Nov 22 08:58:15 compute-0 nova_compute[186544]:       <stats period="10"/>
Nov 22 08:58:15 compute-0 nova_compute[186544]:     </memballoon>
Nov 22 08:58:15 compute-0 nova_compute[186544]:   </devices>
Nov 22 08:58:15 compute-0 nova_compute[186544]: </domain>
Nov 22 08:58:15 compute-0 nova_compute[186544]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.330 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Preparing to wait for external event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.331 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.331 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.331 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.332 186548 DEBUG nova.virt.libvirt.vif [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-22T08:58:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1645058772',display_name='tempest-TestServerAdvancedOps-server-1645058772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1645058772',id=188,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='664a512879714e69beefb15b448f9498',ramdisk_id='',reservation_id='r-djuyjwq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-1894421725',owner_user_name='tempest-TestServerA
dvancedOps-1894421725-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-22T08:58:07Z,user_data=None,user_id='b31993e3b9ba48dc95406061036b537a',uuid=f54acf85-11a3-44be-a962-ac430c2f1756,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.332 186548 DEBUG nova.network.os_vif_util [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converting VIF {"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.332 186548 DEBUG nova.network.os_vif_util [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.333 186548 DEBUG os_vif [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.334 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.334 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.336 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.336 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4bab24c-c6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.337 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4bab24c-c6, col_values=(('external_ids', {'iface-id': 'd4bab24c-c6fc-4c6a-a1a2-ff38c93ac273', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:f2:39', 'vm-uuid': 'f54acf85-11a3-44be-a962-ac430c2f1756'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.338 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:15 compute-0 NetworkManager[55036]: <info>  [1763801895.3392] manager: (tapd4bab24c-c6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/443)
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.340 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.344 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.345 186548 INFO os_vif [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6')
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.563 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.564 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.564 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] No VIF found with MAC fa:16:3e:e2:f2:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.565 186548 INFO nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Using config drive
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.930 186548 INFO nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Creating config drive at /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk.config
Nov 22 08:58:15 compute-0 nova_compute[186544]: 2025-11-22 08:58:15.936 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5oblsmrh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.059 186548 DEBUG oslo_concurrency.processutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5oblsmrh" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 08:58:16 compute-0 kernel: tapd4bab24c-c6: entered promiscuous mode
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.117 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:16 compute-0 ovn_controller[94843]: 2025-11-22T08:58:16Z|00937|binding|INFO|Claiming lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for this chassis.
Nov 22 08:58:16 compute-0 ovn_controller[94843]: 2025-11-22T08:58:16Z|00938|binding|INFO|d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273: Claiming fa:16:3e:e2:f2:39 10.100.0.13
Nov 22 08:58:16 compute-0 NetworkManager[55036]: <info>  [1763801896.1185] manager: (tapd4bab24c-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/444)
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.121 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:16 compute-0 systemd-udevd[258053]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.151 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:16 compute-0 NetworkManager[55036]: <info>  [1763801896.1542] device (tapd4bab24c-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:58:16 compute-0 NetworkManager[55036]: <info>  [1763801896.1558] device (tapd4bab24c-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:58:16 compute-0 ovn_controller[94843]: 2025-11-22T08:58:16Z|00939|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 ovn-installed in OVS
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.156 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:16 compute-0 systemd-machined[152872]: New machine qemu-102-instance-000000bc.
Nov 22 08:58:16 compute-0 systemd[1]: Started Virtual Machine qemu-102-instance-000000bc.
Nov 22 08:58:16 compute-0 ovn_controller[94843]: 2025-11-22T08:58:16Z|00940|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 up in Southbound
Nov 22 08:58:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:16.199 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:f2:39 10.100.0.13'], port_security=['fa:16:3e:e2:f2:39 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5eb557a9-23d5-43ab-8781-2f9d2c26caba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '664a512879714e69beefb15b448f9498', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a9a52d92-7b69-4d8e-bd37-a0315ee8cf4a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd088f45-f47e-4599-af11-3d351df95ebf, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:58:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:16.200 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 in datapath 5eb557a9-23d5-43ab-8781-2f9d2c26caba bound to our chassis
Nov 22 08:58:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:16.201 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5eb557a9-23d5-43ab-8781-2f9d2c26caba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:58:16 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:16.203 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[40976198-2ef4-439a-89e0-5ea8a0dce3a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.722 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801896.72219, f54acf85-11a3-44be-a962-ac430c2f1756 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.723 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Started (Lifecycle Event)
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.745 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.750 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801896.7224119, f54acf85-11a3-44be-a962-ac430c2f1756 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.750 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Paused (Lifecycle Event)
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.766 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.768 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:58:16 compute-0 nova_compute[186544]: 2025-11-22 08:58:16.783 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.007 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.009 186548 DEBUG nova.network.neutron [req-9fe953fa-fc98-44eb-a2ab-5704e4dc2253 req-c6c7de23-2e93-4c3c-bf3a-84c667731e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Updated VIF entry in instance network info cache for port d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.010 186548 DEBUG nova.network.neutron [req-9fe953fa-fc98-44eb-a2ab-5704e4dc2253 req-c6c7de23-2e93-4c3c-bf3a-84c667731e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Updating instance_info_cache with network_info: [{"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.034 186548 DEBUG oslo_concurrency.lockutils [req-9fe953fa-fc98-44eb-a2ab-5704e4dc2253 req-c6c7de23-2e93-4c3c-bf3a-84c667731e21 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Releasing lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.097 186548 DEBUG nova.compute.manager [req-3717124c-6266-4f34-aeef-8b698c35cbae req-194c9447-fb16-4c2c-8ee7-a309e615695b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.098 186548 DEBUG oslo_concurrency.lockutils [req-3717124c-6266-4f34-aeef-8b698c35cbae req-194c9447-fb16-4c2c-8ee7-a309e615695b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.098 186548 DEBUG oslo_concurrency.lockutils [req-3717124c-6266-4f34-aeef-8b698c35cbae req-194c9447-fb16-4c2c-8ee7-a309e615695b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.099 186548 DEBUG oslo_concurrency.lockutils [req-3717124c-6266-4f34-aeef-8b698c35cbae req-194c9447-fb16-4c2c-8ee7-a309e615695b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.099 186548 DEBUG nova.compute.manager [req-3717124c-6266-4f34-aeef-8b698c35cbae req-194c9447-fb16-4c2c-8ee7-a309e615695b 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Processing event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.100 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.102 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801897.1025982, f54acf85-11a3-44be-a962-ac430c2f1756 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.103 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Resumed (Lifecycle Event)
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.105 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.108 186548 INFO nova.virt.libvirt.driver [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance spawned successfully.
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.108 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.122 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.126 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.130 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.130 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.131 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.131 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.132 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.132 186548 DEBUG nova.virt.libvirt.driver [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.154 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.239 186548 INFO nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Took 10.14 seconds to spawn the instance on the hypervisor.
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.240 186548 DEBUG nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.303 186548 INFO nova.compute.manager [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Took 10.97 seconds to build instance.
Nov 22 08:58:17 compute-0 nova_compute[186544]: 2025-11-22 08:58:17.322 186548 DEBUG oslo_concurrency.lockutils [None req-9c702b56-821f-4d18-8a3d-6f8974e3513f b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:19 compute-0 nova_compute[186544]: 2025-11-22 08:58:19.298 186548 DEBUG nova.compute.manager [req-cccbc545-e342-4a94-9b7a-409e17962a2a req-ba61705e-a6ed-4e63-a95e-73e8dedaebc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:19 compute-0 nova_compute[186544]: 2025-11-22 08:58:19.300 186548 DEBUG oslo_concurrency.lockutils [req-cccbc545-e342-4a94-9b7a-409e17962a2a req-ba61705e-a6ed-4e63-a95e-73e8dedaebc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:19 compute-0 nova_compute[186544]: 2025-11-22 08:58:19.301 186548 DEBUG oslo_concurrency.lockutils [req-cccbc545-e342-4a94-9b7a-409e17962a2a req-ba61705e-a6ed-4e63-a95e-73e8dedaebc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:19 compute-0 nova_compute[186544]: 2025-11-22 08:58:19.302 186548 DEBUG oslo_concurrency.lockutils [req-cccbc545-e342-4a94-9b7a-409e17962a2a req-ba61705e-a6ed-4e63-a95e-73e8dedaebc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:19 compute-0 nova_compute[186544]: 2025-11-22 08:58:19.302 186548 DEBUG nova.compute.manager [req-cccbc545-e342-4a94-9b7a-409e17962a2a req-ba61705e-a6ed-4e63-a95e-73e8dedaebc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:19 compute-0 nova_compute[186544]: 2025-11-22 08:58:19.303 186548 WARNING nova.compute.manager [req-cccbc545-e342-4a94-9b7a-409e17962a2a req-ba61705e-a6ed-4e63-a95e-73e8dedaebc8 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state active and task_state None.
Nov 22 08:58:19 compute-0 podman[258074]: 2025-11-22 08:58:19.424020846 +0000 UTC m=+0.060105710 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd)
Nov 22 08:58:19 compute-0 podman[258073]: 2025-11-22 08:58:19.424371154 +0000 UTC m=+0.064754363 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 22 08:58:19 compute-0 podman[258072]: 2025-11-22 08:58:19.451188025 +0000 UTC m=+0.091450601 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.257 186548 DEBUG nova.objects.instance [None req-01b9cefa-248e-4e6c-a625-aa8f47b81b34 b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'pci_devices' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.356 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801900.3563523, f54acf85-11a3-44be-a962-ac430c2f1756 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.357 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Paused (Lifecycle Event)
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.389 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.394 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.426 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 22 08:58:20 compute-0 kernel: tapd4bab24c-c6 (unregistering): left promiscuous mode
Nov 22 08:58:20 compute-0 NetworkManager[55036]: <info>  [1763801900.8375] device (tapd4bab24c-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:58:20 compute-0 ovn_controller[94843]: 2025-11-22T08:58:20Z|00941|binding|INFO|Releasing lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 from this chassis (sb_readonly=0)
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.847 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:20 compute-0 ovn_controller[94843]: 2025-11-22T08:58:20Z|00942|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 down in Southbound
Nov 22 08:58:20 compute-0 ovn_controller[94843]: 2025-11-22T08:58:20Z|00943|binding|INFO|Removing iface tapd4bab24c-c6 ovn-installed in OVS
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.850 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:20.858 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:f2:39 10.100.0.13'], port_security=['fa:16:3e:e2:f2:39 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5eb557a9-23d5-43ab-8781-2f9d2c26caba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '664a512879714e69beefb15b448f9498', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9a52d92-7b69-4d8e-bd37-a0315ee8cf4a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd088f45-f47e-4599-af11-3d351df95ebf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:58:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:20.859 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 in datapath 5eb557a9-23d5-43ab-8781-2f9d2c26caba unbound from our chassis
Nov 22 08:58:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:20.859 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5eb557a9-23d5-43ab-8781-2f9d2c26caba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:58:20 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:20.863 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[68fb6cd5-c936-48a9-87de-57772d2721dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:58:20 compute-0 nova_compute[186544]: 2025-11-22 08:58:20.866 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:20 compute-0 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Nov 22 08:58:20 compute-0 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000bc.scope: Consumed 3.808s CPU time.
Nov 22 08:58:20 compute-0 systemd-machined[152872]: Machine qemu-102-instance-000000bc terminated.
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.088 186548 DEBUG nova.compute.manager [None req-01b9cefa-248e-4e6c-a625-aa8f47b81b34 b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.406 186548 DEBUG nova.compute.manager [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.407 186548 DEBUG oslo_concurrency.lockutils [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.407 186548 DEBUG oslo_concurrency.lockutils [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.407 186548 DEBUG oslo_concurrency.lockutils [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.408 186548 DEBUG nova.compute.manager [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.408 186548 WARNING nova.compute.manager [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state suspended and task_state None.
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.408 186548 DEBUG nova.compute.manager [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.409 186548 DEBUG oslo_concurrency.lockutils [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.409 186548 DEBUG oslo_concurrency.lockutils [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.409 186548 DEBUG oslo_concurrency.lockutils [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.409 186548 DEBUG nova.compute.manager [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:21 compute-0 nova_compute[186544]: 2025-11-22 08:58:21.410 186548 WARNING nova.compute.manager [req-5ec3cc61-fac4-4277-a32a-226be309b858 req-e70aa945-54ef-4784-a13b-365eb61a9f18 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state suspended and task_state None.
Nov 22 08:58:22 compute-0 nova_compute[186544]: 2025-11-22 08:58:22.008 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:23 compute-0 nova_compute[186544]: 2025-11-22 08:58:23.225 186548 INFO nova.compute.manager [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Resuming
Nov 22 08:58:23 compute-0 nova_compute[186544]: 2025-11-22 08:58:23.226 186548 DEBUG nova.objects.instance [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'flavor' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:23 compute-0 nova_compute[186544]: 2025-11-22 08:58:23.301 186548 DEBUG oslo_concurrency.lockutils [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:58:23 compute-0 nova_compute[186544]: 2025-11-22 08:58:23.302 186548 DEBUG oslo_concurrency.lockutils [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquired lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:58:23 compute-0 nova_compute[186544]: 2025-11-22 08:58:23.302 186548 DEBUG nova.network.neutron [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.268 186548 DEBUG nova.network.neutron [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Updating instance_info_cache with network_info: [{"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.296 186548 DEBUG oslo_concurrency.lockutils [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Releasing lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.302 186548 DEBUG nova.virt.libvirt.vif [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:58:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1645058772',display_name='tempest-TestServerAdvancedOps-server-1645058772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1645058772',id=188,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:58:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='664a512879714e69beefb15b448f9498',ramdisk_id='',reservation_id='r-djuyjwq4',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-1894421725',owner_user_name='tempest-TestServerAdvancedOps-1894421725-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:58:21Z,user_data=None,user_id='b31993e3b9ba48dc95406061036b537a',uuid=f54acf85-11a3-44be-a962-ac430c2f1756,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.303 186548 DEBUG nova.network.os_vif_util [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converting VIF {"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.304 186548 DEBUG nova.network.os_vif_util [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.304 186548 DEBUG os_vif [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.305 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.305 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.305 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.310 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.310 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4bab24c-c6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.310 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4bab24c-c6, col_values=(('external_ids', {'iface-id': 'd4bab24c-c6fc-4c6a-a1a2-ff38c93ac273', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:f2:39', 'vm-uuid': 'f54acf85-11a3-44be-a962-ac430c2f1756'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.311 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.311 186548 INFO os_vif [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6')
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.341 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.484 186548 DEBUG nova.objects.instance [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'numa_topology' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:25 compute-0 kernel: tapd4bab24c-c6: entered promiscuous mode
Nov 22 08:58:25 compute-0 NetworkManager[55036]: <info>  [1763801905.5546] manager: (tapd4bab24c-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/445)
Nov 22 08:58:25 compute-0 ovn_controller[94843]: 2025-11-22T08:58:25Z|00944|binding|INFO|Claiming lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for this chassis.
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.556 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:25 compute-0 ovn_controller[94843]: 2025-11-22T08:58:25Z|00945|binding|INFO|d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273: Claiming fa:16:3e:e2:f2:39 10.100.0.13
Nov 22 08:58:25 compute-0 ovn_controller[94843]: 2025-11-22T08:58:25Z|00946|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 ovn-installed in OVS
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.616 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:25 compute-0 nova_compute[186544]: 2025-11-22 08:58:25.619 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:25 compute-0 systemd-udevd[258174]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:58:25 compute-0 NetworkManager[55036]: <info>  [1763801905.6349] device (tapd4bab24c-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:58:25 compute-0 NetworkManager[55036]: <info>  [1763801905.6359] device (tapd4bab24c-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:58:25 compute-0 systemd-machined[152872]: New machine qemu-103-instance-000000bc.
Nov 22 08:58:25 compute-0 systemd[1]: Started Virtual Machine qemu-103-instance-000000bc.
Nov 22 08:58:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:25.842 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:f2:39 10.100.0.13'], port_security=['fa:16:3e:e2:f2:39 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5eb557a9-23d5-43ab-8781-2f9d2c26caba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '664a512879714e69beefb15b448f9498', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a9a52d92-7b69-4d8e-bd37-a0315ee8cf4a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd088f45-f47e-4599-af11-3d351df95ebf, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:58:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:25.843 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 in datapath 5eb557a9-23d5-43ab-8781-2f9d2c26caba bound to our chassis
Nov 22 08:58:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:25.844 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5eb557a9-23d5-43ab-8781-2f9d2c26caba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:58:25 compute-0 ovn_controller[94843]: 2025-11-22T08:58:25Z|00947|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 up in Southbound
Nov 22 08:58:25 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:25.846 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[1bdc69ef-81f1-48fe-9a7f-19d21429317b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.341 186548 DEBUG nova.compute.manager [req-04842d82-c9bc-4488-9895-58c7c1746354 req-8f2b2660-46d4-42e8-8370-ee3f19402167 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.341 186548 DEBUG oslo_concurrency.lockutils [req-04842d82-c9bc-4488-9895-58c7c1746354 req-8f2b2660-46d4-42e8-8370-ee3f19402167 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.341 186548 DEBUG oslo_concurrency.lockutils [req-04842d82-c9bc-4488-9895-58c7c1746354 req-8f2b2660-46d4-42e8-8370-ee3f19402167 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.342 186548 DEBUG oslo_concurrency.lockutils [req-04842d82-c9bc-4488-9895-58c7c1746354 req-8f2b2660-46d4-42e8-8370-ee3f19402167 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.342 186548 DEBUG nova.compute.manager [req-04842d82-c9bc-4488-9895-58c7c1746354 req-8f2b2660-46d4-42e8-8370-ee3f19402167 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.342 186548 WARNING nova.compute.manager [req-04842d82-c9bc-4488-9895-58c7c1746354 req-8f2b2660-46d4-42e8-8370-ee3f19402167 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state suspended and task_state resuming.
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.474 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for f54acf85-11a3-44be-a962-ac430c2f1756 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.474 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801906.4737813, f54acf85-11a3-44be-a962-ac430c2f1756 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.474 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Started (Lifecycle Event)
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.505 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.517 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801906.5169218, f54acf85-11a3-44be-a962-ac430c2f1756 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.517 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Resumed (Lifecycle Event)
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.521 186548 DEBUG nova.compute.manager [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.521 186548 DEBUG nova.objects.instance [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'pci_devices' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.550 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.554 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.556 186548 INFO nova.virt.libvirt.driver [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance running successfully.
Nov 22 08:58:26 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.558 186548 DEBUG nova.virt.libvirt.guest [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.558 186548 DEBUG nova.compute.manager [None req-35ebaa39-d28e-418a-9788-bafa135e708c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:26 compute-0 nova_compute[186544]: 2025-11-22 08:58:26.584 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 22 08:58:27 compute-0 nova_compute[186544]: 2025-11-22 08:58:27.010 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:27 compute-0 nova_compute[186544]: 2025-11-22 08:58:27.772 186548 DEBUG nova.objects.instance [None req-6c72370a-ff77-440b-a205-11e04a87f59c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'pci_devices' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:27 compute-0 nova_compute[186544]: 2025-11-22 08:58:27.791 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801907.7910314, f54acf85-11a3-44be-a962-ac430c2f1756 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:27 compute-0 nova_compute[186544]: 2025-11-22 08:58:27.791 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Paused (Lifecycle Event)
Nov 22 08:58:27 compute-0 nova_compute[186544]: 2025-11-22 08:58:27.808 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:27 compute-0 nova_compute[186544]: 2025-11-22 08:58:27.812 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:58:27 compute-0 nova_compute[186544]: 2025-11-22 08:58:27.837 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 22 08:58:28 compute-0 kernel: tapd4bab24c-c6 (unregistering): left promiscuous mode
Nov 22 08:58:28 compute-0 NetworkManager[55036]: <info>  [1763801908.4178] device (tapd4bab24c-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:58:28 compute-0 ovn_controller[94843]: 2025-11-22T08:58:28Z|00948|binding|INFO|Releasing lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 from this chassis (sb_readonly=0)
Nov 22 08:58:28 compute-0 ovn_controller[94843]: 2025-11-22T08:58:28Z|00949|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 down in Southbound
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.426 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:28 compute-0 ovn_controller[94843]: 2025-11-22T08:58:28Z|00950|binding|INFO|Removing iface tapd4bab24c-c6 ovn-installed in OVS
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.429 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.441 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:28.441 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:f2:39 10.100.0.13'], port_security=['fa:16:3e:e2:f2:39 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5eb557a9-23d5-43ab-8781-2f9d2c26caba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '664a512879714e69beefb15b448f9498', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a9a52d92-7b69-4d8e-bd37-a0315ee8cf4a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd088f45-f47e-4599-af11-3d351df95ebf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:58:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:28.445 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 in datapath 5eb557a9-23d5-43ab-8781-2f9d2c26caba unbound from our chassis
Nov 22 08:58:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:28.446 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5eb557a9-23d5-43ab-8781-2f9d2c26caba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:58:28 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:28.447 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb2774b-e44b-4c23-ac1d-d78bbe495c65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:58:28 compute-0 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Nov 22 08:58:28 compute-0 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000bc.scope: Consumed 2.117s CPU time.
Nov 22 08:58:28 compute-0 systemd-machined[152872]: Machine qemu-103-instance-000000bc terminated.
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.497 186548 DEBUG nova.compute.manager [req-703a2da6-12da-43cb-bf14-57428301fbde req-fc710ce4-de69-44d7-86f6-a680b5b545f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.497 186548 DEBUG oslo_concurrency.lockutils [req-703a2da6-12da-43cb-bf14-57428301fbde req-fc710ce4-de69-44d7-86f6-a680b5b545f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.498 186548 DEBUG oslo_concurrency.lockutils [req-703a2da6-12da-43cb-bf14-57428301fbde req-fc710ce4-de69-44d7-86f6-a680b5b545f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.498 186548 DEBUG oslo_concurrency.lockutils [req-703a2da6-12da-43cb-bf14-57428301fbde req-fc710ce4-de69-44d7-86f6-a680b5b545f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.499 186548 DEBUG nova.compute.manager [req-703a2da6-12da-43cb-bf14-57428301fbde req-fc710ce4-de69-44d7-86f6-a680b5b545f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.499 186548 WARNING nova.compute.manager [req-703a2da6-12da-43cb-bf14-57428301fbde req-fc710ce4-de69-44d7-86f6-a680b5b545f6 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state active and task_state suspending.
Nov 22 08:58:28 compute-0 kernel: tapd4bab24c-c6: entered promiscuous mode
Nov 22 08:58:28 compute-0 kernel: tapd4bab24c-c6 (unregistering): left promiscuous mode
Nov 22 08:58:28 compute-0 NetworkManager[55036]: <info>  [1763801908.6219] manager: (tapd4bab24c-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/446)
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.626 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:28 compute-0 nova_compute[186544]: 2025-11-22 08:58:28.663 186548 DEBUG nova.compute.manager [None req-6c72370a-ff77-440b-a205-11e04a87f59c b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:30 compute-0 nova_compute[186544]: 2025-11-22 08:58:30.345 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:31 compute-0 nova_compute[186544]: 2025-11-22 08:58:31.040 186548 DEBUG nova.compute.manager [req-2819d3e6-107e-4181-9763-5b1a944dad81 req-0ebe7639-3b3c-448a-84df-c747d901c55e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:31 compute-0 nova_compute[186544]: 2025-11-22 08:58:31.041 186548 DEBUG oslo_concurrency.lockutils [req-2819d3e6-107e-4181-9763-5b1a944dad81 req-0ebe7639-3b3c-448a-84df-c747d901c55e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:31 compute-0 nova_compute[186544]: 2025-11-22 08:58:31.041 186548 DEBUG oslo_concurrency.lockutils [req-2819d3e6-107e-4181-9763-5b1a944dad81 req-0ebe7639-3b3c-448a-84df-c747d901c55e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:31 compute-0 nova_compute[186544]: 2025-11-22 08:58:31.041 186548 DEBUG oslo_concurrency.lockutils [req-2819d3e6-107e-4181-9763-5b1a944dad81 req-0ebe7639-3b3c-448a-84df-c747d901c55e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:31 compute-0 nova_compute[186544]: 2025-11-22 08:58:31.042 186548 DEBUG nova.compute.manager [req-2819d3e6-107e-4181-9763-5b1a944dad81 req-0ebe7639-3b3c-448a-84df-c747d901c55e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:31 compute-0 nova_compute[186544]: 2025-11-22 08:58:31.042 186548 WARNING nova.compute.manager [req-2819d3e6-107e-4181-9763-5b1a944dad81 req-0ebe7639-3b3c-448a-84df-c747d901c55e 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state suspended and task_state None.
Nov 22 08:58:32 compute-0 nova_compute[186544]: 2025-11-22 08:58:32.012 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:32.924 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:58:32 compute-0 nova_compute[186544]: 2025-11-22 08:58:32.926 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:32.926 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 08:58:32 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:32.927 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.138 186548 DEBUG nova.compute.manager [req-e1d97e5b-6971-49fe-bb34-c1d4ac7a847f req-712b681a-164b-4261-8da5-e9f199971f2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.139 186548 DEBUG oslo_concurrency.lockutils [req-e1d97e5b-6971-49fe-bb34-c1d4ac7a847f req-712b681a-164b-4261-8da5-e9f199971f2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.139 186548 DEBUG oslo_concurrency.lockutils [req-e1d97e5b-6971-49fe-bb34-c1d4ac7a847f req-712b681a-164b-4261-8da5-e9f199971f2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.140 186548 DEBUG oslo_concurrency.lockutils [req-e1d97e5b-6971-49fe-bb34-c1d4ac7a847f req-712b681a-164b-4261-8da5-e9f199971f2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.140 186548 DEBUG nova.compute.manager [req-e1d97e5b-6971-49fe-bb34-c1d4ac7a847f req-712b681a-164b-4261-8da5-e9f199971f2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.141 186548 WARNING nova.compute.manager [req-e1d97e5b-6971-49fe-bb34-c1d4ac7a847f req-712b681a-164b-4261-8da5-e9f199971f2d 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state suspended and task_state resuming.
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.202 186548 INFO nova.compute.manager [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Resuming
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.204 186548 DEBUG nova.objects.instance [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'flavor' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.252 186548 DEBUG oslo_concurrency.lockutils [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.253 186548 DEBUG oslo_concurrency.lockutils [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquired lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:58:33 compute-0 nova_compute[186544]: 2025-11-22 08:58:33.253 186548 DEBUG nova.network.neutron [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.185 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.317 186548 DEBUG nova.network.neutron [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Updating instance_info_cache with network_info: [{"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.343 186548 DEBUG oslo_concurrency.lockutils [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Releasing lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.344 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquired lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.344 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.344 186548 DEBUG nova.objects.instance [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lazy-loading 'info_cache' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.349 186548 DEBUG nova.virt.libvirt.vif [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:58:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1645058772',display_name='tempest-TestServerAdvancedOps-server-1645058772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1645058772',id=188,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:58:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='664a512879714e69beefb15b448f9498',ramdisk_id='',reservation_id='r-djuyjwq4',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-1894421725',owner_user_name='tempest-TestServerAdvancedOps-1894421725-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:58:28Z,user_data=None,user_id='b31993e3b9ba48dc95406061036b537a',uuid=f54acf85-11a3-44be-a962-ac430c2f1756,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.350 186548 DEBUG nova.network.os_vif_util [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converting VIF {"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.351 186548 DEBUG nova.network.os_vif_util [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.351 186548 DEBUG os_vif [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.352 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.353 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.353 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.354 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.357 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.357 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4bab24c-c6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.358 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4bab24c-c6, col_values=(('external_ids', {'iface-id': 'd4bab24c-c6fc-4c6a-a1a2-ff38c93ac273', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:f2:39', 'vm-uuid': 'f54acf85-11a3-44be-a962-ac430c2f1756'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.358 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.358 186548 INFO os_vif [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6')
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.440 186548 DEBUG nova.objects.instance [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'numa_topology' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:35 compute-0 kernel: tapd4bab24c-c6: entered promiscuous mode
Nov 22 08:58:35 compute-0 NetworkManager[55036]: <info>  [1763801915.5960] manager: (tapd4bab24c-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/447)
Nov 22 08:58:35 compute-0 ovn_controller[94843]: 2025-11-22T08:58:35Z|00951|binding|INFO|Claiming lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for this chassis.
Nov 22 08:58:35 compute-0 ovn_controller[94843]: 2025-11-22T08:58:35Z|00952|binding|INFO|d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273: Claiming fa:16:3e:e2:f2:39 10.100.0.13
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.596 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:35.605 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:f2:39 10.100.0.13'], port_security=['fa:16:3e:e2:f2:39 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5eb557a9-23d5-43ab-8781-2f9d2c26caba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '664a512879714e69beefb15b448f9498', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'a9a52d92-7b69-4d8e-bd37-a0315ee8cf4a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd088f45-f47e-4599-af11-3d351df95ebf, chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:58:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:35.607 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 in datapath 5eb557a9-23d5-43ab-8781-2f9d2c26caba bound to our chassis
Nov 22 08:58:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:35.607 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5eb557a9-23d5-43ab-8781-2f9d2c26caba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:58:35 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:35.608 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[be4a8d18-7c83-4522-bd1f-2ad953cd0e18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:58:35 compute-0 ovn_controller[94843]: 2025-11-22T08:58:35Z|00953|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 up in Southbound
Nov 22 08:58:35 compute-0 ovn_controller[94843]: 2025-11-22T08:58:35Z|00954|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 ovn-installed in OVS
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.625 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.627 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:35 compute-0 systemd-udevd[258235]: Network interface NamePolicy= disabled on kernel command line.
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.631 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:35 compute-0 systemd-machined[152872]: New machine qemu-104-instance-000000bc.
Nov 22 08:58:35 compute-0 NetworkManager[55036]: <info>  [1763801915.6434] device (tapd4bab24c-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 22 08:58:35 compute-0 NetworkManager[55036]: <info>  [1763801915.6448] device (tapd4bab24c-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 22 08:58:35 compute-0 systemd[1]: Started Virtual Machine qemu-104-instance-000000bc.
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.963 186548 DEBUG nova.compute.manager [req-f4dee2c7-2131-4268-9680-59df22515566 req-84f6525d-0d4d-4fde-82ad-6ed0aadeb17f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.964 186548 DEBUG oslo_concurrency.lockutils [req-f4dee2c7-2131-4268-9680-59df22515566 req-84f6525d-0d4d-4fde-82ad-6ed0aadeb17f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.964 186548 DEBUG oslo_concurrency.lockutils [req-f4dee2c7-2131-4268-9680-59df22515566 req-84f6525d-0d4d-4fde-82ad-6ed0aadeb17f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.964 186548 DEBUG oslo_concurrency.lockutils [req-f4dee2c7-2131-4268-9680-59df22515566 req-84f6525d-0d4d-4fde-82ad-6ed0aadeb17f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.965 186548 DEBUG nova.compute.manager [req-f4dee2c7-2131-4268-9680-59df22515566 req-84f6525d-0d4d-4fde-82ad-6ed0aadeb17f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:35 compute-0 nova_compute[186544]: 2025-11-22 08:58:35.965 186548 WARNING nova.compute.manager [req-f4dee2c7-2131-4268-9680-59df22515566 req-84f6525d-0d4d-4fde-82ad-6ed0aadeb17f 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state suspended and task_state resuming.
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.132 186548 DEBUG nova.virt.libvirt.host [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Removed pending event for f54acf85-11a3-44be-a962-ac430c2f1756 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.133 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801916.1319218, f54acf85-11a3-44be-a962-ac430c2f1756 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.133 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Started (Lifecycle Event)
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.161 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.163 186548 DEBUG nova.compute.manager [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.164 186548 DEBUG nova.objects.instance [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'pci_devices' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.171 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.191 186548 INFO nova.virt.libvirt.driver [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance running successfully.
Nov 22 08:58:36 compute-0 virtqemud[186092]: argument unsupported: QEMU guest agent is not configured
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.195 186548 DEBUG nova.virt.libvirt.guest [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.195 186548 DEBUG nova.compute.manager [None req-e3862dcb-ab6a-4566-93f4-aead914ff6db b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.202 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.203 186548 DEBUG nova.virt.driver [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] Emitting event <LifecycleEvent: 1763801916.143043, f54acf85-11a3-44be-a962-ac430c2f1756 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.203 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Resumed (Lifecycle Event)
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.238 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.242 186548 DEBUG nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 22 08:58:36 compute-0 nova_compute[186544]: 2025-11-22 08:58:36.270 186548 INFO nova.compute.manager [None req-75f721ce-9a61-4883-b9ac-ddd107129fc7 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.613 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'name': 'tempest-TestServerAdvancedOps-server-1645058772', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000bc', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '664a512879714e69beefb15b448f9498', 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'hostId': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.614 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.631 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.633 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '935fee82-3491-4cb6-8552-50794bee606a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.614993', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e747dfa-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.306861264, 'message_signature': 'ac5ab92caf70037aee55bb2b7fd1283d7e78e42cc047bc718c939e50f9e1d964'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.614993', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e74a8d4-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.306861264, 'message_signature': '17c41bae3011c96765cf0b9c384c7a17dda0fc005674c02addd2670d7948b9ae'}]}, 'timestamp': '2025-11-22 08:58:36.633494', '_unique_id': '6872b0d1b66f41f18ee570381d135e59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.634 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.636 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.685 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.686 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f7cef52-aa22-4885-a6e6-a62e320c701b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.636918', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e7cae12-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': 'd31d4836aceb35a066e9e9a24a69f2fdeae1b25ea37a0f87bf4b24a906fb057d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.636918', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e7cbe20-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': 'b33af1193127bd7adcc68b2264a82d7dfd7255aaa425a0f26e432db46b5b2984'}]}, 'timestamp': '2025-11-22 08:58:36.686412', '_unique_id': '002feb5ea7554a119141ffdec911f493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.687 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.688 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.688 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerAdvancedOps-server-1645058772>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerAdvancedOps-server-1645058772>]
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.694 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f54acf85-11a3-44be-a962-ac430c2f1756 / tapd4bab24c-c6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.694 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7add75c5-0762-4e4c-b957-77b55db6ed7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.689205', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e7e09f6-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': '353c08994d59bd9ba282d8f846ee553585118de51af8bc2e2f00bb6aa761a013'}]}, 'timestamp': '2025-11-22 08:58:36.694925', '_unique_id': '58b0890fd76b4cd68a3b5be664963054'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.695 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.696 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.outgoing.bytes volume: 240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2622eb0-abff-455e-8f5a-e5e09e263324', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 240, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.696843', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e7e6248-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': '757b3a3d7ed1fb845a3cd2190789456338b5efef987ed4ab05c97d7e397bd277'}]}, 'timestamp': '2025-11-22 08:58:36.697175', '_unique_id': 'd9bc944a8d4244a78e7f9b00e96ec9e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.697 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.698 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.699 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.699 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5920ce5-1b2d-4983-bc54-acd39aa9c62a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.698993', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e7eb662-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': '5d51aceca7e81d34b206e5decb710180a1fb268909ede9190f92c376754bb78e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.698993', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e7ec29c-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': '70a8dd9da5eed6c91e052d69f780a1dd8f2eec1645ccec5691550f18fe2e7bfa'}]}, 'timestamp': '2025-11-22 08:58:36.699627', '_unique_id': 'a1ba3636ecd34a72951d3ef30bef6d47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.700 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.701 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.720 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/cpu volume: 550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfddbe74-eed5-41cc-8467-8cf47098dafb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 550000000, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'timestamp': '2025-11-22T08:58:36.701660', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '6e821b54-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.412355799, 'message_signature': 'ebeab2cfb7686d693ba9e2f57d3d506371ccf67ba8852dde1f0c36b1660dd4ae'}]}, 'timestamp': '2025-11-22 08:58:36.721708', '_unique_id': '6d366ab7e374401fb87419c42f308cfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.723 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.724 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.724 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.725 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9980b981-6964-4e6b-807d-86499643059a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.724776', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e82a6b4-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': 'f54b3e8fc1c31fa385a863fce391493ba9698cb20d1527b577a337a417c2ce7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.724776', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e82b2da-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': 'b7938a554d1bcb36e7897541a678add9a10f70fa7164b7ab2b55292a6c480e8f'}]}, 'timestamp': '2025-11-22 08:58:36.725437', '_unique_id': '1035aa4281ef4b1e893388c289f0e2a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.726 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.727 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.727 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.727 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd27cf2dc-ec45-47a5-aed1-4daa84cceb23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.727176', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e830370-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': '0b7117ab8698c374da8392413511d19e87e2e3ed7236705bf784cba29cf6608c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.727176', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e830e92-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': '5cc9fe86ca84748161a119dbd864e17a7df83fa85aca4315080a5b80e70156e7'}]}, 'timestamp': '2025-11-22 08:58:36.727785', '_unique_id': '91b664a8a4ff4c1d9a81a6dc78a4c3a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.728 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.729 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccfc6740-8b65-47ba-8669-d5df0d52d7b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.729548', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e8362b6-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': 'bd39d7d0f407ad410512faf9d392fa9169d07476ef317ef6e8a82eb3ee06ffd6'}]}, 'timestamp': '2025-11-22 08:58:36.729962', '_unique_id': '65f4806e96974f679b187460bcbcdee5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.730 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.731 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.731 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.731 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerAdvancedOps-server-1645058772>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerAdvancedOps-server-1645058772>]
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8659a05-fd01-44a8-a890-f8fc423acf87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.732102', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e83c346-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': 'e32dd0248cbcb168b5ce75a2516255b81959616ae6f4ccde96256fb8fb25b87f'}]}, 'timestamp': '2025-11-22 08:58:36.732422', '_unique_id': '86ffc6e0017042f4bc7791f983738704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.732 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.734 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.734 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a06c9678-0ac0-4b07-8215-7368354d6b40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.734050', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e840f04-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': '1350def0cb6a526f5220df147a6b60b6391e8085e5d5327d9f3532c27eaaa5c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.734050', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e841a9e-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': '4e4599610a353e2f87da9b085d4cb3084ded3487ad16bc12737a64b0fce83242'}]}, 'timestamp': '2025-11-22 08:58:36.734645', '_unique_id': 'cf00a710ac334eb8b3108fe4b0d9ef02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.735 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.736 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bf040d4-889a-4d64-a263-1c118385095c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.736485', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e846e2c-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': '62114bc8b1dc55b4127966174371435f64182348c4dd20de41d873a89079ac4b'}]}, 'timestamp': '2025-11-22 08:58:36.736809', '_unique_id': 'bb4839c7cfd34aa38d8d0c092f82cd13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.737 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.739 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.739 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerAdvancedOps-server-1645058772>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerAdvancedOps-server-1645058772>]
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.739 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.outgoing.packets volume: 4 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34dfd6ac-e964-4741-9958-b5184cb4b5e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 4, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.739792', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e84f018-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': '2b53f1bf68cc6aa7bc720773668c463419e206a2af2a3c5c83726105adaa7ede'}]}, 'timestamp': '2025-11-22 08:58:36.740131', '_unique_id': '55732a07d2f74d47bccaf0602bd7fb44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.740 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.741 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.741 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55f9ad26-da76-4ff8-ada9-c5c8cddfd836', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.741862', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e85407c-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.306861264, 'message_signature': '2113f8485073bc9544d9df5145cc973a34068e9c75b73387a608a30d2e0148a3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.741862', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e854ba8-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.306861264, 'message_signature': '68a23d9e96159c67ce092b8d66e4efcec6410697bc96bdfb1c8a2fa3592d912c'}]}, 'timestamp': '2025-11-22 08:58:36.742451', '_unique_id': 'fc9922131a3340238f1f67bfc32ec442'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.742 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.743 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.744 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.744 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f54acf85-11a3-44be-a962-ac430c2f1756: ceilometer.compute.pollsters.NoVolumeException
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.744 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.744 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.744 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3d32cd0-42bf-4d66-9ec5-39301c2976cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.744508', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e85a710-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': '872291c554220c5ec4a8af75319dd0d6ad0051a0b36b6afca984618750cd996b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.744508', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e85b340-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.328749202, 'message_signature': 'd071f37cdf5c8e1ca40d2740fb422bb3c8c933b38f227b27845066774d8bd2cf'}]}, 'timestamp': '2025-11-22 08:58:36.745107', '_unique_id': '3c232b49f8f943b9804e135718ed95ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.745 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.746 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '455c8dc7-431c-4cef-b572-8e9256277848', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.746871', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e8604f8-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': '9b5617aff06d399260c644101a53d12549f9747ef67b73260104a6c7132c2479'}]}, 'timestamp': '2025-11-22 08:58:36.747211', '_unique_id': '673088e89fe145f1b8b72a80802c7368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.747 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.748 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.748 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81974b3f-4fe4-402f-9190-7ab1109d5ea5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.748817', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e865142-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': 'd10018417f6454bd7ead8f75da48c64a9b417ce8e5df22887e47be3225d0aea1'}]}, 'timestamp': '2025-11-22 08:58:36.749210', '_unique_id': '5a87db5bc5324570a039deacfd968a48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.749 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.751 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4b36664-9775-4ab6-8da5-605731670937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.751614', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e86c01e-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': '78cc770d9dbecaf088ad0efe1760c02badb8832a2987af1072027d1dc05c3725'}]}, 'timestamp': '2025-11-22 08:58:36.752063', '_unique_id': '677c461ad59e4c5b872c29f7a12dcb78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.752 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.754 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.754 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerAdvancedOps-server-1645058772>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerAdvancedOps-server-1645058772>]
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.754 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.754 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14421cf7-e37c-4d20-92e3-c189d83eab3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'instance-000000bc-f54acf85-11a3-44be-a962-ac430c2f1756-tapd4bab24c-c6', 'timestamp': '2025-11-22T08:58:36.754516', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'tapd4bab24c-c6', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e2:f2:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd4bab24c-c6'}, 'message_id': '6e872f0e-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.381023548, 'message_signature': '52591704a3a77c1cd9fe6aa60497ca593921953839b8789e18cb07bd01e98f87'}]}, 'timestamp': '2025-11-22 08:58:36.754851', '_unique_id': 'eab61916df094f9c97489a4eb002537a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.755 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.756 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.756 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.756 12 DEBUG ceilometer.compute.pollsters [-] f54acf85-11a3-44be-a962-ac430c2f1756/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '796b39ff-354e-4bb4-989c-739b8f1bd852', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-vda', 'timestamp': '2025-11-22T08:58:36.756597', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6e878224-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.306861264, 'message_signature': 'f6e14001e4064a111b7dbedb69f18535e95ed42902b184b9e7cb4b10039d7c47'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b31993e3b9ba48dc95406061036b537a', 'user_name': None, 'project_id': '664a512879714e69beefb15b448f9498', 'project_name': None, 'resource_id': 'f54acf85-11a3-44be-a962-ac430c2f1756-sda', 'timestamp': '2025-11-22T08:58:36.756597', 'resource_metadata': {'display_name': 'tempest-TestServerAdvancedOps-server-1645058772', 'name': 'instance-000000bc', 'instance_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'instance_type': 'm1.nano', 'host': '44fb8ad90173c08c6c181b826fd71ecec6e30dc395d59ff95199eb61', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '31612188-3cd6-428b-9166-9568f0affd4a', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c'}, 'image_ref': 'eb6eb4ac-7956-4021-b3a0-d612ae61d38c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6e878e18-c781-11f0-ab78-fa163ea8d0a9', 'monotonic_time': 8657.306861264, 'message_signature': 'f16a35a33644f64e1e97820f6b85781b3cf18717627bc82bbae79c31b699dba0'}]}, 'timestamp': '2025-11-22 08:58:36.757281', '_unique_id': 'e1aecba91e9f48fa96c3fb6c94a9a762'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     yield
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 22 08:58:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 08:58:36.757 12 ERROR oslo_messaging.notify.messaging 
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.013 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:37.390 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:37.390 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:37.391 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.611 186548 DEBUG nova.network.neutron [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Updating instance_info_cache with network_info: [{"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.630 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Releasing lock "refresh_cache-f54acf85-11a3-44be-a962-ac430c2f1756" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.630 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.746 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.747 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.747 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.748 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.748 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.757 186548 INFO nova.compute.manager [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Terminating instance
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.764 186548 DEBUG nova.compute.manager [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 22 08:58:37 compute-0 kernel: tapd4bab24c-c6 (unregistering): left promiscuous mode
Nov 22 08:58:37 compute-0 NetworkManager[55036]: <info>  [1763801917.7868] device (tapd4bab24c-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 22 08:58:37 compute-0 ovn_controller[94843]: 2025-11-22T08:58:37Z|00955|binding|INFO|Releasing lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 from this chassis (sb_readonly=0)
Nov 22 08:58:37 compute-0 ovn_controller[94843]: 2025-11-22T08:58:37Z|00956|binding|INFO|Setting lport d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 down in Southbound
Nov 22 08:58:37 compute-0 ovn_controller[94843]: 2025-11-22T08:58:37Z|00957|binding|INFO|Removing iface tapd4bab24c-c6 ovn-installed in OVS
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.794 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.797 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:37.802 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:f2:39 10.100.0.13'], port_security=['fa:16:3e:e2:f2:39 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f54acf85-11a3-44be-a962-ac430c2f1756', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5eb557a9-23d5-43ab-8781-2f9d2c26caba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '664a512879714e69beefb15b448f9498', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a9a52d92-7b69-4d8e-bd37-a0315ee8cf4a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd088f45-f47e-4599-af11-3d351df95ebf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>], logical_port=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5173b579d0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 08:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:37.804 103805 INFO neutron.agent.ovn.metadata.agent [-] Port d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 in datapath 5eb557a9-23d5-43ab-8781-2f9d2c26caba unbound from our chassis
Nov 22 08:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:37.804 103805 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5eb557a9-23d5-43ab-8781-2f9d2c26caba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 22 08:58:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:58:37.805 213522 DEBUG oslo.privsep.daemon [-] privsep: reply[05ffb2ec-199c-4cb5-b58a-ff823aa9f7ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.809 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:37 compute-0 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Nov 22 08:58:37 compute-0 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000bc.scope: Consumed 2.070s CPU time.
Nov 22 08:58:37 compute-0 systemd-machined[152872]: Machine qemu-104-instance-000000bc terminated.
Nov 22 08:58:37 compute-0 podman[258258]: 2025-11-22 08:58:37.888954726 +0000 UTC m=+0.058862059 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 22 08:58:37 compute-0 podman[258255]: 2025-11-22 08:58:37.893379045 +0000 UTC m=+0.063616196 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 08:58:37 compute-0 podman[258260]: 2025-11-22 08:58:37.925061104 +0000 UTC m=+0.086385576 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:58:37 compute-0 podman[258259]: 2025-11-22 08:58:37.925141716 +0000 UTC m=+0.087952054 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 08:58:37 compute-0 NetworkManager[55036]: <info>  [1763801917.9793] manager: (tapd4bab24c-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/448)
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.980 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:37 compute-0 nova_compute[186544]: 2025-11-22 08:58:37.984 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.013 186548 INFO nova.virt.libvirt.driver [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Instance destroyed successfully.
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.014 186548 DEBUG nova.objects.instance [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lazy-loading 'resources' on Instance uuid f54acf85-11a3-44be-a962-ac430c2f1756 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.031 186548 DEBUG nova.virt.libvirt.vif [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-22T08:58:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1645058772',display_name='tempest-TestServerAdvancedOps-server-1645058772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1645058772',id=188,image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-22T08:58:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='664a512879714e69beefb15b448f9498',ramdisk_id='',reservation_id='r-djuyjwq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='eb6eb4ac-7956-4021-b3a0-d612ae61d38c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerAdvancedOps-1894421725',owner_user_name='tempest-TestServerAdvancedOps-1894421725-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-22T08:58:36Z,user_data=None,user_id='b31993e3b9ba48dc95406061036b537a',uuid=f54acf85-11a3-44be-a962-ac430c2f1756,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.031 186548 DEBUG nova.network.os_vif_util [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converting VIF {"id": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "address": "fa:16:3e:e2:f2:39", "network": {"id": "5eb557a9-23d5-43ab-8781-2f9d2c26caba", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-135091389-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "664a512879714e69beefb15b448f9498", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4bab24c-c6", "ovs_interfaceid": "d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.032 186548 DEBUG nova.network.os_vif_util [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.032 186548 DEBUG os_vif [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.034 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.034 186548 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4bab24c-c6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.037 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.038 186548 INFO os_vif [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:f2:39,bridge_name='br-int',has_traffic_filtering=True,id=d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273,network=Network(5eb557a9-23d5-43ab-8781-2f9d2c26caba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4bab24c-c6')
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.039 186548 INFO nova.virt.libvirt.driver [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Deleting instance files /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756_del
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.039 186548 INFO nova.virt.libvirt.driver [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Deletion of /var/lib/nova/instances/f54acf85-11a3-44be-a962-ac430c2f1756_del complete
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.104 186548 DEBUG nova.compute.manager [req-1ea328a8-4b9b-4915-8287-4866f5e7a4c8 req-7a28cc8d-9e2f-4f99-b945-9c4c445dca53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.104 186548 DEBUG oslo_concurrency.lockutils [req-1ea328a8-4b9b-4915-8287-4866f5e7a4c8 req-7a28cc8d-9e2f-4f99-b945-9c4c445dca53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.105 186548 DEBUG oslo_concurrency.lockutils [req-1ea328a8-4b9b-4915-8287-4866f5e7a4c8 req-7a28cc8d-9e2f-4f99-b945-9c4c445dca53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.105 186548 DEBUG oslo_concurrency.lockutils [req-1ea328a8-4b9b-4915-8287-4866f5e7a4c8 req-7a28cc8d-9e2f-4f99-b945-9c4c445dca53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.105 186548 DEBUG nova.compute.manager [req-1ea328a8-4b9b-4915-8287-4866f5e7a4c8 req-7a28cc8d-9e2f-4f99-b945-9c4c445dca53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.106 186548 WARNING nova.compute.manager [req-1ea328a8-4b9b-4915-8287-4866f5e7a4c8 req-7a28cc8d-9e2f-4f99-b945-9c4c445dca53 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state active and task_state deleting.
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.121 186548 INFO nova.compute.manager [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.121 186548 DEBUG oslo.service.loopingcall [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.122 186548 DEBUG nova.compute.manager [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.122 186548 DEBUG nova.network.neutron [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.775 186548 DEBUG nova.network.neutron [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.798 186548 INFO nova.compute.manager [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Took 0.68 seconds to deallocate network for instance.
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.877 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.877 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:38 compute-0 nova_compute[186544]: 2025-11-22 08:58:38.892 186548 DEBUG nova.compute.manager [req-58da8c58-23df-4996-ad75-e037dfe3426e req-41713988-2ee3-47ea-835f-c9cf1f7c5000 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-deleted-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:39 compute-0 nova_compute[186544]: 2025-11-22 08:58:39.036 186548 DEBUG nova.compute.provider_tree [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:58:39 compute-0 nova_compute[186544]: 2025-11-22 08:58:39.056 186548 DEBUG nova.scheduler.client.report [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:58:39 compute-0 nova_compute[186544]: 2025-11-22 08:58:39.093 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:39 compute-0 nova_compute[186544]: 2025-11-22 08:58:39.159 186548 INFO nova.scheduler.client.report [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Deleted allocations for instance f54acf85-11a3-44be-a962-ac430c2f1756
Nov 22 08:58:39 compute-0 nova_compute[186544]: 2025-11-22 08:58:39.271 186548 DEBUG oslo_concurrency.lockutils [None req-7a1bb4de-4962-42b1-9acb-e0076a2696fe b31993e3b9ba48dc95406061036b537a 664a512879714e69beefb15b448f9498 - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.347 186548 DEBUG nova.compute.manager [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.348 186548 DEBUG oslo_concurrency.lockutils [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.348 186548 DEBUG oslo_concurrency.lockutils [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.349 186548 DEBUG oslo_concurrency.lockutils [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.349 186548 DEBUG nova.compute.manager [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.350 186548 WARNING nova.compute.manager [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-unplugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state deleted and task_state None.
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.350 186548 DEBUG nova.compute.manager [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.350 186548 DEBUG oslo_concurrency.lockutils [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Acquiring lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.351 186548 DEBUG oslo_concurrency.lockutils [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.351 186548 DEBUG oslo_concurrency.lockutils [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] Lock "f54acf85-11a3-44be-a962-ac430c2f1756-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.352 186548 DEBUG nova.compute.manager [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] No waiting events found dispatching network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 22 08:58:40 compute-0 nova_compute[186544]: 2025-11-22 08:58:40.353 186548 WARNING nova.compute.manager [req-2af4e6eb-aa2e-4641-bebf-80445588eb49 req-16542bb6-1222-412f-b2a6-a0ad40c62d96 3bd5bb3b3617449b856c920086ec97e6 521c01b7689e4780ad8db56f665e9ebe - - default default] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Received unexpected event network-vif-plugged-d4bab24c-c6fc-4c6a-a1a2-ff38c93ac273 for instance with vm_state deleted and task_state None.
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.014 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.211 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.211 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.211 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.212 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.240 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.360 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.360 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5623MB free_disk=73.13102722167969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.361 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.361 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.459 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.460 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.507 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.535 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.575 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:58:42 compute-0 nova_compute[186544]: 2025-11-22 08:58:42.576 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:58:43 compute-0 nova_compute[186544]: 2025-11-22 08:58:43.037 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:46 compute-0 nova_compute[186544]: 2025-11-22 08:58:46.577 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:47 compute-0 nova_compute[186544]: 2025-11-22 08:58:47.016 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:48 compute-0 nova_compute[186544]: 2025-11-22 08:58:48.040 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:48 compute-0 nova_compute[186544]: 2025-11-22 08:58:48.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:50 compute-0 nova_compute[186544]: 2025-11-22 08:58:50.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:50 compute-0 podman[258354]: 2025-11-22 08:58:50.412254438 +0000 UTC m=+0.057042693 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:58:50 compute-0 podman[258356]: 2025-11-22 08:58:50.420545092 +0000 UTC m=+0.057910325 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 08:58:50 compute-0 podman[258355]: 2025-11-22 08:58:50.42167819 +0000 UTC m=+0.064926887 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal)
Nov 22 08:58:51 compute-0 nova_compute[186544]: 2025-11-22 08:58:51.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:52 compute-0 nova_compute[186544]: 2025-11-22 08:58:52.019 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:53 compute-0 nova_compute[186544]: 2025-11-22 08:58:53.012 186548 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763801918.0116224, f54acf85-11a3-44be-a962-ac430c2f1756 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 22 08:58:53 compute-0 nova_compute[186544]: 2025-11-22 08:58:53.014 186548 INFO nova.compute.manager [-] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] VM Stopped (Lifecycle Event)
Nov 22 08:58:53 compute-0 nova_compute[186544]: 2025-11-22 08:58:53.043 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:53 compute-0 nova_compute[186544]: 2025-11-22 08:58:53.047 186548 DEBUG nova.compute.manager [None req-af5ef977-6337-4fe1-80e8-88ac66b5dae4 - - - - - -] [instance: f54acf85-11a3-44be-a962-ac430c2f1756] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 22 08:58:53 compute-0 nova_compute[186544]: 2025-11-22 08:58:53.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:54 compute-0 nova_compute[186544]: 2025-11-22 08:58:54.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:58:57 compute-0 nova_compute[186544]: 2025-11-22 08:58:57.022 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:58:58 compute-0 nova_compute[186544]: 2025-11-22 08:58:58.046 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:02 compute-0 nova_compute[186544]: 2025-11-22 08:59:02.024 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:03 compute-0 nova_compute[186544]: 2025-11-22 08:59:03.048 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:07 compute-0 nova_compute[186544]: 2025-11-22 08:59:07.026 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:08 compute-0 nova_compute[186544]: 2025-11-22 08:59:08.050 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:08 compute-0 podman[258420]: 2025-11-22 08:59:08.409962716 +0000 UTC m=+0.053290932 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:59:08 compute-0 podman[258421]: 2025-11-22 08:59:08.418220849 +0000 UTC m=+0.058238214 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:59:08 compute-0 podman[258419]: 2025-11-22 08:59:08.429452665 +0000 UTC m=+0.074558855 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 22 08:59:08 compute-0 podman[258422]: 2025-11-22 08:59:08.45447115 +0000 UTC m=+0.089802730 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 08:59:09 compute-0 nova_compute[186544]: 2025-11-22 08:59:09.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:12 compute-0 nova_compute[186544]: 2025-11-22 08:59:12.027 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:13 compute-0 nova_compute[186544]: 2025-11-22 08:59:13.052 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:13 compute-0 ovn_controller[94843]: 2025-11-22T08:59:13Z|00958|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Nov 22 08:59:17 compute-0 nova_compute[186544]: 2025-11-22 08:59:17.029 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:18 compute-0 nova_compute[186544]: 2025-11-22 08:59:18.053 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:21 compute-0 podman[258503]: 2025-11-22 08:59:21.415151903 +0000 UTC m=+0.050167865 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal)
Nov 22 08:59:21 compute-0 podman[258504]: 2025-11-22 08:59:21.422428091 +0000 UTC m=+0.054197073 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 22 08:59:21 compute-0 podman[258502]: 2025-11-22 08:59:21.437033641 +0000 UTC m=+0.075058677 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 08:59:22 compute-0 nova_compute[186544]: 2025-11-22 08:59:22.032 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:23 compute-0 nova_compute[186544]: 2025-11-22 08:59:23.056 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:27 compute-0 nova_compute[186544]: 2025-11-22 08:59:27.033 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:28 compute-0 nova_compute[186544]: 2025-11-22 08:59:28.058 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:32 compute-0 nova_compute[186544]: 2025-11-22 08:59:32.035 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:33 compute-0 nova_compute[186544]: 2025-11-22 08:59:33.060 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:36 compute-0 nova_compute[186544]: 2025-11-22 08:59:36.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:36 compute-0 nova_compute[186544]: 2025-11-22 08:59:36.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 08:59:36 compute-0 nova_compute[186544]: 2025-11-22 08:59:36.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 08:59:36 compute-0 nova_compute[186544]: 2025-11-22 08:59:36.205 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 08:59:37 compute-0 nova_compute[186544]: 2025-11-22 08:59:37.038 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:59:37.391 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:59:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:59:37.391 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:59:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 08:59:37.392 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:59:38 compute-0 nova_compute[186544]: 2025-11-22 08:59:38.064 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:39 compute-0 podman[258563]: 2025-11-22 08:59:39.453253482 +0000 UTC m=+0.075028816 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 08:59:39 compute-0 podman[258562]: 2025-11-22 08:59:39.471084221 +0000 UTC m=+0.097033407 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 22 08:59:39 compute-0 podman[258561]: 2025-11-22 08:59:39.474329531 +0000 UTC m=+0.102437251 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 22 08:59:39 compute-0 podman[258565]: 2025-11-22 08:59:39.521093012 +0000 UTC m=+0.126674968 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.039 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.214 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.215 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.215 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.215 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.409 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.410 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=73.13102722167969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.410 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.411 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.523 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.523 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.554 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.575 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.577 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 08:59:42 compute-0 nova_compute[186544]: 2025-11-22 08:59:42.578 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 08:59:43 compute-0 nova_compute[186544]: 2025-11-22 08:59:43.066 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:44 compute-0 nova_compute[186544]: 2025-11-22 08:59:44.579 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:44 compute-0 nova_compute[186544]: 2025-11-22 08:59:44.579 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 08:59:47 compute-0 nova_compute[186544]: 2025-11-22 08:59:47.039 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:47 compute-0 nova_compute[186544]: 2025-11-22 08:59:47.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:48 compute-0 nova_compute[186544]: 2025-11-22 08:59:48.067 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:48 compute-0 nova_compute[186544]: 2025-11-22 08:59:48.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:51 compute-0 nova_compute[186544]: 2025-11-22 08:59:51.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:51 compute-0 nova_compute[186544]: 2025-11-22 08:59:51.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:52 compute-0 nova_compute[186544]: 2025-11-22 08:59:52.041 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:52 compute-0 podman[258646]: 2025-11-22 08:59:52.434833428 +0000 UTC m=+0.065749129 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 08:59:52 compute-0 podman[258647]: 2025-11-22 08:59:52.450701578 +0000 UTC m=+0.076008061 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 08:59:52 compute-0 podman[258648]: 2025-11-22 08:59:52.482221473 +0000 UTC m=+0.097191172 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 08:59:53 compute-0 nova_compute[186544]: 2025-11-22 08:59:53.070 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:53 compute-0 nova_compute[186544]: 2025-11-22 08:59:53.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:55 compute-0 nova_compute[186544]: 2025-11-22 08:59:55.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 08:59:57 compute-0 nova_compute[186544]: 2025-11-22 08:59:57.044 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 08:59:58 compute-0 nova_compute[186544]: 2025-11-22 08:59:58.073 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:02 compute-0 nova_compute[186544]: 2025-11-22 09:00:02.046 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:03 compute-0 nova_compute[186544]: 2025-11-22 09:00:03.076 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:07 compute-0 nova_compute[186544]: 2025-11-22 09:00:07.048 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:08 compute-0 nova_compute[186544]: 2025-11-22 09:00:08.078 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:09 compute-0 nova_compute[186544]: 2025-11-22 09:00:09.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:10 compute-0 podman[258706]: 2025-11-22 09:00:10.411235218 +0000 UTC m=+0.059533956 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 22 09:00:10 compute-0 podman[258707]: 2025-11-22 09:00:10.434051589 +0000 UTC m=+0.064085118 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 09:00:10 compute-0 podman[258713]: 2025-11-22 09:00:10.434431108 +0000 UTC m=+0.071460639 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:00:10 compute-0 podman[258714]: 2025-11-22 09:00:10.457125807 +0000 UTC m=+0.088164340 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 22 09:00:12 compute-0 nova_compute[186544]: 2025-11-22 09:00:12.050 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:13 compute-0 nova_compute[186544]: 2025-11-22 09:00:13.082 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:17 compute-0 nova_compute[186544]: 2025-11-22 09:00:17.051 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:18 compute-0 nova_compute[186544]: 2025-11-22 09:00:18.085 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:22 compute-0 nova_compute[186544]: 2025-11-22 09:00:22.054 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:22 compute-0 sshd-session[258793]: Accepted publickey for zuul from 192.168.122.10 port 60176 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 09:00:22 compute-0 systemd-logind[821]: New session 69 of user zuul.
Nov 22 09:00:22 compute-0 systemd[1]: Started Session 69 of User zuul.
Nov 22 09:00:22 compute-0 sshd-session[258793]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:00:22 compute-0 podman[258798]: 2025-11-22 09:00:22.569902213 +0000 UTC m=+0.051523879 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 09:00:22 compute-0 podman[258797]: 2025-11-22 09:00:22.576575036 +0000 UTC m=+0.062182710 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Nov 22 09:00:22 compute-0 podman[258795]: 2025-11-22 09:00:22.592023596 +0000 UTC m=+0.079570517 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:00:22 compute-0 sudo[258862]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 22 09:00:22 compute-0 sudo[258862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:00:23 compute-0 nova_compute[186544]: 2025-11-22 09:00:23.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:24 compute-0 nova_compute[186544]: 2025-11-22 09:00:24.177 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:24 compute-0 nova_compute[186544]: 2025-11-22 09:00:24.178 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 09:00:24 compute-0 nova_compute[186544]: 2025-11-22 09:00:24.200 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 09:00:25 compute-0 nova_compute[186544]: 2025-11-22 09:00:25.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:25 compute-0 nova_compute[186544]: 2025-11-22 09:00:25.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 09:00:27 compute-0 nova_compute[186544]: 2025-11-22 09:00:27.055 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:28 compute-0 nova_compute[186544]: 2025-11-22 09:00:28.090 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:28 compute-0 ovs-vsctl[259033]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 22 09:00:28 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 258886 (sos)
Nov 22 09:00:28 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 22 09:00:28 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 22 09:00:29 compute-0 virtqemud[186092]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 22 09:00:29 compute-0 virtqemud[186092]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 22 09:00:29 compute-0 virtqemud[186092]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 09:00:30 compute-0 crontab[259445]: (root) LIST (root)
Nov 22 09:00:32 compute-0 nova_compute[186544]: 2025-11-22 09:00:32.057 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:32 compute-0 systemd[1]: Starting Hostname Service...
Nov 22 09:00:32 compute-0 systemd[1]: Started Hostname Service.
Nov 22 09:00:33 compute-0 nova_compute[186544]: 2025-11-22 09:00:33.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:00:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:00:37 compute-0 nova_compute[186544]: 2025-11-22 09:00:37.060 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:00:37.392 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:00:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:00:37.393 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:00:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:00:37.393 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:00:38 compute-0 nova_compute[186544]: 2025-11-22 09:00:38.095 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:38 compute-0 nova_compute[186544]: 2025-11-22 09:00:38.353 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:38 compute-0 nova_compute[186544]: 2025-11-22 09:00:38.353 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:00:38 compute-0 nova_compute[186544]: 2025-11-22 09:00:38.354 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:00:38 compute-0 nova_compute[186544]: 2025-11-22 09:00:38.388 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:00:40 compute-0 ovs-appctl[260756]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 09:00:40 compute-0 ovs-appctl[260761]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 09:00:40 compute-0 ovs-appctl[260765]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 09:00:41 compute-0 podman[260866]: 2025-11-22 09:00:41.433778606 +0000 UTC m=+0.069205384 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 09:00:41 compute-0 podman[260865]: 2025-11-22 09:00:41.454085375 +0000 UTC m=+0.090565659 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 09:00:41 compute-0 podman[260867]: 2025-11-22 09:00:41.470071307 +0000 UTC m=+0.102080951 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 22 09:00:41 compute-0 podman[260861]: 2025-11-22 09:00:41.478168787 +0000 UTC m=+0.117143052 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, 
config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 22 09:00:42 compute-0 nova_compute[186544]: 2025-11-22 09:00:42.061 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.097 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.202 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.202 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.202 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.203 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.362 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.363 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5549MB free_disk=72.66046142578125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.363 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.364 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.556 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.556 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.599 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.624 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.772 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:00:43 compute-0 nova_compute[186544]: 2025-11-22 09:00:43.772 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:00:46 compute-0 nova_compute[186544]: 2025-11-22 09:00:46.772 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:46 compute-0 nova_compute[186544]: 2025-11-22 09:00:46.773 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:00:47 compute-0 nova_compute[186544]: 2025-11-22 09:00:47.062 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:48 compute-0 nova_compute[186544]: 2025-11-22 09:00:48.100 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:48 compute-0 virtqemud[186092]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 09:00:49 compute-0 nova_compute[186544]: 2025-11-22 09:00:49.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:50 compute-0 nova_compute[186544]: 2025-11-22 09:00:50.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:50 compute-0 systemd[1]: Starting Time & Date Service...
Nov 22 09:00:50 compute-0 systemd[1]: Started Time & Date Service.
Nov 22 09:00:52 compute-0 nova_compute[186544]: 2025-11-22 09:00:52.064 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:52 compute-0 nova_compute[186544]: 2025-11-22 09:00:52.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:53 compute-0 nova_compute[186544]: 2025-11-22 09:00:53.104 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:53 compute-0 nova_compute[186544]: 2025-11-22 09:00:53.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:53 compute-0 podman[262363]: 2025-11-22 09:00:53.412651896 +0000 UTC m=+0.058827278 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=)
Nov 22 09:00:53 compute-0 podman[262362]: 2025-11-22 09:00:53.412737628 +0000 UTC m=+0.060115040 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:00:53 compute-0 podman[262364]: 2025-11-22 09:00:53.414785369 +0000 UTC m=+0.060088419 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:00:54 compute-0 nova_compute[186544]: 2025-11-22 09:00:54.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:56 compute-0 nova_compute[186544]: 2025-11-22 09:00:56.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:00:57 compute-0 nova_compute[186544]: 2025-11-22 09:00:57.067 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:00:58 compute-0 nova_compute[186544]: 2025-11-22 09:00:58.107 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:01 compute-0 CROND[262426]: (root) CMD (run-parts /etc/cron.hourly)
Nov 22 09:01:01 compute-0 run-parts[262429]: (/etc/cron.hourly) starting 0anacron
Nov 22 09:01:01 compute-0 run-parts[262435]: (/etc/cron.hourly) finished 0anacron
Nov 22 09:01:01 compute-0 CROND[262425]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 22 09:01:02 compute-0 nova_compute[186544]: 2025-11-22 09:01:02.069 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:03 compute-0 nova_compute[186544]: 2025-11-22 09:01:03.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:07 compute-0 nova_compute[186544]: 2025-11-22 09:01:07.071 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:08 compute-0 nova_compute[186544]: 2025-11-22 09:01:08.113 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:12 compute-0 nova_compute[186544]: 2025-11-22 09:01:12.071 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:12 compute-0 podman[262440]: 2025-11-22 09:01:12.436733709 +0000 UTC m=+0.070798262 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 22 09:01:12 compute-0 podman[262439]: 2025-11-22 09:01:12.436935583 +0000 UTC m=+0.077743223 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 22 09:01:12 compute-0 podman[262441]: 2025-11-22 09:01:12.441945807 +0000 UTC m=+0.068699421 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 09:01:12 compute-0 podman[262442]: 2025-11-22 09:01:12.468784857 +0000 UTC m=+0.098835992 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 09:01:13 compute-0 nova_compute[186544]: 2025-11-22 09:01:13.115 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:13 compute-0 nova_compute[186544]: 2025-11-22 09:01:13.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:17 compute-0 nova_compute[186544]: 2025-11-22 09:01:17.075 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:18 compute-0 nova_compute[186544]: 2025-11-22 09:01:18.118 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:18 compute-0 sudo[258862]: pam_unix(sudo:session): session closed for user root
Nov 22 09:01:18 compute-0 sshd-session[258827]: Received disconnect from 192.168.122.10 port 60176:11: disconnected by user
Nov 22 09:01:18 compute-0 sshd-session[258827]: Disconnected from user zuul 192.168.122.10 port 60176
Nov 22 09:01:18 compute-0 sshd-session[258793]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:01:18 compute-0 systemd[1]: session-69.scope: Deactivated successfully.
Nov 22 09:01:18 compute-0 systemd[1]: session-69.scope: Consumed 1min 26.345s CPU time, 767.7M memory peak, read 266.8M from disk, written 18.7M to disk.
Nov 22 09:01:18 compute-0 systemd-logind[821]: Session 69 logged out. Waiting for processes to exit.
Nov 22 09:01:18 compute-0 systemd-logind[821]: Removed session 69.
Nov 22 09:01:19 compute-0 sshd-session[262523]: Accepted publickey for zuul from 192.168.122.10 port 46662 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 09:01:19 compute-0 systemd-logind[821]: New session 70 of user zuul.
Nov 22 09:01:19 compute-0 systemd[1]: Started Session 70 of User zuul.
Nov 22 09:01:19 compute-0 sshd-session[262523]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:01:19 compute-0 sudo[262527]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-11-22-zzbyoxf.tar.xz
Nov 22 09:01:19 compute-0 sudo[262527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:01:19 compute-0 sudo[262527]: pam_unix(sudo:session): session closed for user root
Nov 22 09:01:19 compute-0 sshd-session[262526]: Received disconnect from 192.168.122.10 port 46662:11: disconnected by user
Nov 22 09:01:19 compute-0 sshd-session[262526]: Disconnected from user zuul 192.168.122.10 port 46662
Nov 22 09:01:19 compute-0 sshd-session[262523]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:01:19 compute-0 systemd[1]: session-70.scope: Deactivated successfully.
Nov 22 09:01:19 compute-0 systemd-logind[821]: Session 70 logged out. Waiting for processes to exit.
Nov 22 09:01:19 compute-0 systemd-logind[821]: Removed session 70.
Nov 22 09:01:19 compute-0 sshd-session[262552]: Accepted publickey for zuul from 192.168.122.10 port 46672 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 09:01:19 compute-0 systemd-logind[821]: New session 71 of user zuul.
Nov 22 09:01:19 compute-0 systemd[1]: Started Session 71 of User zuul.
Nov 22 09:01:19 compute-0 sshd-session[262552]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:01:20 compute-0 sudo[262556]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Nov 22 09:01:20 compute-0 sudo[262556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:01:20 compute-0 sudo[262556]: pam_unix(sudo:session): session closed for user root
Nov 22 09:01:20 compute-0 sshd-session[262555]: Received disconnect from 192.168.122.10 port 46672:11: disconnected by user
Nov 22 09:01:20 compute-0 sshd-session[262555]: Disconnected from user zuul 192.168.122.10 port 46672
Nov 22 09:01:20 compute-0 sshd-session[262552]: pam_unix(sshd:session): session closed for user zuul
Nov 22 09:01:20 compute-0 systemd[1]: session-71.scope: Deactivated successfully.
Nov 22 09:01:20 compute-0 systemd-logind[821]: Session 71 logged out. Waiting for processes to exit.
Nov 22 09:01:20 compute-0 systemd-logind[821]: Removed session 71.
Nov 22 09:01:20 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 22 09:01:20 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 22 09:01:22 compute-0 nova_compute[186544]: 2025-11-22 09:01:22.078 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:23 compute-0 nova_compute[186544]: 2025-11-22 09:01:23.120 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:24 compute-0 podman[262585]: 2025-11-22 09:01:24.421633609 +0000 UTC m=+0.064039706 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:01:24 compute-0 podman[262586]: 2025-11-22 09:01:24.431642295 +0000 UTC m=+0.070409162 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, version=9.6, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 09:01:24 compute-0 podman[262587]: 2025-11-22 09:01:24.455140974 +0000 UTC m=+0.078654766 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 09:01:27 compute-0 nova_compute[186544]: 2025-11-22 09:01:27.079 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:28 compute-0 nova_compute[186544]: 2025-11-22 09:01:28.125 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:32 compute-0 nova_compute[186544]: 2025-11-22 09:01:32.082 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:33 compute-0 nova_compute[186544]: 2025-11-22 09:01:33.128 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:37 compute-0 nova_compute[186544]: 2025-11-22 09:01:37.082 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:01:37.394 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:01:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:01:37.394 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:01:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:01:37.394 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:01:38 compute-0 nova_compute[186544]: 2025-11-22 09:01:38.130 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:38 compute-0 nova_compute[186544]: 2025-11-22 09:01:38.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:38 compute-0 nova_compute[186544]: 2025-11-22 09:01:38.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:01:38 compute-0 nova_compute[186544]: 2025-11-22 09:01:38.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:01:38 compute-0 nova_compute[186544]: 2025-11-22 09:01:38.185 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:01:42 compute-0 nova_compute[186544]: 2025-11-22 09:01:42.085 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.133 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.212 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.213 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.213 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.214 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.414 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.416 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5695MB free_disk=73.13075637817383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.417 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.418 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:01:43 compute-0 podman[262648]: 2025-11-22 09:01:43.42000371 +0000 UTC m=+0.068314491 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:01:43 compute-0 podman[262650]: 2025-11-22 09:01:43.430115499 +0000 UTC m=+0.073640692 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:01:43 compute-0 podman[262649]: 2025-11-22 09:01:43.443882868 +0000 UTC m=+0.088738784 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:01:43 compute-0 podman[262651]: 2025-11-22 09:01:43.451986197 +0000 UTC m=+0.090590139 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.535 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.535 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.583 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.620 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.620 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.641 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.671 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.708 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.720 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.831 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:01:43 compute-0 nova_compute[186544]: 2025-11-22 09:01:43.832 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:01:47 compute-0 nova_compute[186544]: 2025-11-22 09:01:47.086 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:47 compute-0 nova_compute[186544]: 2025-11-22 09:01:47.834 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:47 compute-0 nova_compute[186544]: 2025-11-22 09:01:47.835 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:01:48 compute-0 nova_compute[186544]: 2025-11-22 09:01:48.134 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:49 compute-0 nova_compute[186544]: 2025-11-22 09:01:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:52 compute-0 nova_compute[186544]: 2025-11-22 09:01:52.088 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:52 compute-0 nova_compute[186544]: 2025-11-22 09:01:52.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:53 compute-0 nova_compute[186544]: 2025-11-22 09:01:53.137 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:53 compute-0 nova_compute[186544]: 2025-11-22 09:01:53.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:53 compute-0 nova_compute[186544]: 2025-11-22 09:01:53.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:55 compute-0 podman[262733]: 2025-11-22 09:01:55.437140034 +0000 UTC m=+0.072416203 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:01:55 compute-0 podman[262734]: 2025-11-22 09:01:55.454322976 +0000 UTC m=+0.078771598 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Nov 22 09:01:55 compute-0 podman[262735]: 2025-11-22 09:01:55.455621288 +0000 UTC m=+0.074090844 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 22 09:01:56 compute-0 nova_compute[186544]: 2025-11-22 09:01:56.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:01:57 compute-0 nova_compute[186544]: 2025-11-22 09:01:57.089 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:58 compute-0 nova_compute[186544]: 2025-11-22 09:01:58.140 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:01:58 compute-0 nova_compute[186544]: 2025-11-22 09:01:58.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:02 compute-0 nova_compute[186544]: 2025-11-22 09:02:02.089 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:03 compute-0 nova_compute[186544]: 2025-11-22 09:02:03.143 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:07 compute-0 nova_compute[186544]: 2025-11-22 09:02:07.091 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:08 compute-0 nova_compute[186544]: 2025-11-22 09:02:08.145 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:12 compute-0 nova_compute[186544]: 2025-11-22 09:02:12.092 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:13 compute-0 nova_compute[186544]: 2025-11-22 09:02:13.148 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:14 compute-0 podman[262798]: 2025-11-22 09:02:14.441900841 +0000 UTC m=+0.076297667 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:02:14 compute-0 podman[262799]: 2025-11-22 09:02:14.455208239 +0000 UTC m=+0.083665109 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 09:02:14 compute-0 podman[262800]: 2025-11-22 09:02:14.463156324 +0000 UTC m=+0.087756699 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:02:14 compute-0 podman[262806]: 2025-11-22 09:02:14.494104845 +0000 UTC m=+0.112405626 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 22 09:02:17 compute-0 nova_compute[186544]: 2025-11-22 09:02:17.094 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:18 compute-0 nova_compute[186544]: 2025-11-22 09:02:18.150 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:22 compute-0 nova_compute[186544]: 2025-11-22 09:02:22.096 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:23 compute-0 nova_compute[186544]: 2025-11-22 09:02:23.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:26 compute-0 podman[262887]: 2025-11-22 09:02:26.398688447 +0000 UTC m=+0.050067302 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:02:26 compute-0 podman[262888]: 2025-11-22 09:02:26.406874229 +0000 UTC m=+0.054936003 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
io.buildah.version=1.33.7, version=9.6, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 22 09:02:26 compute-0 podman[262889]: 2025-11-22 09:02:26.413729428 +0000 UTC m=+0.057380603 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:02:27 compute-0 nova_compute[186544]: 2025-11-22 09:02:27.097 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:28 compute-0 nova_compute[186544]: 2025-11-22 09:02:28.156 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:32 compute-0 nova_compute[186544]: 2025-11-22 09:02:32.099 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:33 compute-0 nova_compute[186544]: 2025-11-22 09:02:33.159 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:02:36.612 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:02:37 compute-0 nova_compute[186544]: 2025-11-22 09:02:37.102 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:02:37.394 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:02:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:02:37.395 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:02:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:02:37.396 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:02:38 compute-0 nova_compute[186544]: 2025-11-22 09:02:38.162 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:40 compute-0 nova_compute[186544]: 2025-11-22 09:02:40.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:40 compute-0 nova_compute[186544]: 2025-11-22 09:02:40.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:02:40 compute-0 nova_compute[186544]: 2025-11-22 09:02:40.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:02:40 compute-0 nova_compute[186544]: 2025-11-22 09:02:40.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:02:42 compute-0 nova_compute[186544]: 2025-11-22 09:02:42.104 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.165 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.203 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.204 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.358 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.359 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5686MB free_disk=73.13077545166016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.359 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.359 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.862 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.862 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.902 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.925 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.926 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:02:43 compute-0 nova_compute[186544]: 2025-11-22 09:02:43.926 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:02:45 compute-0 podman[262948]: 2025-11-22 09:02:45.423779325 +0000 UTC m=+0.054676286 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 22 09:02:45 compute-0 podman[262947]: 2025-11-22 09:02:45.434638262 +0000 UTC m=+0.064957299 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 09:02:45 compute-0 podman[262949]: 2025-11-22 09:02:45.437164333 +0000 UTC m=+0.063057092 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:02:45 compute-0 podman[262950]: 2025-11-22 09:02:45.470430402 +0000 UTC m=+0.091281716 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 09:02:47 compute-0 nova_compute[186544]: 2025-11-22 09:02:47.106 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:48 compute-0 nova_compute[186544]: 2025-11-22 09:02:48.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:48 compute-0 nova_compute[186544]: 2025-11-22 09:02:48.927 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:48 compute-0 nova_compute[186544]: 2025-11-22 09:02:48.927 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:02:49 compute-0 nova_compute[186544]: 2025-11-22 09:02:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:52 compute-0 nova_compute[186544]: 2025-11-22 09:02:52.108 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:52 compute-0 nova_compute[186544]: 2025-11-22 09:02:52.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:53 compute-0 nova_compute[186544]: 2025-11-22 09:02:53.170 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:55 compute-0 nova_compute[186544]: 2025-11-22 09:02:55.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:55 compute-0 nova_compute[186544]: 2025-11-22 09:02:55.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:57 compute-0 nova_compute[186544]: 2025-11-22 09:02:57.110 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:57 compute-0 podman[263034]: 2025-11-22 09:02:57.212476889 +0000 UTC m=+0.059454284 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:02:57 compute-0 podman[263035]: 2025-11-22 09:02:57.222854774 +0000 UTC m=+0.070429344 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 09:02:57 compute-0 podman[263036]: 2025-11-22 09:02:57.254508103 +0000 UTC m=+0.098497995 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 09:02:58 compute-0 nova_compute[186544]: 2025-11-22 09:02:58.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:02:58 compute-0 nova_compute[186544]: 2025-11-22 09:02:58.173 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:02:59 compute-0 nova_compute[186544]: 2025-11-22 09:02:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:02 compute-0 nova_compute[186544]: 2025-11-22 09:03:02.112 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:03 compute-0 nova_compute[186544]: 2025-11-22 09:03:03.175 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:07 compute-0 nova_compute[186544]: 2025-11-22 09:03:07.113 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:08 compute-0 nova_compute[186544]: 2025-11-22 09:03:08.178 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:12 compute-0 nova_compute[186544]: 2025-11-22 09:03:12.115 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:13 compute-0 nova_compute[186544]: 2025-11-22 09:03:13.181 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:14 compute-0 nova_compute[186544]: 2025-11-22 09:03:14.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:16 compute-0 podman[263096]: 2025-11-22 09:03:16.413154726 +0000 UTC m=+0.061496384 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 09:03:16 compute-0 podman[263097]: 2025-11-22 09:03:16.413167906 +0000 UTC m=+0.058235144 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 09:03:16 compute-0 podman[263098]: 2025-11-22 09:03:16.435770472 +0000 UTC m=+0.078193144 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 09:03:16 compute-0 podman[263099]: 2025-11-22 09:03:16.456069192 +0000 UTC m=+0.092210129 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:03:17 compute-0 nova_compute[186544]: 2025-11-22 09:03:17.117 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:18 compute-0 nova_compute[186544]: 2025-11-22 09:03:18.184 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:22 compute-0 nova_compute[186544]: 2025-11-22 09:03:22.119 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:23 compute-0 nova_compute[186544]: 2025-11-22 09:03:23.186 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:27 compute-0 nova_compute[186544]: 2025-11-22 09:03:27.120 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:27 compute-0 podman[263185]: 2025-11-22 09:03:27.409114749 +0000 UTC m=+0.055344242 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:03:27 compute-0 podman[263187]: 2025-11-22 09:03:27.43234216 +0000 UTC m=+0.063575614 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 09:03:27 compute-0 podman[263186]: 2025-11-22 09:03:27.432385461 +0000 UTC m=+0.067210984 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 22 09:03:28 compute-0 nova_compute[186544]: 2025-11-22 09:03:28.189 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:32 compute-0 nova_compute[186544]: 2025-11-22 09:03:32.121 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:33 compute-0 nova_compute[186544]: 2025-11-22 09:03:33.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:37 compute-0 nova_compute[186544]: 2025-11-22 09:03:37.123 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:03:37.395 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:03:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:03:37.396 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:03:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:03:37.396 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:03:38 compute-0 nova_compute[186544]: 2025-11-22 09:03:38.195 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:42 compute-0 nova_compute[186544]: 2025-11-22 09:03:42.124 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:42 compute-0 nova_compute[186544]: 2025-11-22 09:03:42.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:42 compute-0 nova_compute[186544]: 2025-11-22 09:03:42.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:03:42 compute-0 nova_compute[186544]: 2025-11-22 09:03:42.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:03:42 compute-0 nova_compute[186544]: 2025-11-22 09:03:42.190 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:03:43 compute-0 nova_compute[186544]: 2025-11-22 09:03:43.198 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.205 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.205 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.356 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.357 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.13145446777344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.357 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.358 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.547 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.547 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.814 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.833 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.834 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:03:44 compute-0 nova_compute[186544]: 2025-11-22 09:03:44.834 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:03:47 compute-0 nova_compute[186544]: 2025-11-22 09:03:47.126 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:47 compute-0 podman[263249]: 2025-11-22 09:03:47.40840484 +0000 UTC m=+0.057341292 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 09:03:47 compute-0 podman[263248]: 2025-11-22 09:03:47.40841756 +0000 UTC m=+0.060628992 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 09:03:47 compute-0 podman[263250]: 2025-11-22 09:03:47.413348812 +0000 UTC m=+0.057336762 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:03:47 compute-0 podman[263251]: 2025-11-22 09:03:47.433800504 +0000 UTC m=+0.075956209 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 22 09:03:48 compute-0 nova_compute[186544]: 2025-11-22 09:03:48.200 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:49 compute-0 nova_compute[186544]: 2025-11-22 09:03:49.835 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:49 compute-0 nova_compute[186544]: 2025-11-22 09:03:49.836 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:49 compute-0 nova_compute[186544]: 2025-11-22 09:03:49.836 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:03:52 compute-0 nova_compute[186544]: 2025-11-22 09:03:52.128 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:53 compute-0 nova_compute[186544]: 2025-11-22 09:03:53.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:53 compute-0 nova_compute[186544]: 2025-11-22 09:03:53.203 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:55 compute-0 nova_compute[186544]: 2025-11-22 09:03:55.160 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:55 compute-0 nova_compute[186544]: 2025-11-22 09:03:55.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:57 compute-0 nova_compute[186544]: 2025-11-22 09:03:57.130 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:58 compute-0 nova_compute[186544]: 2025-11-22 09:03:58.206 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:03:58 compute-0 podman[263329]: 2025-11-22 09:03:58.42296359 +0000 UTC m=+0.071227813 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:03:58 compute-0 podman[263331]: 2025-11-22 09:03:58.430853545 +0000 UTC m=+0.068538607 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd)
Nov 22 09:03:58 compute-0 podman[263330]: 2025-11-22 09:03:58.469064225 +0000 UTC m=+0.103426596 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm)
Nov 22 09:03:59 compute-0 nova_compute[186544]: 2025-11-22 09:03:59.168 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:03:59 compute-0 nova_compute[186544]: 2025-11-22 09:03:59.169 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:04:02 compute-0 nova_compute[186544]: 2025-11-22 09:04:02.133 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:03 compute-0 nova_compute[186544]: 2025-11-22 09:04:03.209 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:07 compute-0 nova_compute[186544]: 2025-11-22 09:04:07.135 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:08 compute-0 nova_compute[186544]: 2025-11-22 09:04:08.213 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:12 compute-0 nova_compute[186544]: 2025-11-22 09:04:12.137 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:13 compute-0 nova_compute[186544]: 2025-11-22 09:04:13.215 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:17 compute-0 nova_compute[186544]: 2025-11-22 09:04:17.138 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:18 compute-0 nova_compute[186544]: 2025-11-22 09:04:18.219 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:18 compute-0 podman[263392]: 2025-11-22 09:04:18.415535387 +0000 UTC m=+0.055120947 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 09:04:18 compute-0 podman[263391]: 2025-11-22 09:04:18.41928728 +0000 UTC m=+0.061466083 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:04:18 compute-0 podman[263390]: 2025-11-22 09:04:18.446258653 +0000 UTC m=+0.094633079 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 09:04:18 compute-0 podman[263398]: 2025-11-22 09:04:18.457066689 +0000 UTC m=+0.082807069 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:04:22 compute-0 nova_compute[186544]: 2025-11-22 09:04:22.139 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:23 compute-0 nova_compute[186544]: 2025-11-22 09:04:23.221 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:27 compute-0 nova_compute[186544]: 2025-11-22 09:04:27.141 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:28 compute-0 nova_compute[186544]: 2025-11-22 09:04:28.224 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:29 compute-0 podman[263476]: 2025-11-22 09:04:29.405530603 +0000 UTC m=+0.053773313 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal)
Nov 22 09:04:29 compute-0 podman[263475]: 2025-11-22 09:04:29.421995638 +0000 UTC m=+0.062974640 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:04:29 compute-0 podman[263477]: 2025-11-22 09:04:29.451653718 +0000 UTC m=+0.084184092 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 09:04:32 compute-0 nova_compute[186544]: 2025-11-22 09:04:32.143 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:33 compute-0 nova_compute[186544]: 2025-11-22 09:04:33.228 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.608 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.610 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:04:36.611 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:04:37 compute-0 nova_compute[186544]: 2025-11-22 09:04:37.144 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:04:37.396 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:04:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:04:37.396 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:04:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:04:37.397 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:04:38 compute-0 nova_compute[186544]: 2025-11-22 09:04:38.231 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:42 compute-0 nova_compute[186544]: 2025-11-22 09:04:42.146 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:43 compute-0 nova_compute[186544]: 2025-11-22 09:04:43.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:04:43 compute-0 nova_compute[186544]: 2025-11-22 09:04:43.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:04:43 compute-0 nova_compute[186544]: 2025-11-22 09:04:43.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:04:43 compute-0 nova_compute[186544]: 2025-11-22 09:04:43.205 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:04:43 compute-0 nova_compute[186544]: 2025-11-22 09:04:43.233 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.198 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.198 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.198 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.198 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.343 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.344 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5691MB free_disk=73.13145446777344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.344 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.345 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.434 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.435 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.463 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.481 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.482 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:04:45 compute-0 nova_compute[186544]: 2025-11-22 09:04:45.483 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:04:47 compute-0 nova_compute[186544]: 2025-11-22 09:04:47.148 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:48 compute-0 nova_compute[186544]: 2025-11-22 09:04:48.234 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:49 compute-0 podman[263536]: 2025-11-22 09:04:49.419240947 +0000 UTC m=+0.053500187 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:04:49 compute-0 podman[263537]: 2025-11-22 09:04:49.421549194 +0000 UTC m=+0.055033945 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:04:49 compute-0 podman[263535]: 2025-11-22 09:04:49.427197243 +0000 UTC m=+0.067161513 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 09:04:49 compute-0 podman[263538]: 2025-11-22 09:04:49.453381588 +0000 UTC m=+0.083778752 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 09:04:50 compute-0 nova_compute[186544]: 2025-11-22 09:04:50.483 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:04:50 compute-0 nova_compute[186544]: 2025-11-22 09:04:50.484 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:04:50 compute-0 nova_compute[186544]: 2025-11-22 09:04:50.484 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:04:52 compute-0 nova_compute[186544]: 2025-11-22 09:04:52.150 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:53 compute-0 nova_compute[186544]: 2025-11-22 09:04:53.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:04:53 compute-0 nova_compute[186544]: 2025-11-22 09:04:53.237 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:55 compute-0 nova_compute[186544]: 2025-11-22 09:04:55.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:04:56 compute-0 nova_compute[186544]: 2025-11-22 09:04:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:04:57 compute-0 nova_compute[186544]: 2025-11-22 09:04:57.151 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:04:58 compute-0 nova_compute[186544]: 2025-11-22 09:04:58.240 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:00 compute-0 nova_compute[186544]: 2025-11-22 09:05:00.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:00 compute-0 podman[263622]: 2025-11-22 09:05:00.417544508 +0000 UTC m=+0.057510175 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:05:00 compute-0 podman[263624]: 2025-11-22 09:05:00.421958677 +0000 UTC m=+0.055052655 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 09:05:00 compute-0 podman[263623]: 2025-11-22 09:05:00.422054689 +0000 UTC m=+0.060183660 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64)
Nov 22 09:05:01 compute-0 nova_compute[186544]: 2025-11-22 09:05:01.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:02 compute-0 nova_compute[186544]: 2025-11-22 09:05:02.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:03 compute-0 nova_compute[186544]: 2025-11-22 09:05:03.242 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:07 compute-0 nova_compute[186544]: 2025-11-22 09:05:07.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:08 compute-0 nova_compute[186544]: 2025-11-22 09:05:08.244 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:12 compute-0 nova_compute[186544]: 2025-11-22 09:05:12.153 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:13 compute-0 nova_compute[186544]: 2025-11-22 09:05:13.247 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:14 compute-0 nova_compute[186544]: 2025-11-22 09:05:14.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:17 compute-0 nova_compute[186544]: 2025-11-22 09:05:17.156 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:17 compute-0 nova_compute[186544]: 2025-11-22 09:05:17.172 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:18 compute-0 nova_compute[186544]: 2025-11-22 09:05:18.249 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:20 compute-0 podman[263683]: 2025-11-22 09:05:20.412203927 +0000 UTC m=+0.062594041 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 22 09:05:20 compute-0 podman[263684]: 2025-11-22 09:05:20.429982793 +0000 UTC m=+0.077504097 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 22 09:05:20 compute-0 podman[263685]: 2025-11-22 09:05:20.435715605 +0000 UTC m=+0.078689827 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 09:05:20 compute-0 podman[263686]: 2025-11-22 09:05:20.460145246 +0000 UTC m=+0.094704421 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 09:05:22 compute-0 nova_compute[186544]: 2025-11-22 09:05:22.157 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:23 compute-0 nova_compute[186544]: 2025-11-22 09:05:23.251 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:27 compute-0 nova_compute[186544]: 2025-11-22 09:05:27.160 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:28 compute-0 nova_compute[186544]: 2025-11-22 09:05:28.253 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:31 compute-0 nova_compute[186544]: 2025-11-22 09:05:31.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:31 compute-0 nova_compute[186544]: 2025-11-22 09:05:31.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 09:05:31 compute-0 podman[263767]: 2025-11-22 09:05:31.425862885 +0000 UTC m=+0.066551948 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible)
Nov 22 09:05:31 compute-0 podman[263766]: 2025-11-22 09:05:31.427114626 +0000 UTC m=+0.067888321 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_id=edpm, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 09:05:31 compute-0 podman[263765]: 2025-11-22 09:05:31.441015828 +0000 UTC m=+0.089014551 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:05:32 compute-0 nova_compute[186544]: 2025-11-22 09:05:32.160 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:33 compute-0 nova_compute[186544]: 2025-11-22 09:05:33.255 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:37 compute-0 nova_compute[186544]: 2025-11-22 09:05:37.162 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:05:37.397 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:05:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:05:37.397 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:05:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:05:37.397 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:05:38 compute-0 nova_compute[186544]: 2025-11-22 09:05:38.200 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:38 compute-0 nova_compute[186544]: 2025-11-22 09:05:38.200 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 09:05:38 compute-0 nova_compute[186544]: 2025-11-22 09:05:38.236 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 09:05:38 compute-0 nova_compute[186544]: 2025-11-22 09:05:38.256 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:41 compute-0 nova_compute[186544]: 2025-11-22 09:05:41.196 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:42 compute-0 nova_compute[186544]: 2025-11-22 09:05:42.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:43 compute-0 nova_compute[186544]: 2025-11-22 09:05:43.259 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:44 compute-0 nova_compute[186544]: 2025-11-22 09:05:44.166 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:44 compute-0 nova_compute[186544]: 2025-11-22 09:05:44.166 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:05:44 compute-0 nova_compute[186544]: 2025-11-22 09:05:44.167 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:05:44 compute-0 nova_compute[186544]: 2025-11-22 09:05:44.193 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.194 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.195 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.340 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.341 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.13145446777344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.341 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.341 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.416 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.417 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.450 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.517 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.519 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:05:45 compute-0 nova_compute[186544]: 2025-11-22 09:05:45.519 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:05:47 compute-0 nova_compute[186544]: 2025-11-22 09:05:47.164 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:48 compute-0 nova_compute[186544]: 2025-11-22 09:05:48.262 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:51 compute-0 podman[263828]: 2025-11-22 09:05:51.408971618 +0000 UTC m=+0.048628718 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 09:05:51 compute-0 podman[263827]: 2025-11-22 09:05:51.435207704 +0000 UTC m=+0.077265772 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 22 09:05:51 compute-0 podman[263826]: 2025-11-22 09:05:51.440146195 +0000 UTC m=+0.087031542 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 09:05:51 compute-0 podman[263832]: 2025-11-22 09:05:51.446731247 +0000 UTC m=+0.081792473 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 09:05:51 compute-0 nova_compute[186544]: 2025-11-22 09:05:51.519 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:52 compute-0 nova_compute[186544]: 2025-11-22 09:05:52.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:52 compute-0 nova_compute[186544]: 2025-11-22 09:05:52.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:05:52 compute-0 nova_compute[186544]: 2025-11-22 09:05:52.167 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:53 compute-0 nova_compute[186544]: 2025-11-22 09:05:53.265 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:54 compute-0 nova_compute[186544]: 2025-11-22 09:05:54.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:57 compute-0 nova_compute[186544]: 2025-11-22 09:05:57.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:57 compute-0 nova_compute[186544]: 2025-11-22 09:05:57.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:05:57 compute-0 nova_compute[186544]: 2025-11-22 09:05:57.168 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:05:58 compute-0 nova_compute[186544]: 2025-11-22 09:05:58.267 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:01 compute-0 nova_compute[186544]: 2025-11-22 09:06:01.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:02 compute-0 nova_compute[186544]: 2025-11-22 09:06:02.169 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:02 compute-0 podman[263908]: 2025-11-22 09:06:02.41219637 +0000 UTC m=+0.051259173 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:06:02 compute-0 podman[263909]: 2025-11-22 09:06:02.435330728 +0000 UTC m=+0.061166335 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 09:06:02 compute-0 podman[263915]: 2025-11-22 09:06:02.467500489 +0000 UTC m=+0.089524433 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 09:06:03 compute-0 nova_compute[186544]: 2025-11-22 09:06:03.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:03 compute-0 nova_compute[186544]: 2025-11-22 09:06:03.270 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:07 compute-0 nova_compute[186544]: 2025-11-22 09:06:07.171 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:08 compute-0 nova_compute[186544]: 2025-11-22 09:06:08.272 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:12 compute-0 nova_compute[186544]: 2025-11-22 09:06:12.173 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:13 compute-0 nova_compute[186544]: 2025-11-22 09:06:13.274 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:17 compute-0 nova_compute[186544]: 2025-11-22 09:06:17.173 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:18 compute-0 nova_compute[186544]: 2025-11-22 09:06:18.276 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:22 compute-0 nova_compute[186544]: 2025-11-22 09:06:22.176 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:22 compute-0 podman[263972]: 2025-11-22 09:06:22.406585559 +0000 UTC m=+0.052847351 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 09:06:22 compute-0 podman[263973]: 2025-11-22 09:06:22.412992917 +0000 UTC m=+0.053814075 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:06:22 compute-0 podman[263971]: 2025-11-22 09:06:22.437305285 +0000 UTC m=+0.087888273 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 09:06:22 compute-0 podman[263974]: 2025-11-22 09:06:22.441121159 +0000 UTC m=+0.081432365 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 09:06:23 compute-0 nova_compute[186544]: 2025-11-22 09:06:23.277 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:27 compute-0 nova_compute[186544]: 2025-11-22 09:06:27.178 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:28 compute-0 nova_compute[186544]: 2025-11-22 09:06:28.280 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:29 compute-0 nova_compute[186544]: 2025-11-22 09:06:29.143 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:32 compute-0 nova_compute[186544]: 2025-11-22 09:06:32.179 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:33 compute-0 nova_compute[186544]: 2025-11-22 09:06:33.282 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:33 compute-0 podman[264054]: 2025-11-22 09:06:33.402580041 +0000 UTC m=+0.049426646 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:06:33 compute-0 podman[264055]: 2025-11-22 09:06:33.41714249 +0000 UTC m=+0.059135835 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 09:06:33 compute-0 podman[264056]: 2025-11-22 09:06:33.439698855 +0000 UTC m=+0.070225438 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:06:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:06:37 compute-0 nova_compute[186544]: 2025-11-22 09:06:37.187 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:06:37.399 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:06:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:06:37.400 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:06:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:06:37.400 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:06:38 compute-0 nova_compute[186544]: 2025-11-22 09:06:38.285 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:42 compute-0 nova_compute[186544]: 2025-11-22 09:06:42.191 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:43 compute-0 nova_compute[186544]: 2025-11-22 09:06:43.289 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:44 compute-0 nova_compute[186544]: 2025-11-22 09:06:44.183 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:44 compute-0 nova_compute[186544]: 2025-11-22 09:06:44.184 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:06:44 compute-0 nova_compute[186544]: 2025-11-22 09:06:44.184 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:06:44 compute-0 nova_compute[186544]: 2025-11-22 09:06:44.221 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.214 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.215 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.215 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.215 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.386 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.387 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5704MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.387 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.387 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.669 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.669 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.756 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.795 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.796 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.837 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.899 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.971 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.985 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.986 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:06:46 compute-0 nova_compute[186544]: 2025-11-22 09:06:46.986 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:06:47 compute-0 nova_compute[186544]: 2025-11-22 09:06:47.192 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:48 compute-0 nova_compute[186544]: 2025-11-22 09:06:48.291 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:50 compute-0 nova_compute[186544]: 2025-11-22 09:06:50.128 186548 DEBUG oslo_concurrency.processutils [None req-7f8d9728-2d8b-420d-a211-d58aa5b5dfd5 74ad5d4ed255439cafdb153ee87124a2 cb198b45e9034b108a19399d19c6cf14 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 22 09:06:50 compute-0 nova_compute[186544]: 2025-11-22 09:06:50.157 186548 DEBUG oslo_concurrency.processutils [None req-7f8d9728-2d8b-420d-a211-d58aa5b5dfd5 74ad5d4ed255439cafdb153ee87124a2 cb198b45e9034b108a19399d19c6cf14 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 22 09:06:52 compute-0 nova_compute[186544]: 2025-11-22 09:06:52.194 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:52 compute-0 nova_compute[186544]: 2025-11-22 09:06:52.987 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:53 compute-0 nova_compute[186544]: 2025-11-22 09:06:53.294 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:53 compute-0 podman[264118]: 2025-11-22 09:06:53.414312989 +0000 UTC m=+0.057208579 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 22 09:06:53 compute-0 podman[264124]: 2025-11-22 09:06:53.423986187 +0000 UTC m=+0.056803119 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:06:53 compute-0 podman[264117]: 2025-11-22 09:06:53.429696667 +0000 UTC m=+0.078994423 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:06:53 compute-0 podman[264130]: 2025-11-22 09:06:53.490362819 +0000 UTC m=+0.114103407 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 09:06:54 compute-0 nova_compute[186544]: 2025-11-22 09:06:54.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:54 compute-0 nova_compute[186544]: 2025-11-22 09:06:54.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:06:55 compute-0 nova_compute[186544]: 2025-11-22 09:06:55.943 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:06:55.944 103805 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'f6:bc:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '52:fd:07:da:46:1f'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 22 09:06:55 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:06:55.945 103805 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 22 09:06:56 compute-0 nova_compute[186544]: 2025-11-22 09:06:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:57 compute-0 nova_compute[186544]: 2025-11-22 09:06:57.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:57 compute-0 nova_compute[186544]: 2025-11-22 09:06:57.196 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:06:58 compute-0 nova_compute[186544]: 2025-11-22 09:06:58.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:06:58 compute-0 nova_compute[186544]: 2025-11-22 09:06:58.297 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:01 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:07:01.947 103805 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=df09844c-c111-44b4-9c36-d4950a55a590, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 22 09:07:02 compute-0 nova_compute[186544]: 2025-11-22 09:07:02.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:07:02 compute-0 nova_compute[186544]: 2025-11-22 09:07:02.198 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:03 compute-0 nova_compute[186544]: 2025-11-22 09:07:03.299 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:04 compute-0 nova_compute[186544]: 2025-11-22 09:07:04.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:07:04 compute-0 podman[264204]: 2025-11-22 09:07:04.407243208 +0000 UTC m=+0.052815780 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 09:07:04 compute-0 podman[264206]: 2025-11-22 09:07:04.418495494 +0000 UTC m=+0.054950612 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 09:07:04 compute-0 podman[264205]: 2025-11-22 09:07:04.418962776 +0000 UTC m=+0.062417986 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=)
Nov 22 09:07:07 compute-0 nova_compute[186544]: 2025-11-22 09:07:07.199 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:08 compute-0 nova_compute[186544]: 2025-11-22 09:07:08.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:12 compute-0 nova_compute[186544]: 2025-11-22 09:07:12.206 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:13 compute-0 nova_compute[186544]: 2025-11-22 09:07:13.305 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:17 compute-0 nova_compute[186544]: 2025-11-22 09:07:17.208 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:18 compute-0 nova_compute[186544]: 2025-11-22 09:07:18.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:20 compute-0 nova_compute[186544]: 2025-11-22 09:07:20.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:07:22 compute-0 nova_compute[186544]: 2025-11-22 09:07:22.210 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:23 compute-0 nova_compute[186544]: 2025-11-22 09:07:23.313 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:24 compute-0 podman[264270]: 2025-11-22 09:07:24.416903955 +0000 UTC m=+0.056596604 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:07:24 compute-0 podman[264268]: 2025-11-22 09:07:24.440049934 +0000 UTC m=+0.088406096 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 09:07:24 compute-0 podman[264269]: 2025-11-22 09:07:24.440187077 +0000 UTC m=+0.083243998 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 09:07:24 compute-0 podman[264276]: 2025-11-22 09:07:24.444636456 +0000 UTC m=+0.080182473 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 09:07:27 compute-0 nova_compute[186544]: 2025-11-22 09:07:27.212 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:28 compute-0 nova_compute[186544]: 2025-11-22 09:07:28.318 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:32 compute-0 nova_compute[186544]: 2025-11-22 09:07:32.214 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:33 compute-0 nova_compute[186544]: 2025-11-22 09:07:33.320 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:35 compute-0 podman[264351]: 2025-11-22 09:07:35.403631491 +0000 UTC m=+0.046545066 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:07:35 compute-0 podman[264352]: 2025-11-22 09:07:35.422155047 +0000 UTC m=+0.060687794 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Nov 22 09:07:35 compute-0 podman[264353]: 2025-11-22 09:07:35.435367022 +0000 UTC m=+0.064843016 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Nov 22 09:07:37 compute-0 nova_compute[186544]: 2025-11-22 09:07:37.217 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:07:37.400 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:07:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:07:37.400 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:07:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:07:37.400 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:07:38 compute-0 nova_compute[186544]: 2025-11-22 09:07:38.323 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:42 compute-0 nova_compute[186544]: 2025-11-22 09:07:42.218 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:43 compute-0 nova_compute[186544]: 2025-11-22 09:07:43.325 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:44 compute-0 nova_compute[186544]: 2025-11-22 09:07:44.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:07:44 compute-0 nova_compute[186544]: 2025-11-22 09:07:44.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:07:44 compute-0 nova_compute[186544]: 2025-11-22 09:07:44.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:07:44 compute-0 nova_compute[186544]: 2025-11-22 09:07:44.289 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:07:47 compute-0 nova_compute[186544]: 2025-11-22 09:07:47.219 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.195 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.196 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.327 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.345 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.346 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.346 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.346 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.415 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.415 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.521 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.536 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.537 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:07:48 compute-0 nova_compute[186544]: 2025-11-22 09:07:48.537 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:07:52 compute-0 nova_compute[186544]: 2025-11-22 09:07:52.220 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:53 compute-0 nova_compute[186544]: 2025-11-22 09:07:53.330 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:53 compute-0 nova_compute[186544]: 2025-11-22 09:07:53.538 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:07:55 compute-0 nova_compute[186544]: 2025-11-22 09:07:55.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:07:55 compute-0 nova_compute[186544]: 2025-11-22 09:07:55.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:07:55 compute-0 podman[264414]: 2025-11-22 09:07:55.413045311 +0000 UTC m=+0.053454055 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 22 09:07:55 compute-0 podman[264415]: 2025-11-22 09:07:55.437156445 +0000 UTC m=+0.074786351 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:07:55 compute-0 podman[264416]: 2025-11-22 09:07:55.447740525 +0000 UTC m=+0.081777352 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 09:07:55 compute-0 podman[264413]: 2025-11-22 09:07:55.448522245 +0000 UTC m=+0.092434185 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 09:07:56 compute-0 nova_compute[186544]: 2025-11-22 09:07:56.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:07:57 compute-0 nova_compute[186544]: 2025-11-22 09:07:57.222 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:58 compute-0 nova_compute[186544]: 2025-11-22 09:07:58.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:07:59 compute-0 nova_compute[186544]: 2025-11-22 09:07:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:00 compute-0 nova_compute[186544]: 2025-11-22 09:08:00.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:02 compute-0 nova_compute[186544]: 2025-11-22 09:08:02.224 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:03 compute-0 nova_compute[186544]: 2025-11-22 09:08:03.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:03 compute-0 nova_compute[186544]: 2025-11-22 09:08:03.336 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:06 compute-0 nova_compute[186544]: 2025-11-22 09:08:06.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:06 compute-0 podman[264496]: 2025-11-22 09:08:06.405197563 +0000 UTC m=+0.052614605 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6)
Nov 22 09:08:06 compute-0 podman[264495]: 2025-11-22 09:08:06.415979378 +0000 UTC m=+0.068123687 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 09:08:06 compute-0 podman[264497]: 2025-11-22 09:08:06.416015109 +0000 UTC m=+0.059123495 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:08:07 compute-0 nova_compute[186544]: 2025-11-22 09:08:07.225 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:08 compute-0 nova_compute[186544]: 2025-11-22 09:08:08.339 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:12 compute-0 nova_compute[186544]: 2025-11-22 09:08:12.226 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:13 compute-0 nova_compute[186544]: 2025-11-22 09:08:13.343 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:17 compute-0 nova_compute[186544]: 2025-11-22 09:08:17.227 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:18 compute-0 nova_compute[186544]: 2025-11-22 09:08:18.347 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:22 compute-0 nova_compute[186544]: 2025-11-22 09:08:22.229 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:23 compute-0 nova_compute[186544]: 2025-11-22 09:08:23.350 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:26 compute-0 podman[264556]: 2025-11-22 09:08:26.409563458 +0000 UTC m=+0.054121562 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:08:26 compute-0 podman[264555]: 2025-11-22 09:08:26.412740306 +0000 UTC m=+0.060081358 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible)
Nov 22 09:08:26 compute-0 podman[264557]: 2025-11-22 09:08:26.415087354 +0000 UTC m=+0.055048395 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:08:26 compute-0 podman[264558]: 2025-11-22 09:08:26.444500468 +0000 UTC m=+0.081235859 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:08:27 compute-0 nova_compute[186544]: 2025-11-22 09:08:27.230 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:28 compute-0 nova_compute[186544]: 2025-11-22 09:08:28.356 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:32 compute-0 nova_compute[186544]: 2025-11-22 09:08:32.232 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:33 compute-0 nova_compute[186544]: 2025-11-22 09:08:33.360 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:08:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:08:37 compute-0 nova_compute[186544]: 2025-11-22 09:08:37.234 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:37 compute-0 podman[264641]: 2025-11-22 09:08:37.399120932 +0000 UTC m=+0.047433105 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:08:37.401 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:08:37.401 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:08:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:08:37.401 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:08:37 compute-0 podman[264643]: 2025-11-22 09:08:37.408091692 +0000 UTC m=+0.049749894 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 09:08:37 compute-0 podman[264642]: 2025-11-22 09:08:37.408092792 +0000 UTC m=+0.053084685 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 09:08:38 compute-0 nova_compute[186544]: 2025-11-22 09:08:38.364 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:42 compute-0 nova_compute[186544]: 2025-11-22 09:08:42.235 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:43 compute-0 nova_compute[186544]: 2025-11-22 09:08:43.366 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:44 compute-0 nova_compute[186544]: 2025-11-22 09:08:44.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:44 compute-0 nova_compute[186544]: 2025-11-22 09:08:44.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:08:44 compute-0 nova_compute[186544]: 2025-11-22 09:08:44.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:08:44 compute-0 nova_compute[186544]: 2025-11-22 09:08:44.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:08:47 compute-0 nova_compute[186544]: 2025-11-22 09:08:47.236 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:48 compute-0 nova_compute[186544]: 2025-11-22 09:08:48.369 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.203 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.203 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.204 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.341 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.342 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5713MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.342 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.342 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.683 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.683 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.705 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.724 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.725 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:08:49 compute-0 nova_compute[186544]: 2025-11-22 09:08:49.726 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:08:52 compute-0 nova_compute[186544]: 2025-11-22 09:08:52.239 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:53 compute-0 nova_compute[186544]: 2025-11-22 09:08:53.373 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:53 compute-0 nova_compute[186544]: 2025-11-22 09:08:53.726 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:57 compute-0 nova_compute[186544]: 2025-11-22 09:08:57.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:57 compute-0 nova_compute[186544]: 2025-11-22 09:08:57.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:08:57 compute-0 nova_compute[186544]: 2025-11-22 09:08:57.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:08:57 compute-0 nova_compute[186544]: 2025-11-22 09:08:57.241 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:08:57 compute-0 podman[264705]: 2025-11-22 09:08:57.25302212 +0000 UTC m=+0.066478271 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 22 09:08:57 compute-0 podman[264706]: 2025-11-22 09:08:57.253077231 +0000 UTC m=+0.062858902 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:08:57 compute-0 podman[264707]: 2025-11-22 09:08:57.278165612 +0000 UTC m=+0.085359120 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:08:57 compute-0 podman[264708]: 2025-11-22 09:08:57.322444381 +0000 UTC m=+0.124486014 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 09:08:58 compute-0 nova_compute[186544]: 2025-11-22 09:08:58.376 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:01 compute-0 nova_compute[186544]: 2025-11-22 09:09:01.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:01 compute-0 nova_compute[186544]: 2025-11-22 09:09:01.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:02 compute-0 nova_compute[186544]: 2025-11-22 09:09:02.244 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:03 compute-0 nova_compute[186544]: 2025-11-22 09:09:03.378 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:04 compute-0 nova_compute[186544]: 2025-11-22 09:09:04.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:07 compute-0 nova_compute[186544]: 2025-11-22 09:09:07.245 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:08 compute-0 nova_compute[186544]: 2025-11-22 09:09:08.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:08 compute-0 nova_compute[186544]: 2025-11-22 09:09:08.380 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:08 compute-0 podman[264790]: 2025-11-22 09:09:08.399868372 +0000 UTC m=+0.048704397 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:09:08 compute-0 podman[264791]: 2025-11-22 09:09:08.412078729 +0000 UTC m=+0.054783995 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 09:09:08 compute-0 podman[264792]: 2025-11-22 09:09:08.420659328 +0000 UTC m=+0.060013362 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:09:12 compute-0 nova_compute[186544]: 2025-11-22 09:09:12.246 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:13 compute-0 nova_compute[186544]: 2025-11-22 09:09:13.382 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:17 compute-0 nova_compute[186544]: 2025-11-22 09:09:17.249 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:18 compute-0 nova_compute[186544]: 2025-11-22 09:09:18.421 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:22 compute-0 nova_compute[186544]: 2025-11-22 09:09:22.250 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:23 compute-0 nova_compute[186544]: 2025-11-22 09:09:23.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:23 compute-0 nova_compute[186544]: 2025-11-22 09:09:23.424 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:27 compute-0 nova_compute[186544]: 2025-11-22 09:09:27.252 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:27 compute-0 podman[264852]: 2025-11-22 09:09:27.413147969 +0000 UTC m=+0.054425037 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:09:27 compute-0 podman[264853]: 2025-11-22 09:09:27.41398646 +0000 UTC m=+0.057499642 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 22 09:09:27 compute-0 podman[264854]: 2025-11-22 09:09:27.414781898 +0000 UTC m=+0.052869528 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:09:27 compute-0 podman[264855]: 2025-11-22 09:09:27.507069457 +0000 UTC m=+0.136529477 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 09:09:28 compute-0 nova_compute[186544]: 2025-11-22 09:09:28.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:32 compute-0 nova_compute[186544]: 2025-11-22 09:09:32.257 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:33 compute-0 nova_compute[186544]: 2025-11-22 09:09:33.433 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:37 compute-0 nova_compute[186544]: 2025-11-22 09:09:37.259 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:09:37.402 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:09:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:09:37.402 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:09:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:09:37.402 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:09:38 compute-0 nova_compute[186544]: 2025-11-22 09:09:38.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:39 compute-0 podman[264941]: 2025-11-22 09:09:39.426199945 +0000 UTC m=+0.066298126 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:09:39 compute-0 podman[264943]: 2025-11-22 09:09:39.439187881 +0000 UTC m=+0.067018584 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:09:39 compute-0 podman[264942]: 2025-11-22 09:09:39.441080148 +0000 UTC m=+0.067005134 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, version=9.6, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc.)
Nov 22 09:09:42 compute-0 nova_compute[186544]: 2025-11-22 09:09:42.260 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:43 compute-0 nova_compute[186544]: 2025-11-22 09:09:43.440 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:46 compute-0 nova_compute[186544]: 2025-11-22 09:09:46.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:46 compute-0 nova_compute[186544]: 2025-11-22 09:09:46.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:09:46 compute-0 nova_compute[186544]: 2025-11-22 09:09:46.165 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:09:46 compute-0 nova_compute[186544]: 2025-11-22 09:09:46.181 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:09:47 compute-0 nova_compute[186544]: 2025-11-22 09:09:47.261 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:48 compute-0 nova_compute[186544]: 2025-11-22 09:09:48.442 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.212 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.213 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.213 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.213 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.365 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.366 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.366 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.366 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.603 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.603 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.639 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.661 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.663 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:09:51 compute-0 nova_compute[186544]: 2025-11-22 09:09:51.664 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:09:52 compute-0 nova_compute[186544]: 2025-11-22 09:09:52.263 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:53 compute-0 nova_compute[186544]: 2025-11-22 09:09:53.445 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:55 compute-0 nova_compute[186544]: 2025-11-22 09:09:55.663 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:57 compute-0 nova_compute[186544]: 2025-11-22 09:09:57.265 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:58 compute-0 podman[265007]: 2025-11-22 09:09:58.411071891 +0000 UTC m=+0.050200054 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:09:58 compute-0 podman[265006]: 2025-11-22 09:09:58.411308168 +0000 UTC m=+0.053111935 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 22 09:09:58 compute-0 podman[265005]: 2025-11-22 09:09:58.420195334 +0000 UTC m=+0.059897220 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:09:58 compute-0 nova_compute[186544]: 2025-11-22 09:09:58.472 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:09:58 compute-0 podman[265008]: 2025-11-22 09:09:58.490956137 +0000 UTC m=+0.127732692 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 22 09:09:59 compute-0 nova_compute[186544]: 2025-11-22 09:09:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:59 compute-0 nova_compute[186544]: 2025-11-22 09:09:59.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:09:59 compute-0 nova_compute[186544]: 2025-11-22 09:09:59.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:10:02 compute-0 nova_compute[186544]: 2025-11-22 09:10:02.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:02 compute-0 nova_compute[186544]: 2025-11-22 09:10:02.267 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:03 compute-0 nova_compute[186544]: 2025-11-22 09:10:03.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:03 compute-0 nova_compute[186544]: 2025-11-22 09:10:03.474 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:05 compute-0 nova_compute[186544]: 2025-11-22 09:10:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:07 compute-0 nova_compute[186544]: 2025-11-22 09:10:07.268 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:08 compute-0 nova_compute[186544]: 2025-11-22 09:10:08.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:08 compute-0 nova_compute[186544]: 2025-11-22 09:10:08.477 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:10 compute-0 podman[265087]: 2025-11-22 09:10:10.393693065 +0000 UTC m=+0.046013851 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:10:10 compute-0 podman[265089]: 2025-11-22 09:10:10.41316419 +0000 UTC m=+0.054354455 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:10:10 compute-0 podman[265088]: 2025-11-22 09:10:10.4332542 +0000 UTC m=+0.082171383 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, 
url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 09:10:12 compute-0 nova_compute[186544]: 2025-11-22 09:10:12.271 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:13 compute-0 nova_compute[186544]: 2025-11-22 09:10:13.480 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:17 compute-0 nova_compute[186544]: 2025-11-22 09:10:17.273 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:18 compute-0 nova_compute[186544]: 2025-11-22 09:10:18.526 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:22 compute-0 nova_compute[186544]: 2025-11-22 09:10:22.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:22 compute-0 nova_compute[186544]: 2025-11-22 09:10:22.275 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:23 compute-0 nova_compute[186544]: 2025-11-22 09:10:23.530 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:27 compute-0 nova_compute[186544]: 2025-11-22 09:10:27.277 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:28 compute-0 nova_compute[186544]: 2025-11-22 09:10:28.533 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:29 compute-0 podman[265155]: 2025-11-22 09:10:29.436115704 +0000 UTC m=+0.070667053 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 09:10:29 compute-0 podman[265151]: 2025-11-22 09:10:29.444002345 +0000 UTC m=+0.086352254 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:10:29 compute-0 podman[265150]: 2025-11-22 09:10:29.45524466 +0000 UTC m=+0.104142589 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 22 09:10:29 compute-0 podman[265161]: 2025-11-22 09:10:29.478829484 +0000 UTC m=+0.101034853 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 22 09:10:32 compute-0 nova_compute[186544]: 2025-11-22 09:10:32.190 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:32 compute-0 nova_compute[186544]: 2025-11-22 09:10:32.190 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 09:10:32 compute-0 nova_compute[186544]: 2025-11-22 09:10:32.279 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:33 compute-0 nova_compute[186544]: 2025-11-22 09:10:33.535 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:10:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:10:37 compute-0 nova_compute[186544]: 2025-11-22 09:10:37.282 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:10:37.403 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:10:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:10:37.404 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:10:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:10:37.404 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:10:38 compute-0 nova_compute[186544]: 2025-11-22 09:10:38.538 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:41 compute-0 podman[265236]: 2025-11-22 09:10:41.388186604 +0000 UTC m=+0.042136347 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:10:41 compute-0 podman[265237]: 2025-11-22 09:10:41.404085172 +0000 UTC m=+0.053247148 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc.)
Nov 22 09:10:41 compute-0 podman[265238]: 2025-11-22 09:10:41.410938948 +0000 UTC m=+0.055527243 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 22 09:10:42 compute-0 nova_compute[186544]: 2025-11-22 09:10:42.284 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:43 compute-0 nova_compute[186544]: 2025-11-22 09:10:43.542 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:45 compute-0 nova_compute[186544]: 2025-11-22 09:10:45.186 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:45 compute-0 nova_compute[186544]: 2025-11-22 09:10:45.186 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 09:10:45 compute-0 nova_compute[186544]: 2025-11-22 09:10:45.211 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 09:10:46 compute-0 nova_compute[186544]: 2025-11-22 09:10:46.188 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:46 compute-0 nova_compute[186544]: 2025-11-22 09:10:46.188 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:10:46 compute-0 nova_compute[186544]: 2025-11-22 09:10:46.188 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:10:46 compute-0 nova_compute[186544]: 2025-11-22 09:10:46.218 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:10:47 compute-0 nova_compute[186544]: 2025-11-22 09:10:47.286 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:48 compute-0 nova_compute[186544]: 2025-11-22 09:10:48.544 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:52 compute-0 nova_compute[186544]: 2025-11-22 09:10:52.287 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.198 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.199 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.199 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.199 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.346 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.347 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5706MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.347 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.347 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.482 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.482 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.525 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.543 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.545 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.545 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:10:53 compute-0 nova_compute[186544]: 2025-11-22 09:10:53.547 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:56 compute-0 nova_compute[186544]: 2025-11-22 09:10:56.546 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:10:57 compute-0 nova_compute[186544]: 2025-11-22 09:10:57.289 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:58 compute-0 nova_compute[186544]: 2025-11-22 09:10:58.550 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:10:59 compute-0 nova_compute[186544]: 2025-11-22 09:10:59.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:00 compute-0 podman[265304]: 2025-11-22 09:11:00.400984019 +0000 UTC m=+0.046581656 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:11:00 compute-0 podman[265305]: 2025-11-22 09:11:00.405587062 +0000 UTC m=+0.048458222 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:11:00 compute-0 podman[265303]: 2025-11-22 09:11:00.431130824 +0000 UTC m=+0.080542734 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm)
Nov 22 09:11:00 compute-0 podman[265306]: 2025-11-22 09:11:00.442083961 +0000 UTC m=+0.080989114 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 09:11:01 compute-0 nova_compute[186544]: 2025-11-22 09:11:01.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:01 compute-0 nova_compute[186544]: 2025-11-22 09:11:01.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:11:02 compute-0 nova_compute[186544]: 2025-11-22 09:11:02.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:02 compute-0 nova_compute[186544]: 2025-11-22 09:11:02.290 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:03 compute-0 nova_compute[186544]: 2025-11-22 09:11:03.552 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:05 compute-0 nova_compute[186544]: 2025-11-22 09:11:05.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:06 compute-0 nova_compute[186544]: 2025-11-22 09:11:06.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:07 compute-0 nova_compute[186544]: 2025-11-22 09:11:07.292 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:08 compute-0 nova_compute[186544]: 2025-11-22 09:11:08.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:08 compute-0 nova_compute[186544]: 2025-11-22 09:11:08.555 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:12 compute-0 nova_compute[186544]: 2025-11-22 09:11:12.293 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:12 compute-0 podman[265387]: 2025-11-22 09:11:12.396351524 +0000 UTC m=+0.046489004 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:11:12 compute-0 podman[265389]: 2025-11-22 09:11:12.414096016 +0000 UTC m=+0.056040367 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 22 09:11:12 compute-0 podman[265388]: 2025-11-22 09:11:12.414118936 +0000 UTC m=+0.058300041 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Nov 22 09:11:13 compute-0 nova_compute[186544]: 2025-11-22 09:11:13.558 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:17 compute-0 nova_compute[186544]: 2025-11-22 09:11:17.294 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:18 compute-0 nova_compute[186544]: 2025-11-22 09:11:18.561 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:22 compute-0 nova_compute[186544]: 2025-11-22 09:11:22.296 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:23 compute-0 nova_compute[186544]: 2025-11-22 09:11:23.563 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:27 compute-0 nova_compute[186544]: 2025-11-22 09:11:27.300 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:28 compute-0 nova_compute[186544]: 2025-11-22 09:11:28.159 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:28 compute-0 nova_compute[186544]: 2025-11-22 09:11:28.567 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:31 compute-0 podman[265447]: 2025-11-22 09:11:31.415155857 +0000 UTC m=+0.059641144 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 22 09:11:31 compute-0 podman[265448]: 2025-11-22 09:11:31.429095366 +0000 UTC m=+0.069783951 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 09:11:31 compute-0 podman[265446]: 2025-11-22 09:11:31.429151868 +0000 UTC m=+0.083548457 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 09:11:31 compute-0 podman[265454]: 2025-11-22 09:11:31.477229519 +0000 UTC m=+0.120374423 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 09:11:32 compute-0 nova_compute[186544]: 2025-11-22 09:11:32.302 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:33 compute-0 nova_compute[186544]: 2025-11-22 09:11:33.570 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:37 compute-0 nova_compute[186544]: 2025-11-22 09:11:37.303 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:11:37.404 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:11:37.405 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:11:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:11:37.406 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:11:38 compute-0 nova_compute[186544]: 2025-11-22 09:11:38.572 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:42 compute-0 nova_compute[186544]: 2025-11-22 09:11:42.303 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:43 compute-0 podman[265532]: 2025-11-22 09:11:43.398775698 +0000 UTC m=+0.051588038 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 09:11:43 compute-0 podman[265533]: 2025-11-22 09:11:43.4091525 +0000 UTC m=+0.055163214 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 22 09:11:43 compute-0 podman[265534]: 2025-11-22 09:11:43.409457598 +0000 UTC m=+0.055082293 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 22 09:11:43 compute-0 nova_compute[186544]: 2025-11-22 09:11:43.575 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:47 compute-0 nova_compute[186544]: 2025-11-22 09:11:47.305 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:48 compute-0 nova_compute[186544]: 2025-11-22 09:11:48.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:48 compute-0 nova_compute[186544]: 2025-11-22 09:11:48.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:11:48 compute-0 nova_compute[186544]: 2025-11-22 09:11:48.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:11:48 compute-0 nova_compute[186544]: 2025-11-22 09:11:48.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:11:48 compute-0 nova_compute[186544]: 2025-11-22 09:11:48.577 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:52 compute-0 nova_compute[186544]: 2025-11-22 09:11:52.306 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.204 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.205 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.205 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.344 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.345 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5713MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.345 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.346 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.442 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.443 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.468 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.515 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.516 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.537 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.565 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.580 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.590 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.605 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.606 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:11:53 compute-0 nova_compute[186544]: 2025-11-22 09:11:53.606 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:11:57 compute-0 nova_compute[186544]: 2025-11-22 09:11:57.308 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:11:57 compute-0 nova_compute[186544]: 2025-11-22 09:11:57.607 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:11:58 compute-0 nova_compute[186544]: 2025-11-22 09:11:58.583 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:00 compute-0 nova_compute[186544]: 2025-11-22 09:12:00.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:12:01 compute-0 nova_compute[186544]: 2025-11-22 09:12:01.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:12:01 compute-0 nova_compute[186544]: 2025-11-22 09:12:01.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:12:02 compute-0 nova_compute[186544]: 2025-11-22 09:12:02.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:12:02 compute-0 nova_compute[186544]: 2025-11-22 09:12:02.309 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:02 compute-0 podman[265594]: 2025-11-22 09:12:02.412144908 +0000 UTC m=+0.053415723 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 22 09:12:02 compute-0 podman[265595]: 2025-11-22 09:12:02.43234778 +0000 UTC m=+0.065224871 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 22 09:12:02 compute-0 podman[265593]: 2025-11-22 09:12:02.444099886 +0000 UTC m=+0.079628001 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 22 09:12:02 compute-0 podman[265601]: 2025-11-22 09:12:02.456319564 +0000 UTC m=+0.080339478 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:12:03 compute-0 nova_compute[186544]: 2025-11-22 09:12:03.586 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:06 compute-0 nova_compute[186544]: 2025-11-22 09:12:06.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:12:07 compute-0 nova_compute[186544]: 2025-11-22 09:12:07.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:12:07 compute-0 nova_compute[186544]: 2025-11-22 09:12:07.312 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:08 compute-0 nova_compute[186544]: 2025-11-22 09:12:08.589 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:09 compute-0 nova_compute[186544]: 2025-11-22 09:12:09.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:12:12 compute-0 nova_compute[186544]: 2025-11-22 09:12:12.314 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:13 compute-0 nova_compute[186544]: 2025-11-22 09:12:13.592 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:14 compute-0 podman[265677]: 2025-11-22 09:12:14.424548827 +0000 UTC m=+0.068511350 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Nov 22 09:12:14 compute-0 podman[265676]: 2025-11-22 09:12:14.433136696 +0000 UTC m=+0.080660735 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 22 09:12:14 compute-0 podman[265678]: 2025-11-22 09:12:14.446057911 +0000 UTC m=+0.086517688 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 22 09:12:17 compute-0 nova_compute[186544]: 2025-11-22 09:12:17.316 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:18 compute-0 nova_compute[186544]: 2025-11-22 09:12:18.595 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:22 compute-0 nova_compute[186544]: 2025-11-22 09:12:22.319 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:23 compute-0 nova_compute[186544]: 2025-11-22 09:12:23.598 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:27 compute-0 nova_compute[186544]: 2025-11-22 09:12:27.321 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:28 compute-0 nova_compute[186544]: 2025-11-22 09:12:28.602 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:32 compute-0 nova_compute[186544]: 2025-11-22 09:12:32.322 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:33 compute-0 podman[265738]: 2025-11-22 09:12:33.415813658 +0000 UTC m=+0.053142196 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:12:33 compute-0 podman[265736]: 2025-11-22 09:12:33.432221138 +0000 UTC m=+0.072057847 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_id=edpm)
Nov 22 09:12:33 compute-0 podman[265737]: 2025-11-22 09:12:33.439808933 +0000 UTC m=+0.080089672 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 22 09:12:33 compute-0 podman[265739]: 2025-11-22 09:12:33.470665794 +0000 UTC m=+0.104601529 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 22 09:12:33 compute-0 nova_compute[186544]: 2025-11-22 09:12:33.605 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:12:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:12:37 compute-0 nova_compute[186544]: 2025-11-22 09:12:37.325 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:12:37.405 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:12:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:12:37.406 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:12:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:12:37.406 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:12:38 compute-0 nova_compute[186544]: 2025-11-22 09:12:38.607 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:42 compute-0 nova_compute[186544]: 2025-11-22 09:12:42.326 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:43 compute-0 nova_compute[186544]: 2025-11-22 09:12:43.610 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:45 compute-0 podman[265820]: 2025-11-22 09:12:45.407413848 +0000 UTC m=+0.052011098 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git)
Nov 22 09:12:45 compute-0 podman[265821]: 2025-11-22 09:12:45.413837215 +0000 UTC m=+0.053686979 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 22 09:12:45 compute-0 podman[265819]: 2025-11-22 09:12:45.420750704 +0000 UTC m=+0.068774837 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:12:47 compute-0 nova_compute[186544]: 2025-11-22 09:12:47.329 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:48 compute-0 nova_compute[186544]: 2025-11-22 09:12:48.613 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:49 compute-0 nova_compute[186544]: 2025-11-22 09:12:49.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:12:49 compute-0 nova_compute[186544]: 2025-11-22 09:12:49.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:12:49 compute-0 nova_compute[186544]: 2025-11-22 09:12:49.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:12:49 compute-0 nova_compute[186544]: 2025-11-22 09:12:49.183 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:12:52 compute-0 nova_compute[186544]: 2025-11-22 09:12:52.330 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:53 compute-0 nova_compute[186544]: 2025-11-22 09:12:53.616 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.196 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.197 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.197 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.327 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.328 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.328 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.328 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.537 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.537 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.558 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.584 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.586 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:12:55 compute-0 nova_compute[186544]: 2025-11-22 09:12:55.586 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:12:57 compute-0 nova_compute[186544]: 2025-11-22 09:12:57.332 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:58 compute-0 nova_compute[186544]: 2025-11-22 09:12:58.618 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:12:59 compute-0 nova_compute[186544]: 2025-11-22 09:12:59.586 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:00 compute-0 nova_compute[186544]: 2025-11-22 09:13:00.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:02 compute-0 nova_compute[186544]: 2025-11-22 09:13:02.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:02 compute-0 nova_compute[186544]: 2025-11-22 09:13:02.333 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:03 compute-0 nova_compute[186544]: 2025-11-22 09:13:03.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:03 compute-0 nova_compute[186544]: 2025-11-22 09:13:03.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:13:03 compute-0 nova_compute[186544]: 2025-11-22 09:13:03.621 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:04 compute-0 podman[265883]: 2025-11-22 09:13:04.405168618 +0000 UTC m=+0.054653212 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 22 09:13:04 compute-0 podman[265884]: 2025-11-22 09:13:04.407150076 +0000 UTC m=+0.054112849 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 22 09:13:04 compute-0 podman[265885]: 2025-11-22 09:13:04.411106002 +0000 UTC m=+0.052261164 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 09:13:04 compute-0 podman[265886]: 2025-11-22 09:13:04.438969681 +0000 UTC m=+0.078175576 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 22 09:13:06 compute-0 nova_compute[186544]: 2025-11-22 09:13:06.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:07 compute-0 nova_compute[186544]: 2025-11-22 09:13:07.363 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:08 compute-0 nova_compute[186544]: 2025-11-22 09:13:08.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:08 compute-0 nova_compute[186544]: 2025-11-22 09:13:08.625 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:09 compute-0 nova_compute[186544]: 2025-11-22 09:13:09.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:12 compute-0 nova_compute[186544]: 2025-11-22 09:13:12.366 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:13 compute-0 nova_compute[186544]: 2025-11-22 09:13:13.627 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:16 compute-0 podman[265968]: 2025-11-22 09:13:16.39832715 +0000 UTC m=+0.047957781 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:13:16 compute-0 podman[265970]: 2025-11-22 09:13:16.411543941 +0000 UTC m=+0.055462783 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 09:13:16 compute-0 podman[265969]: 2025-11-22 09:13:16.412069513 +0000 UTC m=+0.058464854 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 22 09:13:17 compute-0 nova_compute[186544]: 2025-11-22 09:13:17.367 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:18 compute-0 nova_compute[186544]: 2025-11-22 09:13:18.630 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:22 compute-0 nova_compute[186544]: 2025-11-22 09:13:22.368 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:23 compute-0 nova_compute[186544]: 2025-11-22 09:13:23.632 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:27 compute-0 nova_compute[186544]: 2025-11-22 09:13:27.371 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:28 compute-0 nova_compute[186544]: 2025-11-22 09:13:28.635 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:29 compute-0 nova_compute[186544]: 2025-11-22 09:13:29.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:32 compute-0 nova_compute[186544]: 2025-11-22 09:13:32.373 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:33 compute-0 nova_compute[186544]: 2025-11-22 09:13:33.638 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:35 compute-0 podman[266029]: 2025-11-22 09:13:35.411520225 +0000 UTC m=+0.060050674 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 09:13:35 compute-0 podman[266031]: 2025-11-22 09:13:35.422419531 +0000 UTC m=+0.063726374 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:13:35 compute-0 podman[266030]: 2025-11-22 09:13:35.432105277 +0000 UTC m=+0.077482430 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:13:35 compute-0 podman[266032]: 2025-11-22 09:13:35.447949632 +0000 UTC m=+0.084065359 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 22 09:13:37 compute-0 nova_compute[186544]: 2025-11-22 09:13:37.374 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:13:37.405 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:13:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:13:37.406 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:13:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:13:37.406 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:13:38 compute-0 nova_compute[186544]: 2025-11-22 09:13:38.641 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:42 compute-0 nova_compute[186544]: 2025-11-22 09:13:42.376 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:43 compute-0 nova_compute[186544]: 2025-11-22 09:13:43.643 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:47 compute-0 nova_compute[186544]: 2025-11-22 09:13:47.377 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:47 compute-0 podman[266119]: 2025-11-22 09:13:47.4028802 +0000 UTC m=+0.052162262 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Nov 22 09:13:47 compute-0 podman[266120]: 2025-11-22 09:13:47.406506259 +0000 UTC m=+0.051996667 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.vendor=CentOS)
Nov 22 09:13:47 compute-0 podman[266118]: 2025-11-22 09:13:47.428347782 +0000 UTC m=+0.081147838 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:13:48 compute-0 nova_compute[186544]: 2025-11-22 09:13:48.646 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:50 compute-0 nova_compute[186544]: 2025-11-22 09:13:50.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:50 compute-0 nova_compute[186544]: 2025-11-22 09:13:50.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:13:50 compute-0 nova_compute[186544]: 2025-11-22 09:13:50.164 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:13:50 compute-0 nova_compute[186544]: 2025-11-22 09:13:50.191 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:13:52 compute-0 nova_compute[186544]: 2025-11-22 09:13:52.380 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:53 compute-0 nova_compute[186544]: 2025-11-22 09:13:53.648 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.198 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.199 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.199 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.199 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.358 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.359 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5716MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.360 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.360 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.470 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.470 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.614 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.777 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.779 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:13:55 compute-0 nova_compute[186544]: 2025-11-22 09:13:55.779 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:13:57 compute-0 nova_compute[186544]: 2025-11-22 09:13:57.382 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:58 compute-0 nova_compute[186544]: 2025-11-22 09:13:58.651 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:13:59 compute-0 nova_compute[186544]: 2025-11-22 09:13:59.780 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:02 compute-0 nova_compute[186544]: 2025-11-22 09:14:02.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:02 compute-0 nova_compute[186544]: 2025-11-22 09:14:02.383 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:03 compute-0 nova_compute[186544]: 2025-11-22 09:14:03.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:03 compute-0 nova_compute[186544]: 2025-11-22 09:14:03.654 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:05 compute-0 nova_compute[186544]: 2025-11-22 09:14:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:05 compute-0 nova_compute[186544]: 2025-11-22 09:14:05.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:14:06 compute-0 podman[266180]: 2025-11-22 09:14:06.419038088 +0000 UTC m=+0.063180270 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:14:06 compute-0 podman[266181]: 2025-11-22 09:14:06.440025019 +0000 UTC m=+0.078076133 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:14:06 compute-0 podman[266182]: 2025-11-22 09:14:06.44292651 +0000 UTC m=+0.076947076 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:14:06 compute-0 podman[266188]: 2025-11-22 09:14:06.446305593 +0000 UTC m=+0.078156076 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 09:14:07 compute-0 nova_compute[186544]: 2025-11-22 09:14:07.386 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:08 compute-0 nova_compute[186544]: 2025-11-22 09:14:08.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:08 compute-0 nova_compute[186544]: 2025-11-22 09:14:08.658 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:10 compute-0 nova_compute[186544]: 2025-11-22 09:14:10.157 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:11 compute-0 nova_compute[186544]: 2025-11-22 09:14:11.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:12 compute-0 nova_compute[186544]: 2025-11-22 09:14:12.389 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:13 compute-0 nova_compute[186544]: 2025-11-22 09:14:13.662 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:17 compute-0 nova_compute[186544]: 2025-11-22 09:14:17.391 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:18 compute-0 podman[266262]: 2025-11-22 09:14:18.415149032 +0000 UTC m=+0.052294125 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:14:18 compute-0 podman[266263]: 2025-11-22 09:14:18.426007926 +0000 UTC m=+0.060261509 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 22 09:14:18 compute-0 podman[266264]: 2025-11-22 09:14:18.433185881 +0000 UTC m=+0.062830182 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 09:14:18 compute-0 nova_compute[186544]: 2025-11-22 09:14:18.665 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:22 compute-0 nova_compute[186544]: 2025-11-22 09:14:22.392 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:23 compute-0 nova_compute[186544]: 2025-11-22 09:14:23.667 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:27 compute-0 nova_compute[186544]: 2025-11-22 09:14:27.394 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:28 compute-0 nova_compute[186544]: 2025-11-22 09:14:28.670 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:32 compute-0 nova_compute[186544]: 2025-11-22 09:14:32.395 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:33 compute-0 nova_compute[186544]: 2025-11-22 09:14:33.721 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:14:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:14:37 compute-0 nova_compute[186544]: 2025-11-22 09:14:37.396 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:14:37.406 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:14:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:14:37.407 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:14:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:14:37.407 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:14:37 compute-0 podman[266326]: 2025-11-22 09:14:37.430427597 +0000 UTC m=+0.064809640 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 22 09:14:37 compute-0 podman[266327]: 2025-11-22 09:14:37.437297935 +0000 UTC m=+0.066311428 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:14:37 compute-0 podman[266325]: 2025-11-22 09:14:37.455476677 +0000 UTC m=+0.099476764 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 22 09:14:37 compute-0 podman[266328]: 2025-11-22 09:14:37.479063182 +0000 UTC m=+0.107966602 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 09:14:38 compute-0 nova_compute[186544]: 2025-11-22 09:14:38.723 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:42 compute-0 nova_compute[186544]: 2025-11-22 09:14:42.398 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:43 compute-0 nova_compute[186544]: 2025-11-22 09:14:43.775 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:47 compute-0 nova_compute[186544]: 2025-11-22 09:14:47.400 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:48 compute-0 nova_compute[186544]: 2025-11-22 09:14:48.851 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:49 compute-0 podman[266408]: 2025-11-22 09:14:49.397986445 +0000 UTC m=+0.049101807 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 22 09:14:49 compute-0 podman[266409]: 2025-11-22 09:14:49.409772923 +0000 UTC m=+0.056155149 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 22 09:14:49 compute-0 podman[266410]: 2025-11-22 09:14:49.421110839 +0000 UTC m=+0.064715197 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 22 09:14:51 compute-0 nova_compute[186544]: 2025-11-22 09:14:51.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:51 compute-0 nova_compute[186544]: 2025-11-22 09:14:51.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:14:51 compute-0 nova_compute[186544]: 2025-11-22 09:14:51.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:14:51 compute-0 nova_compute[186544]: 2025-11-22 09:14:51.183 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:14:52 compute-0 nova_compute[186544]: 2025-11-22 09:14:52.403 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:53 compute-0 nova_compute[186544]: 2025-11-22 09:14:53.853 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.227 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.227 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.227 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.227 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.359 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.360 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5703MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.360 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.360 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.425 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.425 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.448 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.467 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.468 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:14:56 compute-0 nova_compute[186544]: 2025-11-22 09:14:56.469 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:14:57 compute-0 nova_compute[186544]: 2025-11-22 09:14:57.428 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:14:58 compute-0 nova_compute[186544]: 2025-11-22 09:14:58.856 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:00 compute-0 nova_compute[186544]: 2025-11-22 09:15:00.470 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:02 compute-0 nova_compute[186544]: 2025-11-22 09:15:02.428 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:03 compute-0 nova_compute[186544]: 2025-11-22 09:15:03.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:03 compute-0 nova_compute[186544]: 2025-11-22 09:15:03.858 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:04 compute-0 nova_compute[186544]: 2025-11-22 09:15:04.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:06 compute-0 nova_compute[186544]: 2025-11-22 09:15:06.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:06 compute-0 nova_compute[186544]: 2025-11-22 09:15:06.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:15:07 compute-0 nova_compute[186544]: 2025-11-22 09:15:07.430 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:08 compute-0 podman[266473]: 2025-11-22 09:15:08.401452002 +0000 UTC m=+0.045559631 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 22 09:15:08 compute-0 podman[266472]: 2025-11-22 09:15:08.402480947 +0000 UTC m=+0.051044155 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 22 09:15:08 compute-0 podman[266474]: 2025-11-22 09:15:08.413193128 +0000 UTC m=+0.053153826 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:15:08 compute-0 podman[266480]: 2025-11-22 09:15:08.463633057 +0000 UTC m=+0.099262900 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 09:15:08 compute-0 nova_compute[186544]: 2025-11-22 09:15:08.860 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:09 compute-0 nova_compute[186544]: 2025-11-22 09:15:09.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:12 compute-0 nova_compute[186544]: 2025-11-22 09:15:12.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:12 compute-0 nova_compute[186544]: 2025-11-22 09:15:12.431 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:13 compute-0 nova_compute[186544]: 2025-11-22 09:15:13.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:13 compute-0 nova_compute[186544]: 2025-11-22 09:15:13.868 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:17 compute-0 nova_compute[186544]: 2025-11-22 09:15:17.434 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:18 compute-0 nova_compute[186544]: 2025-11-22 09:15:18.874 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:20 compute-0 podman[266552]: 2025-11-22 09:15:20.404107304 +0000 UTC m=+0.050685106 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6)
Nov 22 09:15:20 compute-0 podman[266551]: 2025-11-22 09:15:20.418430132 +0000 UTC m=+0.063198300 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:15:20 compute-0 podman[266553]: 2025-11-22 09:15:20.47249152 +0000 UTC m=+0.103727258 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 22 09:15:22 compute-0 nova_compute[186544]: 2025-11-22 09:15:22.436 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:23 compute-0 nova_compute[186544]: 2025-11-22 09:15:23.877 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:27 compute-0 nova_compute[186544]: 2025-11-22 09:15:27.437 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:28 compute-0 nova_compute[186544]: 2025-11-22 09:15:28.880 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:30 compute-0 nova_compute[186544]: 2025-11-22 09:15:30.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:32 compute-0 nova_compute[186544]: 2025-11-22 09:15:32.438 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:33 compute-0 nova_compute[186544]: 2025-11-22 09:15:33.883 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:35 compute-0 nova_compute[186544]: 2025-11-22 09:15:35.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:15:37.407 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:15:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:15:37.408 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:15:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:15:37.408 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:15:37 compute-0 nova_compute[186544]: 2025-11-22 09:15:37.439 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:38 compute-0 nova_compute[186544]: 2025-11-22 09:15:38.885 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:39 compute-0 podman[266616]: 2025-11-22 09:15:39.40907932 +0000 UTC m=+0.049834866 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3)
Nov 22 09:15:39 compute-0 podman[266618]: 2025-11-22 09:15:39.417549466 +0000 UTC m=+0.050374058 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:15:39 compute-0 podman[266617]: 2025-11-22 09:15:39.432999182 +0000 UTC m=+0.069785341 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 09:15:39 compute-0 podman[266623]: 2025-11-22 09:15:39.465063403 +0000 UTC m=+0.089088491 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 22 09:15:42 compute-0 nova_compute[186544]: 2025-11-22 09:15:42.441 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:43 compute-0 nova_compute[186544]: 2025-11-22 09:15:43.173 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:43 compute-0 nova_compute[186544]: 2025-11-22 09:15:43.174 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 22 09:15:43 compute-0 nova_compute[186544]: 2025-11-22 09:15:43.889 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:47 compute-0 nova_compute[186544]: 2025-11-22 09:15:47.443 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:48 compute-0 nova_compute[186544]: 2025-11-22 09:15:48.892 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:51 compute-0 podman[266700]: 2025-11-22 09:15:51.418665741 +0000 UTC m=+0.052489970 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:15:51 compute-0 podman[266701]: 2025-11-22 09:15:51.418576539 +0000 UTC m=+0.057602655 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, distribution-scope=public)
Nov 22 09:15:51 compute-0 podman[266702]: 2025-11-22 09:15:51.423906179 +0000 UTC m=+0.057173364 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 22 09:15:52 compute-0 nova_compute[186544]: 2025-11-22 09:15:52.179 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:52 compute-0 nova_compute[186544]: 2025-11-22 09:15:52.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:15:52 compute-0 nova_compute[186544]: 2025-11-22 09:15:52.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:15:52 compute-0 nova_compute[186544]: 2025-11-22 09:15:52.210 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:15:52 compute-0 nova_compute[186544]: 2025-11-22 09:15:52.445 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:53 compute-0 nova_compute[186544]: 2025-11-22 09:15:53.895 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.190 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.190 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.329 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.330 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5696MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.330 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.331 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.591 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.592 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.626 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.640 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.642 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:15:56 compute-0 nova_compute[186544]: 2025-11-22 09:15:56.642 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:15:57 compute-0 nova_compute[186544]: 2025-11-22 09:15:57.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:15:57 compute-0 nova_compute[186544]: 2025-11-22 09:15:57.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 22 09:15:57 compute-0 nova_compute[186544]: 2025-11-22 09:15:57.187 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 22 09:15:57 compute-0 nova_compute[186544]: 2025-11-22 09:15:57.446 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:15:58 compute-0 nova_compute[186544]: 2025-11-22 09:15:58.898 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:02 compute-0 nova_compute[186544]: 2025-11-22 09:16:02.188 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:02 compute-0 nova_compute[186544]: 2025-11-22 09:16:02.449 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:03 compute-0 nova_compute[186544]: 2025-11-22 09:16:03.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:03 compute-0 nova_compute[186544]: 2025-11-22 09:16:03.901 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:04 compute-0 nova_compute[186544]: 2025-11-22 09:16:04.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:07 compute-0 nova_compute[186544]: 2025-11-22 09:16:07.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:07 compute-0 nova_compute[186544]: 2025-11-22 09:16:07.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:16:07 compute-0 nova_compute[186544]: 2025-11-22 09:16:07.449 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:08 compute-0 nova_compute[186544]: 2025-11-22 09:16:08.904 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:10 compute-0 podman[266764]: 2025-11-22 09:16:10.40618962 +0000 UTC m=+0.050540632 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 22 09:16:10 compute-0 podman[266763]: 2025-11-22 09:16:10.406286282 +0000 UTC m=+0.053925524 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 22 09:16:10 compute-0 podman[266762]: 2025-11-22 09:16:10.416347558 +0000 UTC m=+0.066175854 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 09:16:10 compute-0 podman[266765]: 2025-11-22 09:16:10.440109007 +0000 UTC m=+0.081859685 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 22 09:16:11 compute-0 nova_compute[186544]: 2025-11-22 09:16:11.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:12 compute-0 nova_compute[186544]: 2025-11-22 09:16:12.450 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:13 compute-0 nova_compute[186544]: 2025-11-22 09:16:13.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:13 compute-0 nova_compute[186544]: 2025-11-22 09:16:13.907 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:14 compute-0 nova_compute[186544]: 2025-11-22 09:16:14.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:17 compute-0 nova_compute[186544]: 2025-11-22 09:16:17.493 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:18 compute-0 nova_compute[186544]: 2025-11-22 09:16:18.910 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:22 compute-0 podman[266848]: 2025-11-22 09:16:22.400504059 +0000 UTC m=+0.052361007 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:16:22 compute-0 podman[266849]: 2025-11-22 09:16:22.402129008 +0000 UTC m=+0.049971989 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 22 09:16:22 compute-0 podman[266850]: 2025-11-22 09:16:22.41119929 +0000 UTC m=+0.056534189 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 22 09:16:22 compute-0 nova_compute[186544]: 2025-11-22 09:16:22.495 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:23 compute-0 nova_compute[186544]: 2025-11-22 09:16:23.912 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:27 compute-0 nova_compute[186544]: 2025-11-22 09:16:27.547 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:28 compute-0 nova_compute[186544]: 2025-11-22 09:16:28.915 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:32 compute-0 nova_compute[186544]: 2025-11-22 09:16:32.549 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:33 compute-0 nova_compute[186544]: 2025-11-22 09:16:33.143 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:33 compute-0 nova_compute[186544]: 2025-11-22 09:16:33.968 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:36 compute-0 ceilometer_agent_compute[197247]: 2025-11-22 09:16:36.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 22 09:16:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:16:37.408 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:16:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:16:37.408 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:16:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:16:37.408 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:16:37 compute-0 nova_compute[186544]: 2025-11-22 09:16:37.549 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:38 compute-0 nova_compute[186544]: 2025-11-22 09:16:38.971 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:41 compute-0 podman[266910]: 2025-11-22 09:16:41.401842173 +0000 UTC m=+0.055241077 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 09:16:41 compute-0 podman[266911]: 2025-11-22 09:16:41.424143257 +0000 UTC m=+0.068365917 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 22 09:16:41 compute-0 podman[266912]: 2025-11-22 09:16:41.438562988 +0000 UTC m=+0.079567770 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 22 09:16:41 compute-0 podman[266918]: 2025-11-22 09:16:41.455085671 +0000 UTC m=+0.086350555 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 22 09:16:42 compute-0 nova_compute[186544]: 2025-11-22 09:16:42.552 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:43 compute-0 nova_compute[186544]: 2025-11-22 09:16:43.974 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:47 compute-0 nova_compute[186544]: 2025-11-22 09:16:47.553 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:48 compute-0 nova_compute[186544]: 2025-11-22 09:16:48.976 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:52 compute-0 nova_compute[186544]: 2025-11-22 09:16:52.555 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:53 compute-0 nova_compute[186544]: 2025-11-22 09:16:53.180 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:53 compute-0 nova_compute[186544]: 2025-11-22 09:16:53.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:16:53 compute-0 nova_compute[186544]: 2025-11-22 09:16:53.180 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:16:53 compute-0 nova_compute[186544]: 2025-11-22 09:16:53.199 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:16:53 compute-0 podman[266993]: 2025-11-22 09:16:53.399074403 +0000 UTC m=+0.051077426 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 22 09:16:53 compute-0 podman[266994]: 2025-11-22 09:16:53.407201311 +0000 UTC m=+0.055506373 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.6)
Nov 22 09:16:53 compute-0 podman[266995]: 2025-11-22 09:16:53.424134743 +0000 UTC m=+0.067032673 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 22 09:16:53 compute-0 nova_compute[186544]: 2025-11-22 09:16:53.978 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.203 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.205 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.205 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.205 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.355 186548 WARNING nova.virt.libvirt.driver [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.356 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.13150787353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.356 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.357 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.427 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.427 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.444 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing inventories for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.462 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating ProviderTree inventory for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.462 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Updating inventory in ProviderTree for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.478 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing aggregate associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.521 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Refreshing trait associations for resource provider 0a011418-630a-4be8-ab23-41ec1c11a5ea, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.544 186548 DEBUG nova.compute.provider_tree [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed in ProviderTree for provider: 0a011418-630a-4be8-ab23-41ec1c11a5ea update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.557 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.575 186548 DEBUG nova.scheduler.client.report [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Inventory has not changed for provider 0a011418-630a-4be8-ab23-41ec1c11a5ea based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.576 186548 DEBUG nova.compute.resource_tracker [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 22 09:16:57 compute-0 nova_compute[186544]: 2025-11-22 09:16:57.576 186548 DEBUG oslo_concurrency.lockutils [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:16:58 compute-0 nova_compute[186544]: 2025-11-22 09:16:58.980 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:02 compute-0 nova_compute[186544]: 2025-11-22 09:17:02.559 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:03 compute-0 nova_compute[186544]: 2025-11-22 09:17:03.984 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:04 compute-0 nova_compute[186544]: 2025-11-22 09:17:04.577 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:05 compute-0 nova_compute[186544]: 2025-11-22 09:17:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:05 compute-0 nova_compute[186544]: 2025-11-22 09:17:05.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:07 compute-0 nova_compute[186544]: 2025-11-22 09:17:07.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:07 compute-0 nova_compute[186544]: 2025-11-22 09:17:07.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 22 09:17:07 compute-0 nova_compute[186544]: 2025-11-22 09:17:07.560 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:08 compute-0 nova_compute[186544]: 2025-11-22 09:17:08.987 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:12 compute-0 nova_compute[186544]: 2025-11-22 09:17:12.164 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:12 compute-0 podman[267054]: 2025-11-22 09:17:12.41357125 +0000 UTC m=+0.059432449 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 22 09:17:12 compute-0 podman[267053]: 2025-11-22 09:17:12.413773825 +0000 UTC m=+0.060769991 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 22 09:17:12 compute-0 podman[267055]: 2025-11-22 09:17:12.422724572 +0000 UTC m=+0.063213831 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 22 09:17:12 compute-0 podman[267056]: 2025-11-22 09:17:12.462744078 +0000 UTC m=+0.100942040 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 09:17:12 compute-0 nova_compute[186544]: 2025-11-22 09:17:12.561 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:13 compute-0 nova_compute[186544]: 2025-11-22 09:17:13.989 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:15 compute-0 nova_compute[186544]: 2025-11-22 09:17:15.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:16 compute-0 nova_compute[186544]: 2025-11-22 09:17:16.163 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:17 compute-0 nova_compute[186544]: 2025-11-22 09:17:17.562 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:18 compute-0 nova_compute[186544]: 2025-11-22 09:17:18.992 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:22 compute-0 nova_compute[186544]: 2025-11-22 09:17:22.563 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:24 compute-0 nova_compute[186544]: 2025-11-22 09:17:24.043 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:24 compute-0 podman[267141]: 2025-11-22 09:17:24.414874159 +0000 UTC m=+0.059061060 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:17:24 compute-0 podman[267142]: 2025-11-22 09:17:24.420194879 +0000 UTC m=+0.060426753 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Nov 22 09:17:24 compute-0 podman[267143]: 2025-11-22 09:17:24.42965658 +0000 UTC m=+0.065680242 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 22 09:17:27 compute-0 nova_compute[186544]: 2025-11-22 09:17:27.565 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:29 compute-0 nova_compute[186544]: 2025-11-22 09:17:29.045 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:32 compute-0 nova_compute[186544]: 2025-11-22 09:17:32.567 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:34 compute-0 nova_compute[186544]: 2025-11-22 09:17:34.048 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:35 compute-0 nova_compute[186544]: 2025-11-22 09:17:35.158 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:17:37.409 103805 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 22 09:17:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:17:37.410 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 22 09:17:37 compute-0 ovn_metadata_agent[103800]: 2025-11-22 09:17:37.410 103805 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 22 09:17:37 compute-0 nova_compute[186544]: 2025-11-22 09:17:37.570 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:39 compute-0 nova_compute[186544]: 2025-11-22 09:17:39.052 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:39 compute-0 sshd-session[267204]: Accepted publickey for zuul from 192.168.122.10 port 59546 ssh2: ECDSA SHA256:XSwr0+qdoVcqF4tsJEe7yzRPcUrJW8ET1w6IEkjbKvs
Nov 22 09:17:39 compute-0 systemd-logind[821]: New session 72 of user zuul.
Nov 22 09:17:39 compute-0 systemd[1]: Started Session 72 of User zuul.
Nov 22 09:17:39 compute-0 sshd-session[267204]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 22 09:17:39 compute-0 sudo[267208]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 22 09:17:39 compute-0 sudo[267208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 22 09:17:42 compute-0 nova_compute[186544]: 2025-11-22 09:17:42.570 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:43 compute-0 podman[267352]: 2025-11-22 09:17:43.431091078 +0000 UTC m=+0.068799847 container health_status c6f69d367cdebddfb5adca41d6885aa6f76d35a1df25f9aa4662af6967e6bd37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 22 09:17:43 compute-0 podman[267353]: 2025-11-22 09:17:43.432127704 +0000 UTC m=+0.067303601 container health_status d93bf209468d6c32801d40675c5c2d4e0a5e64148a3363e9e857b12ba36f5ca9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 22 09:17:43 compute-0 podman[267351]: 2025-11-22 09:17:43.438198001 +0000 UTC m=+0.075353737 container health_status a6c6f86887ec2a88cc27fc0b12e8c636662be3d7da293a68f4a224488c20cc7b (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 22 09:17:43 compute-0 podman[267354]: 2025-11-22 09:17:43.475779787 +0000 UTC m=+0.102271363 container health_status ed21f5af467fd625b436f60c406a39e238d7a523f3a907730de65e6a7dece46f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 22 09:17:44 compute-0 nova_compute[186544]: 2025-11-22 09:17:44.054 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:44 compute-0 ovs-vsctl[267461]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 22 09:17:45 compute-0 virtqemud[186092]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 22 09:17:45 compute-0 virtqemud[186092]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 22 09:17:45 compute-0 virtqemud[186092]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 22 09:17:46 compute-0 crontab[267871]: (root) LIST (root)
Nov 22 09:17:47 compute-0 nova_compute[186544]: 2025-11-22 09:17:47.571 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:48 compute-0 systemd[1]: Starting Hostname Service...
Nov 22 09:17:48 compute-0 systemd[1]: Started Hostname Service.
Nov 22 09:17:49 compute-0 nova_compute[186544]: 2025-11-22 09:17:49.056 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:49 compute-0 sshd-session[268023]: Connection closed by 185.216.140.186 port 50722
Nov 22 09:17:52 compute-0 nova_compute[186544]: 2025-11-22 09:17:52.573 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:53 compute-0 nova_compute[186544]: 2025-11-22 09:17:53.162 186548 DEBUG oslo_service.periodic_task [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 22 09:17:53 compute-0 nova_compute[186544]: 2025-11-22 09:17:53.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 22 09:17:53 compute-0 nova_compute[186544]: 2025-11-22 09:17:53.163 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 22 09:17:53 compute-0 nova_compute[186544]: 2025-11-22 09:17:53.198 186548 DEBUG nova.compute.manager [None req-df62cf10-3011-4507-9e74-43da54024e8f - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 22 09:17:54 compute-0 nova_compute[186544]: 2025-11-22 09:17:54.059 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 22 09:17:55 compute-0 podman[268796]: 2025-11-22 09:17:55.430130533 +0000 UTC m=+0.065215640 container health_status 0017e37c66cba4b32b088774f107754a4ee75576f8f0367610f8b52198ad9cb3 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 22 09:17:55 compute-0 podman[268802]: 2025-11-22 09:17:55.439298057 +0000 UTC m=+0.077754136 container health_status 26effdaa621a7c281dfcb9aa9785165ee2474631c9a51bad768f92930e2c7caf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Nov 22 09:17:55 compute-0 podman[268803]: 2025-11-22 09:17:55.446015869 +0000 UTC m=+0.077536879 container health_status 8f45e1efadb41d706e07a8505e94635319ef202c781ec7d4da4d2c7e4d1cdd5f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true)
Nov 22 09:17:57 compute-0 ovs-appctl[269295]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 09:17:57 compute-0 ovs-appctl[269298]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 09:17:57 compute-0 ovs-appctl[269302]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 22 09:17:57 compute-0 nova_compute[186544]: 2025-11-22 09:17:57.575 186548 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
